
Results 1 to 14 of 14
  1. #1
    SitePoint Zealot
    Join Date
    Dec 2006
    Location
    England, UK
    Posts
    160

    cron job backup of updated files

Hey, basically what I want is a script that runs maybe every 30 minutes (or 30 seconds?) to check my hosting account for updated files.

If it finds any, it should make a copy of them within a 'backup' directory.

I'm guessing a mixture of FTP, PHP and cron jobs would be an effective combination?
Bandwidth/server space is not an issue.

  2. #2
    SitePoint Addict ALL's Avatar
    Join Date
    Oct 2005
    Location
    South Dakota
    Posts
    215
Not sure what you are asking... if you are asking someone to do it for you, then you asked in the wrong forum; there is a jobs forum for that. If you are asking for help, please be more specific.

Yes, FTP, PHP, and cron jobs would be effective for what you want, from the sounds of it.
    Did I help you?
    You can repay me, support one of my projects (no money needed):
    JavaScript Wiki, Another Web Forum, Paranormal Site

  3. #3
    SitePoint Zealot
No, I don't want anybody to write the script for me; I simply want help knowing where to start and what the best method would be.

At the moment, I plan for it to open an FTP connection, list all files within the public_html directory and order them by last modified. Any that have been modified since the last time the script ran would be copied to a separate directory.

Sounds simple, eh?

  4. #4
    SitePoint Zealot
I just need somebody to explain how to get a list of all files within a public directory and all subdirectories (ordered by last modified date/time) via FTP/PHP.

From there, I just need to run it via a cron job or other automated/triggered event, and then it's complete, I guess.

(If you can't be bothered to help with code, at the very least tell me where to look.)
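For the FTP route, PHP's built-in ftp_* functions can do this. A rough sketch (untested; the host and login details in the usage comment are placeholders) that walks a directory tree and collects each file's modification time:

```php
<?php
// Sketch: collect [path => mtime] for every file under an FTP directory.
// ftp_mdtm() returns -1 for directories on most servers, which is what
// lets us tell files and subdirectories apart here.
function ftp_list_recursive($conn, string $dir, array &$out): void {
    $names = ftp_nlist($conn, $dir);
    if ($names === false) return;
    foreach ($names as $item) {
        $mtime = ftp_mdtm($conn, $item);
        if ($mtime === -1) {
            ftp_list_recursive($conn, $item, $out); // treat as a directory
        } else {
            $out[$item] = $mtime;
        }
    }
}

// Pure helper: order the listing by modification time, oldest first.
function sort_by_mtime(array $files): array {
    asort($files);
    return $files;
}

// Usage against a real server (placeholder details, do not run as-is):
//   $conn = ftp_connect("ftp.example.com");
//   ftp_login($conn, "username", "password");
//   $files = [];
//   ftp_list_recursive($conn, "/public_html", $files);
//   $files = sort_by_mtime($files);
//   ftp_close($conn);
```

Note that some servers return bare names rather than full paths from ftp_nlist(), so the recursion may need path prefixing depending on your host.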

  5. #5
    SitePoint Addict ALL's Avatar
Well, I recently made a program that collects an idx through FTP; the hardest part is making sure it will work every time without error. I would start with the FTP part of it. (If you don't already have it, here is the link to PHP's FTP functions: here.)

From there you can at least manipulate the files and have something to work with. When I made my idx thing, that is what I did.
    -ALL

  6. #6
php_daemon
    Join Date
    Mar 2006
    Posts
    5,284
It can be done without the use of FTP if you prefer:
PHP Code:
$time = (int) trim(file_get_contents("lastexec.txt"));

chdir("uploads");
$files = glob("*.*");
foreach ($files as $file) {
  // copy anything modified since the last run
  if ($time - filectime($file) < 0) copy($file, "../backup/".$file);
}

file_put_contents("../lastexec.txt", time());
    Saul
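To get the "run every 30 minutes" part, a script like the one above can be scheduled with cron, which most shared hosts expose through the control panel. An illustrative crontab line, assuming the script were saved as backup.php in the home directory (that path is made up):

```shell
# every 30 minutes, run the backup script (illustrative path)
*/30 * * * * php /home/kwah/backup.php
```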

  7. #7
    SitePoint Zealot
Thanks ALL, I'm gonna look into that and maybe finish an FTP site I started a while ago =D

Thank you php_daemon. As far as I can see, that does a single directory without searching deeper levels? Although I'm unsure about the glob("*.*") part; I assume it searches for filenames with a '.' and wildcard characters either side?

I greatly appreciate the help you have both given, thank you.
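On the glob("*.*") question: the pattern is matched literally, so it only catches names containing a dot (with anything either side of it); a file named simply README would be skipped. A quick sketch, runnable anywhere PHP is available, using a scratch directory:

```php
<?php
// Compare glob("*.*") with glob("*") in a throwaway directory.
$dir = sys_get_temp_dir() . "/glob_demo_" . getmypid();
mkdir($dir);
touch("$dir/page.php");   // has a dot
touch("$dir/README");     // no dot: invisible to "*.*"

$with_dot = array_map('basename', glob("$dir/*.*"));
$all      = array_map('basename', glob("$dir/*"));
sort($with_dot);
sort($all);
// $with_dot contains only "page.php"; $all contains both files.
```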

  8. #8
php_daemon
    Ah, really sorry, that should be glob("*") to catch all files.

    And no, it just scans one level. As a matter of fact it treats the subdirectories as files, hmm... let me try another one:
PHP Code:
function check_dir($dir, $root="") {
  global $time;
  chdir($dir);
  $files = glob("*");
  foreach ($files as $file) {
    if (is_dir($file)) check_dir($file, $dir."/");
    elseif ($time - filectime($file) < 0) copy($file, "backup/$root$dir/$file");
  }
}

$time = (int) trim(file_get_contents("lastexec.txt"));

check_dir("uploads");

file_put_contents("lastexec.txt", time());
    This should scan the entire hierarchy under a given directory.
    Last edited by php_daemon; Jan 6, 2007 at 18:42.
    Saul
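Worth noting: chdir() changes the working directory for the whole process, and the function never changes back, so once the recursion descends into a subdirectory, later siblings are resolved against the wrong directory. PHP's SPL iterators sidestep that statefulness. A sketch of the same "copy anything modified since the last run" idea (it compares modification time via getMTime() rather than filectime(), and the directory names are illustrative):

```php
<?php
// Copy every file under $src modified after $since into $dst,
// recreating the relative directory layout as needed.
function backup_newer(string $src, string $dst, int $since): void {
    $it = new RecursiveIteratorIterator(
        new RecursiveDirectoryIterator($src, FilesystemIterator::SKIP_DOTS)
    );
    foreach ($it as $info) {
        if (!$info->isFile() || $info->getMTime() <= $since) continue;
        $rel    = substr($info->getPathname(), strlen($src) + 1);
        $target = "$dst/$rel";
        if (!is_dir(dirname($target))) {
            mkdir(dirname($target), 0777, true); // create parents in one call
        }
        copy($info->getPathname(), $target);
    }
}
```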

  9. #9
    SitePoint Zealot
Hmm... so what do I set $root and $dir to? The file structure is:
/home/kwah/homedir/public_html/(all other files which I want to search through)

The highest level I can access is homedir/ downwards.

  10. #10
php_daemon
All you need to do is set a starting point, for example:

check_dir("/home/kwah/homedir/public_html/upload");

Note: I've edited the function so that you don't need to pass an empty string as root. Just pass the starting directory.

Though now that I come to think of it, it will fail copying the files from subdirectories, as there will be no such subdirectories in backup initially.

OK:

PHP Code:
$backup = "/home/kwah/homedir/backup"; // path to backup directory

function check_dir($dir, $root="") {
  global $time, $backup;
  chdir($dir);
  $files = glob("*");
  foreach ($files as $file) {
    if (is_dir($file)) check_dir($file, $dir."/");
    elseif ($time - filectime($file) < 0) {
      if (!file_exists("$backup/$root$dir")) {
        $subdirs = explode("/", "$root$dir");
        $path = "$backup/";
        foreach ($subdirs as $subdir) {
          $path .= "$subdir/";
          mkdir($path);
        }
      }
      copy($file, "$backup/$root$dir/$file");
    }
  }
}

$time = (int) trim(file_get_contents("/home/kwah/homedir/lastbackup.txt"));

check_dir("/home/kwah/homedir/public_html/");

file_put_contents("/home/kwah/homedir/lastbackup.txt", time());
    Saul
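One small simplification to the above: the explode() loop that builds up the backup path piece by piece can be replaced by mkdir()'s third argument, which creates all missing parent directories in a single call:

```php
<?php
// mkdir() with $recursive = true creates every missing intermediate directory.
$path = sys_get_temp_dir() . "/mkdir_demo_" . getmypid() . "/a/b/c";
mkdir($path, 0777, true);
// is_dir($path) is now true: a, a/b and a/b/c were all created.
```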

  11. #11
Dan Grossman
    Join Date
    Aug 2000
    Location
Philadelphia, PA
    Posts
    20,580
    Rather than reinvent the wheel in PHP, and waste bandwidth, why not just rsync the two directories? Rsync should be ready and waiting on virtually any UNIX/Linux box.

  12. #12
    SitePoint Zealot
Thanks php_daemon, I'll give it a try ASAP.


Hmm, Dan... would you care to explain further, considering I don't have any access to server configuration?

  13. #13
    SitePoint Member
    Join Date
    Jan 2006
    Posts
    17
How about a desktop app? If it's just for backup, a tool like SiteVault will back up everything locally, create restore points, and copy only the new/updated files and DB...

George
MySQL + FTP Website Backup from your desktop. Site-Vault.com

  14. #14
    SitePoint Zealot
Sorry, nah; I don't own a PC, and it's a hosted website to which I don't have server access.

