  1. #1
    SitePoint Zealot (Join Date: May 2005; Posts: 172)

    Help Saving Images from Server to Server

    Basically I want to grab about 1,500 images from an old server that a client can't remember the FTP details for. I have something working with cURL, but it keeps timing out before all the images are saved.

    What's the best way of doing this sort of thing?

    Code I have at the moment:
    PHP Code:
    ...
    private function _save_image($img, $fullpath)
    {
        if ( function_exists('curl_init') && function_exists('curl_setopt') )
        {
            // Fetch the remote image into memory with cURL.
            $ch = curl_init($img);
            curl_setopt($ch, CURLOPT_HEADER, 0);
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
            curl_setopt($ch, CURLOPT_BINARYTRANSFER, 1);

            $rawdata = curl_exec($ch);
            curl_close($ch);

            // Remove any existing copy so fopen() in 'x' mode doesn't fail.
            if ( file_exists($fullpath) )
            {
                unlink($fullpath);
            }

            // Write the downloaded data to disk.
            $fp = fopen($fullpath, 'x');
            fwrite($fp, $rawdata);
            stream_set_timeout($fp, 180);
            fclose($fp);
        }
        else
        {
            trigger_error('cURL is not installed', E_USER_ERROR);
        }
    }
    ...

  2. #2
    SitePoint Wizard bronze trophy (Join Date: Jul 2008; Posts: 5,757)
    set_time_limit() or run it from the command line. You may also want to break the job up into smaller chunks of work.

    PS: copy()
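
    For illustration, a minimal sketch of the copy() approach with the time limit lifted. The URL list, file name, and target directory are placeholders, and fetching a URL with copy() assumes allow_url_fopen is enabled:
    PHP Code:
    <?php
    // Lift the script execution time limit for this long-running job.
    set_time_limit(0);

    // Hypothetical list of remote image URLs, one per line.
    $images   = file('image_urls.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
    $localDir = '/var/www/site/images/';

    foreach ($images as $url) {
        // copy() can read directly from a URL when allow_url_fopen is on.
        if ( !copy($url, $localDir . basename($url)) ) {
            echo "Failed to copy $url\n";
        }
    }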

  3. #3
    SitePoint Zealot (Join Date: May 2005; Posts: 172)
    How would I go about breaking it up into smaller chunks?

  4. #4
    SitePoint Wizard bronze trophy (Join Date: Jul 2008; Posts: 5,757)
    For example, download five images per script execution. If you want this to run through the webserver instead of via the command line:

    PHP Code:
    <meta http-equiv="refresh" content="1;url=http://example.com/script.php?start=<?php echo $_GET['start'] + 5?>" />
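
    A rough sketch of what the batch script behind that refresh might look like, assuming a hypothetical image_urls.txt listing the source URLs; each request copies five images, and the refresh above reloads the page with the next offset:
    PHP Code:
    <?php
    // Hypothetical full list of remote image URLs.
    $images = file('image_urls.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);

    $start = isset($_GET['start']) ? (int) $_GET['start'] : 0;
    $batch = array_slice($images, $start, 5);   // five images per request

    foreach ($batch as $url) {
        copy($url, '/var/www/site/images/' . basename($url));
    }

    if ($start + 5 >= count($images)) {
        echo 'All images copied.';   // done; no further refresh needed
    }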

  5. #5
    SitePoint Zealot (Join Date: May 2005; Posts: 172)
    That seems a bit hacky. Are there any other batch-processing methods?

  6. #6
    Theoretical Physics Student bronze trophy Jake Arkinstall's Avatar
    Join Date
    May 2006
    Location
    Lancaster University, UK
    Posts
    7,062
    Mentioned
    2 Post(s)
    Tagged
    0 Thread(s)
    Cron jobs are probably what you're looking for.

    That said, I find it a bit of a waste if this is a one-time run. Using cURL is also a bit of a waste of processing when the normal file functions would do fine - for example, the copy() function crmalibu suggested.

    If you're only going to run this once anyway, I think a better solution would be to write a simple console application in whatever language you want (C#, Java or Python would probably be the most suitable), download all the files, then upload them via FTP. The program would take under five minutes to write; after that it's just a case of waiting for the images to download and upload.
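
    If you'd rather stay in PHP, here is a rough sketch of that download-then-FTP-upload idea as a command-line script; the URL list, temp directory, and FTP host/credentials are all placeholders:
    PHP Code:
    <?php
    // Run once from the command line: php migrate_images.php
    set_time_limit(0);

    // Hypothetical source URLs and FTP details for the new server.
    $images = file('image_urls.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
    $tmpDir = sys_get_temp_dir() . '/images/';
    if ( !is_dir($tmpDir) ) { mkdir($tmpDir, 0777, true); }

    $ftp = ftp_connect('ftp.new-server.example.com');
    ftp_login($ftp, 'username', 'password');

    foreach ($images as $url) {
        $local = $tmpDir . basename($url);

        // Download to a local temp file, then push it to the new server.
        if ( copy($url, $local) ) {
            ftp_put($ftp, 'images/' . basename($url), $local, FTP_BINARY);
        }
    }

    ftp_close($ftp);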
    Jake Arkinstall
    "Sometimes you don't need to reinvent the wheel;
    Sometimes its enough to make that wheel more rounded"-Molona

  7. #7
    SitePoint Wizard bronze trophy (Join Date: Jul 2008; Posts: 5,757)
    I think running it through the webserver is hacky for this purpose. That's why I suggested using PHP from the command line; script timeouts are then a thing of the past.
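
    Concretely, when the script runs via the CLI the refresh trick isn't needed at all; max_execution_time defaults to 0 there, so a single loop can work through the whole list (image_urls.txt is again a placeholder):
    PHP Code:
    <?php
    // Invoked as: php grab_images.php
    $images = file('image_urls.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);

    foreach ($images as $i => $url) {
        copy($url, '/var/www/site/images/' . basename($url));
        echo 'Saved ' . ($i + 1) . ' of ' . count($images) . "\n";   // simple progress output
    }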

