  1. #1
    SitePoint Wizard Zaggs

    How to get around "Maximum execution time" error

    Hi Guys,

    I am receiving the following error:

    PHP Code:
    Fatal error: Maximum execution time of 30 seconds exceeded 
    The script I am using copies a directory to a new location on the server. However, due to the size of the directory, the script always times out with the above error.

    The script:

    PHP Code:
    $dir = opendir($src);
    @mkdir($dst);
    while (false !== ($file = readdir($dir))) {
        if (($file != '.') && ($file != '..')) {
            if ($file == '.svn') {
                continue;
            }
            if (is_dir($src . '/' . $file)) {
                $this->BackupFiles($src . '/' . $file, $dst . '/' . $file);
            } else {
                copy($src . '/' . $file, $dst . '/' . $file);
            }
        }
    }
    closedir($dir); 
    I know that it's possible to stop this error using set_time_limit at the start of the script. However, this may be disabled on some systems.

    Is there a better way to prevent this error?

  2. #2
    SitePoint Evangelist priti
    Can we use ini_set('max_execution_time', $seconds)?
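    Something along these lines, assuming the host allows the limit to be raised at runtime (safe mode and many shared hosts disable it):
    PHP Code:
    // raise the limit to five minutes for this request only
    @ini_set('max_execution_time', 300);
    // or reset the timer and allow another 300 seconds from this point
    @set_time_limit(300); 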

  3. #3
    dooby dooby doo silver trophybronze trophy
    spikeZ's Avatar
    Join Date
    Aug 2004
    Location
    Manchester UK
    Posts
    13,806
    Mentioned
    158 Post(s)
    Tagged
    3 Thread(s)
    If you are on a *nix box, you could run a shell_exec command:
    PHP Code:
    shell_exec("cp -r $src $dest"); 
    Alternatively you could read all the files into an array and split the array into blocks of 50, 100 or whatever and process each block in a loop.
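    If the paths can contain spaces or other awkward characters, it is safer to escape them first, e.g.:
    PHP Code:
    shell_exec("cp -r " . escapeshellarg($src) . " " . escapeshellarg($dest)); 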

  4. #4
    SitePoint Wizard Zaggs
    Quote Originally Posted by spikeZ
    If you are on a *nix box, you could run a shell_exec command:
    PHP Code:
    shell_exec("cp -r $src $dest"); 
    Alternatively you could read all the files into an array and split the array into blocks of 50, 100 or whatever and process each block in a loop.
    Hi spikeZ,

    I am interested in your second suggestion. How could I split the array into blocks as suggested, and how would this prevent the maximum execution time from being reached?

  5. #5
    dooby dooby doo silver trophybronze trophy
    spikeZ's Avatar
    Join Date
    Aug 2004
    Location
    Manchester UK
    Posts
    13,806
    Mentioned
    158 Post(s)
    Tagged
    3 Thread(s)
    The execution timer starts when the script begins its process; if you split the work into different processes, each one restarts the 'timer'.

    So for the particular project I was doing where this problem came up, I wanted to process 5 or 6 folders of images recursively.


    Here is some sample code for you to look over.
    PHP Code:
    function processImages($dirName, $width, $destination) {

        /** glob all the files in the target directory into an array */
        $files = glob($dirName . '/*');

        /** count the files */
        $totalFiles = count($files);

        /** sort the files */
        asort($files);

        /** files per array - in this case splitting up into blocks of 50 */
        $fpa = 50;

        /** how many arrays are needed? */
        $arrays_required = floor($totalFiles / $fpa);

        $start = 0;
        $display = 0;
        $count = 0;

        /** slice the main array into blocks using array_slice */
        for ($i = 0; $i <= $arrays_required; $i++) {
            $array[$i] = array_slice($files, $start, $fpa);
            $start = $start + $fpa;
        }

        /** loop through the blocks one at a time */
        foreach ($array as $key => $data) {
            foreach ($data as $sKey => $sData) {

                $x = explode("/", $sData);

                $fNamePos = count($x) - 1;
                $fName = $x[$fNamePos];
                $filename = $fName;

                $this->convertImage($sData, $destination, $filename);

                $display = ++$count;
            }
        }

        $rDisplay = '<b>' . $dirName . ' Processed ' . $display . ' images</b><br />';
        return $rDisplay;
    } 

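    You would then call it once per folder, something like this (the folder names and width are just examples):
    PHP Code:
    $report  = $this->processImages('images/gallery1', 800, 'backup/gallery1');
    $report .= $this->processImages('images/gallery2', 800, 'backup/gallery2');
    echo $report; 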

  6. #6
    SitePoint Wizard Zaggs
    Hi spikeZ,

    Because there are so many files and directories to loop through, the first problem I am encountering is actually listing all the files.

    I am using the following function to list the files, but it's timing out like before:

    PHP Code:
    function ListFiles($src){
        // Get a list of all files, recursing into subdirectories
        $files = array();
        $dir = opendir($src);
        while (false !== ($file = readdir($dir))) {
            if (($file != '.') && ($file != '..')) {
                if ($file == '.svn') {
                    continue;
                }
                if (is_dir($src . '/' . $file)) {
                    $files = array_merge($files, $this->ListFiles($src . '/' . $file));
                } else {
                    $files[] = $file;
                }
            }
        }
        closedir($dir);
        return $files;
    } 
    How can I overcome this problem?

  7. #7
    SitePoint Wizard crmalibu
    Quote Originally Posted by Zaggs
    I know that it's possible to stop this error using set_time_limit at the start of the script. However, this may be disabled on some systems.

    Is there a better way to prevent this error?
    Sounds like a script you want to distribute. You should prevent a single directory from getting too large, so that you can work in more manageable chunks. You can split the procedure up into multiple HTTP requests until the job has completed, kinda like pagination. Use multiple directories, or store the filenames in a database, to make the jobs easier to split up and resume.
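    A rough sketch of the multi-request idea (the file list, paths and batch size are just placeholders):
    PHP Code:
    // backup_step.php - rough sketch; assumes filelist.txt already holds one relative path per line
    $src = '/path/to/public_html';   // placeholder source
    $dst = '/path/to/backup';        // placeholder destination
    $batchSize = 50;

    $offset = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;
    $files  = file('filelist.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);

    foreach (array_slice($files, $offset, $batchSize) as $file) {
        $target = $dst . '/' . $file;
        if (!is_dir(dirname($target))) {
            mkdir(dirname($target), 0777, true); // recreate subdirectories as needed
        }
        copy($src . '/' . $file, $target);
    }

    if ($offset + $batchSize < count($files)) {
        // hand the rest off to a fresh request so the 30 second timer starts over
        header('Location: backup_step.php?offset=' . ($offset + $batchSize));
        exit;
    }

    echo 'Backup complete'; 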

  8. #8
    SitePoint Wizard Zaggs
    Quote Originally Posted by crmalibu
    Sounds like a script you want to distribute. You should prevent a single directory from getting too large, so that you can work in more manageable chunks. You can split the procedure up into multiple HTTP requests until the job has completed, kinda like pagination. Use multiple directories, or store the filenames in a database, to make the jobs easier to split up and resume.
    What I am actually doing is taking a backup of the entire "public_html" directory of an account, and yes, the script will be distributed.

    My question at the moment is: how can I list all the files inside the "public_html" directory without the script timing out after 30 seconds?

