  1. #1
    SitePoint Addict
    Join Date
    Jan 2008
    Location
    Palm Harbor, FL
    Posts
    348
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)

    Question Continuing to run a script on the server after the user is redirected

    After a user uploads a video, it must be encoded. This takes quite a bit of time.

    I want to redirect the user after their upload is complete, allowing them to continue browsing the site. Meanwhile, the upload script continues to run on the server and encode the video, without the user having to remain on the upload page.

    How can I do this?

  2. #2
    SitePoint Zealot cholmon's Avatar
    Join Date
    Mar 2004
    Location
    SC
    Posts
    197
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    You'll probably want to build a video processing system that simply adds the raw file to a queue and redirects the user to a "thanks, your video is being encoded" page; that way the encoding is asynchronous.

    The user could either be notified (via email, or maybe a simple on-site messaging system) when the encoding completes, or just check the "My Videos" section of the site to see the progress of their uploaded videos.

    The encoding itself could then be implemented in any language (PHP, Python, Java)... your "queue" could be as simple as a tmp directory that an external process watches: it encodes anything that shows up, moves the resulting file to the appropriate directory for serving, and updates your site DB with whatever metadata is relevant (user ID, video title, etc).
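    Just to make that concrete, the "external process" really only needs to be a loop along these lines (a rough sketch; the paths, polling interval, output format and ffmpeg options are all made up):

    PHP Code:
    <?php
    // crude directory-watching worker, run from the CLI (not through the web server)
    $queueDir = '/var/spool/video-queue';   // raw uploads get dropped here (made-up path)
    $doneDir  = '/var/www/videos';          // encoded files get served from here (made-up path)

    while (true) {
        $files = glob($queueDir . '/*');
        if (!$files) {
            sleep(10);                      // nothing queued, check again shortly
            continue;
        }
        foreach ($files as $src) {
            $dst = $doneDir . '/' . basename($src) . '.flv';
            // escapeshellarg() keeps odd filenames from breaking the command
            exec('ffmpeg -y -i ' . escapeshellarg($src) . ' ' . escapeshellarg($dst));
            unlink($src);                   // encoded, so remove it from the queue
            // ...update the site DB here with whatever metadata is relevant...
        }
    }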
    Drew C King: PHP Developer
    <?= $short_tags++ ?>

  3. #3
    SitePoint Addict
    Join Date
    Jan 2008
    Location
    Palm Harbor, FL
    Posts
    348
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    Sounds like a good plan, but what external process would I use to watch the folder and run a php script whenever a file shows up?

    Could it be a cron job that runs a script checking the folder at intervals? ...or would that be less efficient than what you had in mind?

  4. #4
    SitePoint Zealot cholmon's Avatar
    Join Date
    Mar 2004
    Location
    SC
    Posts
    197
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    Yeah, a cron job might be the easiest way to kick off the queue processor, e.g. set it to check for unprocessed files every minute.

    You could also use at to schedule the queue processor as a one-off, instead of having it run over and over with nothing to do... but I guess that depends on how often videos will be uploaded. You could have your upload script make a system call like at -f ~/videos/encode.php now + 1 minute as soon as the file has been successfully saved on the server. That command tells your system to run the encode.php script - assuming it's executable - within a minute (actually when your system clock ticks over to the next minute). Just remember that it will be executed as the user PHP runs as, e.g. nobody, httpd, apache, etc.
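    For example, the call from the upload script could look something like this (just a sketch; since at hands the job file to /bin/sh, I'm assuming encode.sh here is a small shell wrapper that runs the actual PHP encoder):

    PHP Code:
    // right after the uploaded file has been saved successfully...
    // encode.sh is a hypothetical shell wrapper around the real encoder
    exec('at -f /path/to/encode.sh now + 1 minute');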

    Alternatively, you could write your own stand-alone daemon that runs in the background, watches the raw upload directory for new video files, and handles them appropriately. If you went that route, you'd probably be better off using a language other than PHP... memory leaks can become a problem with long-running PHP processes.
    Drew C King: PHP Developer
    <?= $short_tags++ ?>

  5. #5
    SitePoint Enthusiast jameso's Avatar
    Join Date
    May 2002
    Location
    Melbourne, Australia
    Posts
    55
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    I would use some kind of asynchronous job queuing and execution system.

    For now, the system might just be responsible for encoding videos. Later down the track, it could be used for other intensive tasks such as reporting, batch email sending, etc.

    For example, you could look at using Automattic's Jobs script (I think it's what they use behind the scenes on WordPress.com). Also see that page for the code for the WordPress.com video encoding solution.

    Alternatively, you could use a queuing service such as Amazon's Simple Queue Service to queue the messages, and then use a PHP daemon that is always running on your server, such as PEAR's System_Daemon or a multithreaded PHP daemon, to actually process the messages (i.e. encode the videos).
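    Whichever queue you pick, the daemon side is basically just a polling loop. A bare-bones sketch (assuming a SQLite file jobs.db with a jobs table containing id, payload and status columns - all made up for illustration):

    PHP Code:
    <?php
    // minimal job worker loop; a real one would add locking, retries and error handling
    $db = new PDO('sqlite:jobs.db');
    $db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

    while (true) {
        $job = $db->query("SELECT id, payload FROM jobs WHERE status = 'pending' ORDER BY id LIMIT 1")
                  ->fetch(PDO::FETCH_ASSOC);
        if (!$job) {
            sleep(10);                                              // queue is empty, poll again
            continue;
        }
        $db->exec("UPDATE jobs SET status = 'running' WHERE id = " . (int) $job['id']);
        // ...do the real work here: encode a video, send a batch of emails, build a report...
        $db->exec("UPDATE jobs SET status = 'done' WHERE id = " . (int) $job['id']);
    }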

  6. #6
    SitePoint Addict
    Join Date
    Jan 2008
    Location
    Palm Harbor, FL
    Posts
    348
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    After the upload is complete, would it be possible to have the upload script check if the encoding script is currently running?

    If possible, the upload script could be the trigger to execute the encoding script (which encodes any file in a specified folder).

    When there are no more files left to be encoded, the encoding script will stop... until triggered again by an upload.

    This method seems like it would be very efficient. (if possible)

  7. #7
    SitePoint Addict
    Join Date
    Jan 2008
    Location
    Palm Harbor, FL
    Posts
    348
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)

    Arrow

    I have been trying to get my upload script to redirect the user and keep running, but I just can't seem to make it work. I have spent many hours modifying the code below. Here is what I have as of right now:

    PHP Code:
    //Redirect user and continue script
    header("Location: index.php");
    ob_end_clean();
    header("Connection: close");
    header("Content-Encoding: none");
    ignore_user_abort(true);
    set_time_limit(1800);
    ob_start();
    header("Content-Length: ".ob_get_length());
    ob_end_flush();
    flush();
    ob_end_clean(); 
    It does successfully redirect the user to the index page, but only after the entire script has finished.

  8. #8
    SitePoint Enthusiast jameso's Avatar
    Join Date
    May 2002
    Location
    Melbourne, Australia
    Posts
    55
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    Hi Morthian,

    As far as I am aware you need some kind of asynchronous queuing system/daemon or a scheduled cron job for this (see my previous post for more details).

    Otherwise the user's browser will sit waiting for a response.

    James

  9. #9
    SitePoint Addict
    Join Date
    Jan 2008
    Location
    Palm Harbor, FL
    Posts
    348
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    Quote Originally Posted by jameso View Post
    Hi Morthian,

    As far as I am aware you need some kind of asynchronous queuing system/daemon or a scheduled cron job for this (see my previous post for more details).

    Otherwise the user's browser will sit waiting for a response.

    James
    I have seen many people talk about having a script redirect the user and then keep running, without the user having to wait for the entire script to finish.

    See the user comments here for one example:
    http://php.net/manual/en/features.co...n-handling.php

  10. #10
    SitePoint Guru Ruben K.'s Avatar
    Join Date
    Jun 2005
    Location
    Alkmaar, The Netherlands
    Posts
    693
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    psssssshh try this

    Code:
    header( "Location: http://www.google.com" );
    mysql_query( "INSERT INTO anytable VALUES( 'test' )" );
    (execution is NOT halted after redirection, by default anyway!)

  11. #11
    SitePoint Addict
    Join Date
    Jan 2008
    Location
    Palm Harbor, FL
    Posts
    348
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    A file will submit successfully if I close the browser during the script. This means the script continues to run without the user's presence.

    However, I still cannot seem to view other pages on the site until the script has finished its job. (at least not in the same browser)

    If I wait, it does redirect me to the location specified, but not until the script has finished (rather than redirecting BEFORE processing the code below the Location header).

    The "redirect and continue" code in my script has not changed since I last posted it. [see 4 posts up]

  12. #12
    SitePoint Wizard bronze trophy
    Join Date
    Jul 2008
    Posts
    5,757
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    If the browser immediately redirects to the other URL, but it just doesn't load, this is probably a session locking issue. Call session_write_close() before redirecting.

    You should still consider taking the advice the other posters gave you: just insert the job into a queue and let a daemon or cron job pick it up. You could also trigger a PHP script to execute as a background process via exec(). Ideally, don't have long-running scripts run inside your web server, so write a PHP script that executes the CLI php binary directly.
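    In other words, something like this at the end of the upload handler (a sketch only; the paths are made up, and the output redirection is there so exec() doesn't sit waiting for the encoder to finish):

    PHP Code:
    // let other requests from this user proceed while the encoder runs
    session_write_close();

    // hand the heavy lifting to the CLI binary in the background
    $id = (int) $submission_id;   // assumes the upload code set $submission_id
    exec('php /path/to/encoder.php ' . $id . ' > /dev/null 2>&1 &');

    // now send the user on their way
    header('Location: index.php');
    exit;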

  13. #13
    SitePoint Addict
    Join Date
    Jan 2008
    Location
    Palm Harbor, FL
    Posts
    348
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    Quote Originally Posted by crmalibu View Post
    If the browser immediately redirects to the other URL, but it just doesn't load, this is probably a session locking issue. Call session_write_close() before redirecting.

    You should still consider taking the advice the other posters gave you: just insert the job into a queue and let a daemon or cron job pick it up. You could also trigger a PHP script to execute as a background process via exec(). Ideally, don't have long-running scripts run inside your web server, so write a PHP script that executes the CLI php binary directly.
    The browser doesn't immediately redirect; it waits for the entire script to finish first.

    I do plan on using cron, but I'd still like to know how to redirect a user and continue the script. I have a download script with the same issue, in which the user must wait for the entire file to finish downloading before they can visit another page.

    Also, how would I use exec to run a script in the background? I already use exec to run the ffmpeg commands. PHP waits for executed commands to finish before the user can visit another page.

  14. #14
    SitePoint Zealot cholmon's Avatar
    Join Date
    Mar 2004
    Location
    SC
    Posts
    197
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    Quote Originally Posted by Morthian View Post
    Also, how would I use exec to run a script in the background? I already use exec to run the ffmpeg commands. PHP waits for executed commands to finish before the user can visit another page.
    Here's an example of how you can use at:

    test.sh: Make this bash script executable by your web server's user...
    Code:
    #!/bin/sh
    date >> log.txt
    test.php: put this in the same dir as the shell script
    PHP Code:
    <pre>
    <?php readfile('log.txt'); ?>
    </pre>

    <?php
    if (isset($_GET['run'])) {
        echo 'adding test.sh to the at queue again...';
        exec('at -f test.sh now + 1 minute');
    }
    Then test it by hitting test.php?run=1 once, wait at least one minute, then hit test.php and you should see it print the contents of the log file (the date on each line).

    Just be sure that your web server's user is not listed in /etc/at.deny, otherwise a whole lotta nothing will happen.

    The at daemon will run the shell script as the web server user, so that script is where you'd put the call to your video encoder. At that point, the encoding process is being kicked off by atd, not by the web server, so it'll be its own process, totally independent of Apache.

    Also be careful not to allow any dirty data to be injected into your exec() call.
    Drew C King: PHP Developer
    <?= $short_tags++ ?>

  15. #15
    SitePoint Addict
    Join Date
    Jan 2008
    Location
    Palm Harbor, FL
    Posts
    348
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    Thanks for the advice cholmon.

    My upload script processes all kinds of files (audio, images, games, etc.).

    After it is finished dealing with the main upload, it may also create a thumbnail, which is either based on the main upload (like a video frame snapshot) or a separately uploaded image. I don't know how resource-intensive this process is...

    Should I queue all types of uploads, or just those that need to be encoded (i.e. audio, video)?

  16. #16
    SitePoint Addict
    Join Date
    Jan 2008
    Location
    Palm Harbor, FL
    Posts
    348
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    I am a little confused about the 'at' program...

    If an upload triggers the encoding script to execute while it is already running (encoding another file), will there be multiple instances of the encoding script running on the server, or will the script be queued to run once the other instance of it finishes?

    Also, can URL variables be used?
    example:
    PHP Code:
    exec('at -f _scripts/encode.php?id='.$submission_id.' now + 1 minute'); 

  17. #17
    SitePoint Zealot cholmon's Avatar
    Join Date
    Mar 2004
    Location
    SC
    Posts
    197
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    Quote Originally Posted by Morthian View Post
    I am a little confused about the 'at' program...

    If an upload triggers the encoding script to execute while it is already running (encoding another file), will there be multiple instances of the encoding script running on the server, or will the script be queued to run once the other instance of it finishes?
    The at queue simply holds commands to be run at some future time, in this case within a minute. If you were to call at three times in a row, then three commands would be queued up, and when the next minute rolls around, those three commands would be executed. If they are all identical commands, then yes, three separate processes would get kicked off one after the other. Whether each successive command knows about the others depends entirely on how you write the script. For the most part, it'll behave exactly as if you were logged into your server via SSH and typed that at command on the command line.

    Quote Originally Posted by Morthian View Post
    Also, can URL variables be used?
    example:
    PHP Code:
    exec('at -f _scripts/encode.php?id='.$submission_id.' now + 1 minute'); 
    No, exec() simply executes an external program, so you'd need to pass parameters to your encoding script just like a regular command-line program; see http://www.php.net/manual/en/features.commandline.php

    On a side note, a more recent thread in PHP Application Design suggests backgrounding an external program/script like this:

    PHP Code:
    <?php
    // ... handle upload ...

    // be VERY careful here, possible injection attack
    $submission_id = (int) $_POST['id'];

    // run the encoder, '&' puts it in the background
    exec('php -f _scripts/encode.php ' . $submission_id . ' &');
    The command that gets run would look like this if you did it manually on the command line:
    Code:
    php -f _scripts/encode.php 123 &
    So the encode.php script would need to examine $argv to find the submission ID.
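    A bare-bones encode.php along those lines might start off like this (sketch only):

    PHP Code:
    <?php
    // encode.php - run from the command line, not through the web server
    // $argv[0] is the script name, $argv[1] is the first argument
    if ($argc < 2) {
        fwrite(STDERR, "usage: php encode.php <submission_id>\n");
        exit(1);
    }
    $submission_id = (int) $argv[1];
    // ...look up the submission, run ffmpeg, update the DB, etc...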

    You'd probably be better off using that method instead of the at queue, honestly. It'll start running the encoder immediately instead of waiting for the next minute to tick by, but the overall effect will be the same. I originally used at in this manner a few years ago with a little custom virtual host control panel that needed to reload Apache when a new site was created. I couldn't have the server reload itself directly, though, because as soon as the command ran, the server immediately reloaded, killing the response and confusing the user ("every time I create a new site, it says page cannot be displayed!"). Using at was a convenient way to make sure there was enough time for the response to get back to the site admin, with a message saying, "Your new site will be created within a minute".

    You don't have that kind of requirement though, so simply backgrounding the exec() call should suffice.
    Drew C King: PHP Developer
    <?= $short_tags++ ?>

  18. #18
    SitePoint Addict
    Join Date
    Jan 2008
    Location
    Palm Harbor, FL
    Posts
    348
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    On a site where videos are constantly being uploaded, it would be better to have them encoded quickly one-by-one, right? (rather than allowing several videos to be slowly encoded all at once)

  19. #19
    SitePoint Wizard bronze trophy
    Join Date
    Jul 2008
    Posts
    5,757
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    Allowing potentially unlimited instances of the encoding script to run at the same time could hog a lot of memory and CPU if a lot of videos got submitted in a short window of time. But it's probable that your system could process more videos per minute if you allowed multiple encoders to run simultaneously.

    Use a job queue as suggested earlier, even if you just write your own simple one. You want to manage concurrency.
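    Even a lock file gets you most of the way there if you don't want a full queue system yet. A sketch of the simplest version, for a cron-run worker (the lock path and the process_pending_videos() helper are hypothetical):

    PHP Code:
    <?php
    // only one encoder pass runs at a time; later cron runs just exit early
    $lock = fopen('/tmp/encoder.lock', 'c');
    if (!$lock || !flock($lock, LOCK_EX | LOCK_NB)) {
        exit;                        // a previous run is still busy, let it finish
    }
    process_pending_videos();        // hypothetical: encode whatever is queued
    flock($lock, LOCK_UN);
    fclose($lock);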

  20. #20
    SitePoint Addict
    Join Date
    Jan 2008
    Location
    Palm Harbor, FL
    Posts
    348
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    Alright, this is what I have in my upload script right now:

    PHP Code:
    //...if submission needs encoding

    //Execute encoder
    exec("php -f encoder.php ".$submission_id." &");

    //Redirect user
    header('Location: index.php?page=submit&msg=1'); 
    This seems to cause the script to hang indefinitely. I checked the error logs but didn't find anything.

  21. #21
    SitePoint Wizard bronze trophy
    Join Date
    Jul 2008
    Posts
    5,757
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    PHP Code:
    exec("php -f encoder.php ".$submission_id." > /dev/null 2>&1 &"); 
    exec() wants the output of the program you executed. It will wait until it reads to the end of the stdout and stderr streams, which won't happen until that program ends. So you need to redirect the program's output somewhere else. /dev/null is a black hole, but you could use a file if you want to capture the output.
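    For example, pointing the redirect at a file instead (the path is just an example) lets you read back whatever the encoder printed:

    PHP Code:
    exec("php -f encoder.php ".$submission_id." >> /tmp/encoder.log 2>&1 &"); 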

  22. #22
    SitePoint Addict
    Join Date
    Jan 2008
    Location
    Palm Harbor, FL
    Posts
    348
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    Quote Originally Posted by crmalibu View Post
    PHP Code:
    exec("php -f encoder.php ".$submission_id." > /dev/null 2>&1 &"); 
    exec() wants the output of the program you executed. It will wait until it reads to the end of the stdout and stderr streams, which won't happen until that program ends. So you need to redirect the program's output somewhere else. /dev/null is a black hole, but you could use a file if you want to capture the output.
    Does this also redirect php errors to a black hole instead of the error log?

    The upload script successfully redirected me, but the encoding script must have failed because the submission never showed up. There is nothing in the error log.

  23. #23
    SitePoint Addict
    Join Date
    Jan 2008
    Location
    Palm Harbor, FL
    Posts
    348
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    Ok, this just isn't working for me, and I can't figure out how to view any errors that might have occurred in the encoder script. =(

  24. #24
    SitePoint Addict
    Join Date
    Jan 2008
    Location
    Palm Harbor, FL
    Posts
    348
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    Code:
    exec("php -f encoder.php ".$submission_id." > /dev/null 2>&1 &");
    PHP Code:
    //Convert video file
    if ($file_type == 'video/mp4') {

        //Define temp file
        $temp_file = '_temp_files/'.$submission_id.'_x264.mp4';

        //Run conversion
        exec('ffmpeg -y -i '.$filename.' -an -pass 1 -vcodec libx264 -vpre fastfirstpass -b 2048k -bt 2048k -threads 0 -f mp4 /dev/null && ffmpeg -i '.$filename.' -acodec libfaac -ab 128000 -ac 2 -ar 44100 -pass 2 -vcodec libx264 -vpre hq -b 2048k -bt 2048k -threads 0 '.$temp_file); 
    The encoder script seems to fail right here at the conversion; the file defined in $temp_file is never created. Any ideas why?
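    In case it matters, this is how I'm planning to capture the ffmpeg output next so I can see where it dies (untested sketch; $cmd would just be the command string from the exec() call above, and the log path is arbitrary):

    PHP Code:
    // run the same ffmpeg command, but keep its output and exit code
    exec($cmd . ' 2>&1', $output, $return_code);
    file_put_contents('/tmp/ffmpeg_debug.log',
        implode("\n", $output) . "\nexit code: " . $return_code . "\n"); 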

