"closing" page but continue elaboration

Hi all,
my script needs to send some output to the browser and then "close the page" so that the browser goes into the "completed" state,

but I still need to proceed (server-side only) with the execution of the script.

Example:

<?php
echo("some output");
// method to "close" the connection to the browser

// more statements
my_function();

// etc...

?>
script completed.

Please note I don't want to stop the output or use output buffering;
I just want to signal to the browser that the page is completed.

Thanks for any suggestions.

http://php.net/manual/en/features.connection-handling.php

Thanks,
I probably didn't explain myself well.
The script continues executing (I've set set_time_limit(0);),
but the browser keeps waiting, and that is my problem:
I haven't figured out how to tell the browser that "the output is finished".


If you're using php-fpm (FastCGI), see fastcgi_finish_request(): http://php-fpm.org/wiki/Features#fastcgi_finish_request.28.29
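
In practice it might look something like this (a minimal sketch; my_long_running_task() is a hypothetical stand-in for the server-side work that should continue):

<?php
echo "some output";

// Flush the response and close the connection to the browser.
// Only available under the PHP-FPM SAPI.
fastcgi_finish_request();

// The browser already sees the page as completed,
// but the script keeps executing server-side.
my_long_running_task(); // hypothetical follow-up work
?>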

Thx Ren, it should be the solution, but I'm running PHP as a module. In this configuration, is there an option to use fastcgi_finish_request() or a similar function?

In your experience, is it complex to configure PHP to run as CGI? Do I have to adapt or modify my scripts?

I don't believe there are any similar functions for the other SAPIs.

I know MediaWiki has code to process jobs after the page has been sent, so it may be worth having a look at that. Ideally, though, I'd think the solution would be to use a job queue (like Gearman) so you can process things asynchronously.

Just to give you another option (albeit a potentially messy one): you can use header('Location: ...') to forward to another page. The user should see the next page while the last one is still executing.

Not a particularly clean solution though.
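
A rough sketch of the idea (whether the browser really stops waiting right away depends on the SAPI and on output buffering, so treat this as an illustration; done.php and my_long_running_task() are hypothetical):

<?php
ignore_user_abort(true); // keep running even if the client disconnects
set_time_limit(0);       // no execution time limit for the long work

header('Location: done.php'); // hypothetical "finished" page
flush();                      // try to push the redirect out immediately

// The browser navigates away while this keeps running server-side.
my_long_running_task();
?>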

I would personally use vBulletin's approach to pseudo cron jobs. Using the session or some similar mechanism, record the information you need the page to work with. Send the output to the browser and include an img tag that will try to load an "image" into the browser. This tag will trigger the rest of the process, and the user will be none the wiser, since the loading progress bar will not spin in most browsers while the image is working. In any event, you set the process to continue even if the user cancels, using ignore_user_abort().
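
A minimal sketch of the idea, with hypothetical names (work.php is the worker script, do_background_job() the actual work):

<?php
// page.php: record what the job needs, then point an <img> at the worker.
session_start();
$_SESSION['job_data'] = array('user_id' => 42); // whatever the job needs
echo "some output";
echo '<img src="work.php" width="1" height="1" alt="">';
?>

<?php
// work.php: keep running even if the browser cancels the image request.
ignore_user_abort(true);
set_time_limit(0);
session_start();
do_background_job($_SESSION['job_data']); // hypothetical
?>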

This is an interesting trick! Usually browsers will still indicate the page is being loaded while an image is being fetched, unless it's a background image in a Gecko browser. But I think you should be able to use JavaScript to auto-cancel the image loading after a few seconds, for example by removing the image from the DOM or setting its src to an empty string.

A more reliable trick for making pseudo-cron jobs might be to do it in PHP using curl. This would involve making a request to (i.e., reading) a page containing your PHP script on the same server. You would need to provide the full URL with http://, but if the URL points to the same server the request happens almost without any delay. The main idea would be to set a one-second timeout on the request, so your main script can terminate after 1 second while the separate script, which has just been invoked by curl from the outside, can work for a long time. What I mean is something like file_get_contents() with a very short timeout. Causing a one-second delay every hour, or even every 5 minutes, for just one request would be no problem.
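
With curl, the one-second timeout would look something like this (the URL is the same hypothetical pseudo-cron script as in the edit below):

$ch = curl_init("http://example.com/path/to/my/pseudo-cron.php");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // don't echo the response
curl_setopt($ch, CURLOPT_TIMEOUT, 1);           // give up after 1 second
curl_exec($ch);  // times out, but the invoked script keeps running
curl_close($ch);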

Edit:

Apparently you can do it without curl:


// Give up waiting for a response after 1 second; the remote
// script can keep running on the server.
$ctx = stream_context_create(array(
    'http' => array(
        'timeout' => 1,
    ),
));
file_get_contents("http://example.com/path/to/my/pseudo-cron.php", false, $ctx);
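
Either way, the script on the receiving end has to survive the caller disconnecting after the one-second timeout; a minimal sketch (run_scheduled_jobs() is hypothetical):

<?php
// pseudo-cron.php: keep going after the caller's timeout drops the connection.
ignore_user_abort(true);
set_time_limit(0);
run_scheduled_jobs(); // the actual long-running work
?>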

Very interesting. Source code is probably here:

It might be just me, but wouldn't using ob_flush() or something similar do?

I think what the OP wants to do is use something like ob_flush(): write to the browser, flush, write some more, flush, and then close. Upon closing, the request will still continue in the background (won't it?)

i.e.


<?php
    echo "Starting<br />";
    ob_flush();
    flush(); // push the buffered output through to the browser
    echo "Doing something<br /><script type='text/javascript'>window.onload = function() { self.close(); }</script>";
    ob_flush();
    flush();

    // do something time consuming
?>

hth

This won't do - the script will continue in the background, but the connection will not be closed and the browser will still indicate the page is loading. ob_flush() will leave the connection open because, potentially, you can output more data to the browser, so the browser needs to keep listening. The OP wants to close the connection but continue with the script.

Rather than hack something together (e.g. pseudo cron jobs, or playing with connection handling), why not use something meant for this task, such as a job queue/server?

We use Gearman successfully for just this kind of problem (and a lot more besides) - http://php.net/manual/en/book.gearman.php

You can write workers in PHP, pass context back to them from a client and it all “just works”.
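
As a sketch of how that maps onto this thread's problem (the function name 'process_page' and the payload are made up for illustration):

<?php
// Client side - the page the user requests: queue the job and return
// immediately, so the browser sees a completed page.
$client = new GearmanClient();
$client->addServer('127.0.0.1', 4730);
$client->doBackground('process_page', json_encode(array('user_id' => 42)));
echo "some output"; // the page finishes right away
?>

<?php
// Worker side - a long-running CLI process that picks up queued jobs.
$worker = new GearmanWorker();
$worker->addServer('127.0.0.1', 4730);
$worker->addFunction('process_page', function (GearmanJob $job) {
    $data = json_decode($job->workload(), true);
    // ... the time-consuming work happens here ...
});
while ($worker->work()) {
    // loop forever, handling jobs as they arrive
}
?>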

I admit, I did not know about fastcgi_finish_request() before. After reading this thread I experimented with it, and here is how I made it work:
I wrote a little post on my blog. Read it, learn it, thank me later.