I probably didn't explain myself well. The script keeps running; I've set set_time_limit(0);. But the browser keeps waiting, and that is my problem: I haven't figured out how to tell the browser that “the output is finished”.
I don't believe there are any similar functions for other SAPIs.
I know MediaWiki has code to process jobs after the page has been sent, so it may be worth having a look at that. Though I'd think the ideal solution would be to use a queue (like Gearman) so you can process things asynchronously.
I would personally use vBulletin's approach to pseudo-cron jobs. Using the session or some similar mechanism, record the information you need the page to work with. Send the output to the browser and include an img tag that will try to load an “image” in the browser. This tag will trigger the rest of the process, and the user will be none the wiser, since the loading progress bar will not spin in most browsers while the image is loading. In any event, set the process to continue even if the user cancels, using ignore_user_abort().
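A rough sketch of what the server side of that trick might look like (bg-job.php, run_pending_task(), and the session layout are all made-up for illustration):

```php
<?php
// bg-job.php -- hypothetical endpoint that the page loads via
// <img src="bg-job.php">.
ignore_user_abort(true); // keep running even if the browser aborts the image request
set_time_limit(0);

// Read the work the main page recorded (assumed session layout), then
// release the session lock so later requests from this user aren't blocked.
session_start();
$task = isset($_SESSION['pending_task']) ? $_SESSION['pending_task'] : null;
unset($_SESSION['pending_task']);
session_write_close();

// Answer the img tag with a 1x1 transparent GIF so it has something to show.
header('Content-Type: image/gif');
echo base64_decode('R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7');
flush();

// ...and now do the deferred work with no user waiting on it.
if ($task !== null) {
    run_pending_task($task); // hypothetical worker function
}
```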
This is an interesting trick! Usually browsers will still indicate the page is being loaded while an image is being fetched - unless it's a background image in a Gecko browser. But I think you should be able to use JavaScript to auto-cancel the image loading after a few seconds - for example by removing the image from the DOM or setting its src to an empty string.
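Something like this on the page side, perhaps (bg-job.php is again a made-up endpoint, and the 3-second delay is arbitrary):

```php
<?php
// Page side of the trick (sketch): load the hypothetical bg-job.php
// endpoint through an img tag, then cancel the load after a few seconds
// so any remaining loading indicator stops.
echo <<<HTML
<img id="bg-job" src="/bg-job.php" width="1" height="1" alt="">
<script>
setTimeout(function () {
    var img = document.getElementById("bg-job");
    // Removing the node (or setting img.src = "") makes the browser abort
    // the request; ignore_user_abort() keeps the job alive on the server.
    if (img) img.parentNode.removeChild(img);
}, 3000);
</script>
HTML;
```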
A more reliable way to make pseudo-cron jobs might be to do it in PHP using cURL. This involves making a request to (reading) a page with your PHP script on the same server. You need to provide the full URL with http://, but since the URL points to the same server the request happens almost without delay. The main idea is to set a one-second timeout on the request, so your main script can finish after one second while the separate script, which has just been invoked by cURL from the outside, can keep working for a long time. (Something like file_get_contents() with a very short timeout would also work.) Causing a one-second delay once an hour, or even every 5 minutes, for a single request is no problem.
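A minimal sketch of the trigger side (long-job.php is a made-up name; the target script should call ignore_user_abort(true) itself so it survives the dropped connection):

```php
<?php
// Fire a request at a script on the same server, but stop waiting for
// the response after ~1 second; the target keeps running server-side.
$ch = curl_init('http://www.example.com/long-job.php'); // full URL required
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // don't print the (empty) response
curl_setopt($ch, CURLOPT_TIMEOUT, 1);           // give up listening after 1 second
curl_exec($ch);  // times out "unsuccessfully" -- that is the whole point
curl_close($ch);

// The main script continues (or exits) here after at most ~1 second.
// file_get_contents() with a stream-context timeout of 1 works similarly.
```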
It might be just me, but wouldn’t using ob_flush() or similar do?
I think what the OP wants to do is to use something like ob_flush(): write to the browser, flush, write some more, flush, and then close. Upon closing, the request will still continue in the background (won't it?)
This won't do - the request will continue in the background, but it will not be closed, and the browser will still indicate the page is loading. ob_flush() leaves the connection open because you can potentially output more data, so the browser has to keep listening. The OP wants to close the connection but continue with the script.
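For concreteness, the pattern being discussed looks roughly like this (do_long_running_work() is a stand-in name):

```php
<?php
// The flush-as-you-go pattern (sketch). Output reaches the browser
// incrementally, but nothing ever tells it the response is complete,
// so the loading indicator keeps going until the script really exits.
ob_start();
echo "step one done\n";
ob_flush(); // push PHP's output buffer down to the SAPI/webserver
flush();    // push the webserver's buffer out to the client
echo "step two done\n";
ob_flush();
flush();
// The browser now has all the output, yet the connection is still open:
do_long_running_work(); // hypothetical -- the page keeps "loading" all along
```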
I admit I did not know about fastcgi_finish_request() before. After reading this thread I experimented with it; here is how I made it work:
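Roughly this pattern, as a minimal sketch (assuming PHP-FPM, where fastcgi_finish_request() is available; do_long_running_work() is a stand-in name):

```php
<?php
// Requires PHP-FPM: fastcgi_finish_request() only exists under the FPM SAPI.
echo "All done, you can leave this page.";

// Flush all output and close the connection: the browser's loading
// indicator stops right here.
fastcgi_finish_request();

// Execution continues server-side with no client attached.
ignore_user_abort(true);
set_time_limit(0);
do_long_running_work();
```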
I wrote a little post on my blog; read it, learn it, thank me later.