I’ve been searching and searching… but could not come up with anything useful.
From within a script, I’d like to perform a couple of HTTP requests to external resources. However, I’d like this script to execute as fast as possible, and some of these external resources may take up to a few seconds to send a response. So I’d just like to send data via HTTP and not wait for any response.
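One way to do this at the socket level (a sketch, assuming a plain HTTP service on port 80; the host, path and payload below are placeholders) is to open the connection, write the request, and close without reading:

```php
<?php
// Build a raw HTTP/1.1 POST request. Kept separate from the socket code
// so it can be checked without a network connection.
function build_request(string $host, string $path, string $payload): string
{
    return "POST {$path} HTTP/1.1\r\n"
         . "Host: {$host}\r\n"
         . "Content-Type: application/x-www-form-urlencoded\r\n"
         . "Content-Length: " . strlen($payload) . "\r\n"
         . "Connection: Close\r\n\r\n"
         . $payload;
}

// Fire-and-forget: write the request and close the socket immediately,
// without waiting for the response.
function fire_and_forget(string $host, string $path, string $payload): bool
{
    $fp = @fsockopen($host, 80, $errno, $errstr, 2); // 2 s connect timeout
    if (!$fp) {
        return false;
    }
    fwrite($fp, build_request($host, $path, $payload));
    fclose($fp); // don't read the response
    return true;
}
```

Note that the connect itself still blocks for up to the timeout; the part you skip is waiting for the response.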
Well, if the external script is written in PHP, Perl, shell, etc. and has the appropriate shebang line (plus the right file permissions), then you can just use the path to the script.
/path/to/your/script.php
Otherwise, you can call the PHP/etc. interpreter and pass the path to the PHP/etc. file:
Well, you don’t have to use an absolute path. To call a file in the current directory, though, you need to prefix the path with ./ to indicate that you actually want to execute it.
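Putting that together, here is a sketch of an execInBackground()-style helper (this is my guess at the shape of such a function, not the thread’s original; the redirection and trailing & are what detach the process on *nix, so this won’t work as-is on Windows):

```php
<?php
// Append the redirections that detach a command on *nix:
// stdout/stderr are discarded and the trailing & backgrounds it.
function background_command(string $cmd): string
{
    return $cmd . ' > /dev/null 2>&1 &';
}

// Hypothetical helper in the spirit of the thread's execInBackground().
function execInBackground(string $cmd): void
{
    exec(background_command($cmd));
}

// Usage (paths are placeholders):
// execInBackground('/path/to/your/script.php');     // relies on the shebang
// execInBackground('php /path/to/your/script.php'); // explicit interpreter
```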
If those external scripts take a while to process, you shouldn’t call them from the same process as the one serving an HTTP request.
The best way to deal with this is to create a cronjob that runs independently of your web frontend, and then use a database table as a queue for communicating between the two processes. (Or, if you don’t care about the result, just a one-way communication.) This has a couple of positive side effects, such as the queue doubling as a logfile (don’t delete entries when they are processed; just mark them as done), and it generally decouples unrelated responsibilities in your application.
–> When I call sleep.php url-style, the script does what it has to do. So the problem is really located in the execInBackground() part.
@kyber: sounds great. However, to tell you the truth, I’m not sure I’d know how to do it. If you have time to elaborate a little, I’d be happy to learn from you.
First, you need the shebang line at the top of your file:
#!/usr/bin/env php
Second, sleep.php needs to be chmodded 0755.
You might also need to make sure that the script is using UNIX line endings.
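If you’re not sure what line endings a file has, you can check from PHP itself (a small sketch; the conversion line is optional):

```php
<?php
// True if the file contains Windows (CRLF) line endings, which would
// break the shebang line on *nix ("/usr/bin/env php\r" is not found).
function has_crlf(string $file): bool
{
    return strpos(file_get_contents($file), "\r\n") !== false;
}

// To convert in place:
// file_put_contents($file, str_replace("\r\n", "\n", file_get_contents($file)));
```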
Regarding kyberfabrikken’s suggestion, you would do it by having a cron task that runs at a specified interval. If you want something done, you put an entry in a queue stored somewhere, such as your database. When the cron task runs, it checks whether there’s anything in the queue that needs to be done and whether a task is already running. If there’s nothing to do, it just quits. Otherwise, if there’s work in the queue and nothing is currently being processed, it performs a task from the queue.
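As a concrete sketch of that pattern (the jobs table and its columns are made up for illustration; SQLite stands in here for whatever database you use):

```php
<?php
// Cron-driven worker: process every pending job, then quit.
// Jobs are marked done rather than deleted, so the table doubles as a log.
function process_queue(PDO $db, callable $handler): int
{
    $done = 0;
    $rows = $db->query("SELECT id, payload FROM jobs WHERE status = 'pending'")
               ->fetchAll(PDO::FETCH_ASSOC);
    foreach ($rows as $row) {
        $handler($row['payload']);
        $db->prepare("UPDATE jobs SET status = 'done' WHERE id = ?")
           ->execute([$row['id']]);
        $done++;
    }
    return $done;
}
```

A crontab entry would then run the script containing this every few minutes.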
You can also do it another way using a server model, where you have a server running all the time that takes requests for tasks and completes queued tasks until there is nothing left to do. If the server runs out of tasks, it would just sleep. This would be ideal with a high amount of traffic and a server cluster setup.
The advantage of this method is that you won’t have fifty tasks running simultaneously just because fifty people happened to access that particular webpage. If the task were video encoding, you’d have 50 video encoders going simultaneously, eating up all your precious CPU.
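The server-model loop described above can be sketched like this (fetchJob and handle are placeholders for your own queue access; the $maxCycles parameter exists only so the loop can be exercised in tests, a real daemon would leave it null):

```php
<?php
// Long-running worker: pull jobs until the queue is empty, then sleep
// instead of busy-waiting. $maxCycles bounds the loop for testing.
function run_worker(callable $fetchJob, callable $handle,
                    int $idleSeconds = 5, ?int $maxCycles = null): void
{
    $cycles = 0;
    while ($maxCycles === null || $cycles++ < $maxCycles) {
        $job = $fetchJob();
        if ($job === null) {
            sleep($idleSeconds); // queue empty: sleep, don't spin
            continue;
        }
        $handle($job);
    }
}
```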
Again, nothing gets written by write.php. I don’t know how to check UNIX line endings, to tell you the truth.
About the cron, it sounds very interesting but I’m not sure it would be useful on this project.
Here’s more precisely what I’m trying to do: I have endless loops (PHP scripts) communicating with a remote server (they need to send data to this server every 60 seconds or so). I’m now trying to switch these endless loops on and off from an admin CP. So I call a “loop handler” script which does various checks, and then either executes the loop script it is told to execute or tells the loop to stop, depending on the state of the loop (which is determined by those checks).
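One common way for such a handler to tell whether a loop is already running (a sketch; the pid-file path is a placeholder, and the check relies on the posix extension): have each loop write its process id to a pid file on startup, then test whether that process is still alive:

```php
<?php
// Check a pid file to see whether the loop process is still alive.
// posix_kill($pid, 0) delivers no signal; it only tests for existence.
function loop_is_running(string $pidFile): bool
{
    if (!is_file($pidFile)) {
        return false;
    }
    $pid = (int) trim(file_get_contents($pidFile));
    return $pid > 0 && function_exists('posix_kill') && posix_kill($pid, 0);
}

// Each loop script would record itself on startup, e.g.:
// file_put_contents('/path/to/loop.pid', getmypid());
```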
sk89q described it nicely. I’d like to add, though, that a server model (daemon) is not something I would recommend as the first choice. PHP is notorious for leaking memory, so long-running processes can cause trouble. If you need to process data constantly, or you need it processed right away, it works better than a cronjob, though. In both cases, using a database table as a queue works well. If you use a database with locking support, you can make the setup safe for multiple processes, which can be helpful if you ever need to scale up for performance.
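To make the multiple-process point concrete (a sketch; table and column names are invented, and SQLite stands in for a real server): claim a row with a conditional UPDATE, so that if two workers race for the same job, only one wins:

```php
<?php
// Atomically claim one pending job. The "AND status = 'pending'" in the
// UPDATE is the lock: if another worker claimed the row first, rowCount()
// is 0 and we report no job (the caller can simply retry).
function claim_job(PDO $db, string $workerId): ?array
{
    $row = $db->query("SELECT id, payload FROM jobs WHERE status = 'pending' LIMIT 1")
              ->fetch(PDO::FETCH_ASSOC);
    if ($row === false) {
        return null; // queue empty
    }
    $stmt = $db->prepare(
        "UPDATE jobs SET status = 'working', worker = ? WHERE id = ? AND status = 'pending'"
    );
    $stmt->execute([$workerId, $row['id']]);
    return $stmt->rowCount() === 1 ? $row : null; // null: lost the race
}
```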