Instant messaging with PHP - many short requests vs long polling

I’m about to implement an instant messaging feature in a PHP web application and I’m wondering which route to take for connecting to the server. This will obviously be done with Ajax in the browser. I’m thinking about two options:

  1. The browser sends a request every 2 seconds to fetch any new incoming messages. To reduce server load I may raise the interval to 10 seconds when users aren’t sending messages and drop back to 2 seconds once a conversation starts.

  2. Implement long polling - each request stays open for up to about 1 minute until a message arrives or a timeout occurs, and then a new request is opened.
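To make the two options concrete, here is a browser-side sketch of option #1 with the adaptive interval described above. The endpoint name (`/messages?since=...`), the idle threshold, and the rendering step are all assumptions, not part of any existing API:

```javascript
// Sketch of option #1: short polling with an adaptive interval.
// The endpoint and the idle threshold are assumptions.

const IDLE_MS = 10000;   // 10 s when the conversation is quiet
const ACTIVE_MS = 2000;  // 2 s once messages are flowing

// Pure helper: pick the next polling interval based on how long ago
// the last message was seen (go idle after 60 s of silence by default).
function nextInterval(msSinceLastMessage, idleAfterMs = 60000) {
  return msSinceLastMessage >= idleAfterMs ? IDLE_MS : ACTIVE_MS;
}

let lastMessageAt = Date.now();

async function pollOnce() {
  const res = await fetch('/messages?since=' + lastMessageAt);
  const messages = await res.json();
  if (messages.length > 0) {
    lastMessageAt = Date.now();
    // ...render the new messages...
  }
  // Re-arm with whichever interval currently applies.
  setTimeout(pollOnce, nextInterval(Date.now() - lastMessageAt));
}
```

Option #2 looks almost identical on the client; the difference is that the request itself blocks on the server until there is something to send, so the client simply re-issues it immediately after each response.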

I’m wondering which method will be lighter on the server. With option #1 there will be many more HTTP requests, but they will be short-lived. With option #2 there will be fewer requests and less data to transfer, but the number of simultaneously running PHP scripts will be high. Currently I’m expecting about 20 people using the application at the same time, but that may grow to 50 or 100 in the next couple of years.

What is the overhead of a PHP connection? Will 100 or 200 simultaneous connections strain the server much?

method #3: WebSockets


I’ve looked into WebSockets, but that would require running a daemon on the server to manage the connections, and I may not be able to do that. If demand grows in a few years I’ll probably have to switch to sockets, but for now I want to stay with plain HTTP.

You can use HTML5 server-side events (the EventSource object) http://www.html5rocks.com/en/tutorials/eventsource/basics/ although the implementation is really just a wrapper for short polling.
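For reference, the browser API is just `new EventSource('/events.php')` with an `onmessage` handler. On the wire, the server responds with `Content-Type: text/event-stream` and sends `data:` lines separated by blank lines. A tiny parser for that framing, illustrative only and nowhere near a complete implementation of the spec:

```javascript
// Minimal sketch of the Server-Sent Events wire framing.
// Events are separated by a blank line; each "data:" line carries
// a payload, and multiple data lines in one event are joined with \n.
function parseEventStream(chunk) {
  return chunk
    .split('\n\n')                        // events are blank-line separated
    .filter(block => block.trim() !== '')
    .map(block =>
      block
        .split('\n')
        .filter(line => line.startsWith('data:'))
        .map(line => line.slice(5).trim())
        .join('\n')                       // multi-line data fields are joined
    );
}
```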

Long polling is worse for the server because Apache has a connection limit (usually 256), which means that if you have 256 people long polling, your server stops responding to further requests. There are ways around this, but short polling is generally the safer option.

edit: And for something like this you can really reduce server load: each time someone sends a message, write it to a .json file. Poll the static .json file and you can serve the request with nginx, bypassing Apache and PHP entirely. Of course this opens up some potential security issues.

Interesting, I hadn’t heard of this before, but it looks like it’s called Server-Sent Events, and to me it looks like a variant of long polling because it needs a long-running server-side script. And no support for IE…

Indeed, this might be a problem for Apache. But I’ll see if I can use lighttpd or nginx instead, as they are reported to allow a much larger number of connections. There’s still the problem of many database connections, but I think I could handle that with a persistent database connection.

That’s a good idea, indeed! I think the security problem could be solved by giving each .json file a random, unguessable path.

When I tried it (for this: https://github.com/TomBZombie/NoF5/blob/master/src/nof5.js ) it never worked that way - Chrome just sent a request every second. Perhaps it wraps both long and short polling depending on whether the server closes the connection or not.

This is also what I understood from the specs.

On second thought, I think short polling will indeed be less demanding on the server, especially when targeting a static .json file. Static file access is cheap, so the server will be able to handle many such requests. With long polling I’d have to write a loop with sleep() in PHP, because PHP has no event mechanism - I’d have no way of knowing that a new message has arrived other than periodically checking some data store. That would effectively be short polling on the server side, because that’s what the loop would be doing, and it would eat server resources even when nothing is happening. If PHP had an event system I might go for the long-polling solution.
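The loop described above, sketched in JavaScript for consistency with the other examples (in PHP it would be a while loop around sleep()). `checkForMessages` stands in for whatever reads the data store; the names and timings are assumptions:

```javascript
// Sketch of a long-poll request handler's inner loop: answer as soon
// as the store has something, otherwise sleep and re-check until the
// request's timeout, then return empty so the client reconnects.
const sleep = ms => new Promise(resolve => setTimeout(resolve, ms));

async function longPoll(checkForMessages, { intervalMs = 1000, timeoutMs = 60000 } = {}) {
  const deadline = Date.now() + timeoutMs;
  while (Date.now() < deadline) {
    const messages = await checkForMessages();
    if (messages.length > 0) return messages; // answer the held request
    await sleep(intervalMs);                  // this IS short polling, server-side
  }
  return []; // timeout: client opens a new request
}
```

Note how the check-then-sleep cycle is exactly the periodic polling the post describes, just hidden inside one held-open request.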

The only downside to short polling is the repeated, unnecessary HTTP requests over the network, but overall I think it’s the lesser evil.

This topic was automatically closed 91 days after the last reply. New replies are no longer allowed.