Server performance metric: how many requests per second is reasonable?

Hi everyone, I'm setting up a database-driven website with lots of dynamically changing data and pretty much ZERO static web pages.

I just got my first dedicated server up and running, and was wondering: in terms of "how many requests per second" my server can deal with, what is a reasonable figure?

I am confused because:

  1. Lots of people say, "Oh, if you have a static web page, you can serve 1000 requests per second and survive the Slashdot effect."

  2. Or lots of people say, "Oh, don't worry, a single box can handle a busy forum site with 1000 simultaneous users."

The problem is:

  1. I'm not serving static pages, and currently I'm not getting ANYWHERE NEAR 1000 requests per second.
  2. "1000 simultaneous users on a forum" is a pretty useless metric, because it doesn't really tell you how much your server can serve! You could have 1000 simultaneous users on a forum, but if, on average, only 10 of them load a page each second, the server would only need to handle 10 page requests a second.
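To make point 2 concrete, here is the arithmetic as a quick Python sketch. The user count and click interval are the illustrative numbers from above, not measurements:

```python
# "N simultaneous users" only becomes a requests/second figure once you
# assume how often each user actually loads a page. Illustrative numbers:
simultaneous_users = 1000
seconds_between_clicks = 100  # assume each user loads a page every ~100 s
requests_per_second = simultaneous_users / seconds_between_clicks
print(requests_per_second)  # -> 10.0
```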

So, does anyone have any idea what a reasonable number of pages served per second is for a single dynamic server?

Just ballpark figures, please… like 1 page a second, 10 pages a second, etc.


There are so many factors to consider that this can't even be ballparked. The only way to find out is to do real testing against your specific application.

How many requests a second would a site like "Digg" or "" need to serve?

Assuming it's like 20 requests a second, I'd need to replicate my server 4 times (if each one can do 5 requests a second).

I guess that's what's confusing me. How many requests per second would getting "Slashdotted", for example, effectively require?

Are we talking like 1 request a second, 10 a second, 500 a second? I have no intuition whatsoever regarding this…

Thanks so much for your help, though…

It depends on how many resources a webpage takes to serve. If your page requires 100 database queries, you're probably not going to serve as many simultaneous pages as a static site with none. You may not be able to serve 1 page per second, or you may be able to serve thousands. When I said there was no way to ballpark this, I meant it. You need to talk about a specific application (like a website you created) and test it.
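If you want to measure it yourself, the idea behind a load test can be sketched in a few lines of Python. This is a toy, self-contained version that stands up a trivial local HTTP server and times sequential requests against it; for a real dynamic site you would point a proper tool (ab, siege, httperf) at your own pages, and the request count here is arbitrary:

```python
import threading
import time
import urllib.request
from http.server import HTTPServer, BaseHTTPRequestHandler

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Stand-in for a "dynamic" page; a real page would hit the DB here.
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"hello")

    def log_message(self, *args):
        pass  # silence per-request logging

server = HTTPServer(("127.0.0.1", 0), Handler)  # port 0 picks a free port
threading.Thread(target=server.serve_forever, daemon=True).start()
url = "http://127.0.0.1:%d/" % server.server_port

n = 200  # arbitrary request count for this toy measurement
start = time.time()
for _ in range(n):
    urllib.request.urlopen(url).read()
elapsed = time.time() - start
server.shutdown()
print("%.0f requests/second (sequential, trivial page)" % (n / elapsed))
```

The number this prints is only meaningful for the trivial page it serves; the point is that the measurement is cheap to make against whatever your application actually does.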

Few definitive numbers [2][3][4] exist regarding the precise magnitude of the Slashdot effect, but estimates put the peak of the mass influx of page requests at anywhere from several hundred to several thousand hits per minute.

Thanks, all, for the great advice!!

My site is going live soon…

Whoa, ~x00 to ~x000 per minute would mean the server needs to churn out roughly 10-100 requests a second… yikes…
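That conversion is just a division by 60; the endpoints below (600 and 6,000 hits per minute) are illustrative stand-ins for "several hundred to several thousand":

```python
# Convert the quoted per-minute Slashdot-effect estimates to per-second load.
for hits_per_minute in (600, 6000):
    print("%d/min = %.0f requests/second" % (hits_per_minute, hits_per_minute / 60))
```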

100 requests a second and my simple P4 3.0 server doesn't flinch. It depends on what kind of request you're talking about.

I have some pages that take 2-3 seconds to generate, and others less than 5 milliseconds.

Hi dan, thanks for the reply.
My site uses tagging extensively, so that's where most of the DB load comes from: tags, related tags, tag intersections, etc.

You're right, some queries take about 1 millisecond, so a page with 10 of those would take 10 milliseconds (100 page loads a second).
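That back-of-the-envelope model, written out (the 1 ms per query is the assumed figure for a simple indexed lookup, not a measurement):

```python
# Per-page time = (queries per page) x (time per query);
# the single-worker throughput ceiling is roughly its inverse.
queries_per_page = 10
seconds_per_query = 0.001  # assumed: ~1 ms for a simple query
seconds_per_page = queries_per_page * seconds_per_query
print("%.0f pages/second" % (1 / seconds_per_page))  # -> 100 pages/second
```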

But sadly, most of my pages require heavy DB lifting.

I'm definitely going to have to rewrite the schema to scale better, but that will take time.

BTW: What script are you using that shows your QPS?


The following links might help you stress-test your server AND your application, as long as you can take the time (or delegate it to someone else) to set it up and run it properly: