Site down due to heavy traffic? How do you prevent this?

I saw a link being promoted on Facebook in the comments section of an article. It had a good number of likes and was a top comment. On visiting the link, I found that the website was not accessible due to heavy traffic, and there was mention of some kind of downtime. Why does this happen, and how can one prevent it from happening to their website?

It happens when the number of people trying to access the site exceeds the available bandwidth.

The bandwidth limit is usually a figure set by your hosting provider, and you can often increase it by buying more. If you need more than a single provider can supply, you'll need to host your site on multiple servers in different locations and use load balancing to spread your visitors' traffic across them.

In a practically infinite number of ways; it entirely depends on the site, how it's built, what it's for, and everything else. But you can do things to build a site that scales, which helps with this: more hardware, more servers, reducing database queries, moving pieces of the database behind a cache layer, lowering the number of requests per page load, adding web server caching, and so on. It's a pretty broad topic.

Most blogs on most hosts with decent plans can usually handle almost anything legitimately thrown at them.
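To give one concrete example of the techniques listed above, here's roughly what moving a repeated database query behind a cache layer can look like in PHP. This is only a sketch: it assumes the APCu extension is installed, and `get_article_from_db()` is a made-up stand-in for whatever slow query your site actually runs.

```php
<?php
// Sketch: serve a repeated database query from an in-memory cache instead of
// hitting the database on every request. Requires the APCu extension.

// Stand-in for a real (slow) database query.
function get_article_from_db(int $id): array
{
    usleep(50000); // pretend this takes 50 ms
    return ['id' => $id, 'title' => "Article $id"];
}

function get_article(int $id): array
{
    $key = "article:$id";

    // Try the cache first.
    $article = apcu_fetch($key, $hit);
    if ($hit) {
        return $article;
    }

    // Cache miss: run the expensive query, then keep the result for 5 minutes.
    $article = get_article_from_db($id);
    apcu_store($key, $article, 300);

    return $article;
}
```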

@mawburn,

I am setting up a VPS for one of my clients. It has 2 GB of RAM and, I think, 2 TB of bandwidth.

Assuming a simple website like SitePoint (i.e. mainly text, not lots of images), how many concurrent users would such a setup handle on average?

That isn’t enough information. What is the average size of each page on your website? 5 KB, 50 KB, 200 KB?

:blush: Good question! How would I figure that out? (Sorry to be so clueless - but I just build websites - I have little knowledge of running them!)

Easiest way: open Developer Tools, go to the Network tab, and refresh the page. The total bytes/kilobytes transferred for the page load will be shown at the bottom of the Network tab.

So let’s say it is 50 KB. You have 2 TB of bandwidth. So the first thing you need to do is convert one of those two numbers into the same unit.

2 TB = 2048 GB = 2097152 MB = 2147483648 KB

50 KB goes into 2,147,483,648 KB about 42,949,673 times. That’s a lot. So roughly that many visits to that page in a given month would use up the entire 2 TB of bandwidth you have.

Now, that’s a very basic way of looking at it. There are other factors that could play a part depending on how your host is set up, but it should give you a good idea of how it works.
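For reference, here's that arithmetic in a few lines of PHP. It's purely illustrative; the 50 KB page size and 2 TB cap are just the example numbers from above, so plug in your own.

```php
<?php
// Back-of-the-envelope bandwidth math:
// monthly page loads = monthly bandwidth / average page weight.

$bandwidthKB = 2 * 1024 * 1024 * 1024; // 2 TB expressed in KB (2,147,483,648 KB)
$pageSizeKB  = 50;                     // average transfer size of one page load

$pageLoadsPerMonth = $bandwidthKB / $pageSizeKB;

printf("Roughly %s page loads per month before the 2 TB cap is hit.\n",
    number_format($pageLoadsPerMonth)); // ~42,949,673
```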


@cpradio,

Thanks, that was a neat calculation! However, I was asking, “How many concurrent users could my VPS handle?” (You told me how to calculate how many users it would take to eat up my monthly bandwidth…)

I did a check, and my home page is 90 KB, a section page is 140 KB, and an article with comments is 265 KB.

Same idea, same concept. It simply involves your RAM, which is harder to gauge. If serving a single page uses X amount of RAM, you do the same math.
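To make that concrete, here's the same kind of back-of-the-envelope math applied to RAM. Every number below is a made-up assumption, not a measurement; as the following posts point out, you really have to measure these figures on your own server.

```php
<?php
// Rough RAM-based ceiling on concurrency: if each in-flight request needs a
// certain amount of memory (e.g. one PHP-FPM worker), total usable RAM
// divided by that figure is an upper bound on simultaneous requests.
// All numbers here are guesses for illustration only.

$totalRamMB      = 2048; // the 2 GB VPS from above
$reservedMB      = 512;  // guess: OS, database, and other services
$ramPerRequestMB = 40;   // guess: memory one request/worker typically needs

$maxConcurrent = floor(($totalRamMB - $reservedMB) / $ramPerRequestMB);

echo "Rough ceiling: about $maxConcurrent requests being served at the same instant.\n"; // ~38
```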


RAM is very hard to gauge, and you pretty much just need to work it out on a per-site basis.

For example, PHP uses no RAM when it’s not in use, but when it is in use it can take more RAM per request than something like Java, which uses a lot of memory even when idle but doesn’t need much extra per user. On top of that, WordPress might use more per user than Laravel, and a single plain .php file is going to use less than either.

If you’re hosting your database on the same server as your site, then that’s going to use RAM too… sometimes a lot. It depends on how it’s configured, obviously. The same goes for any other services or programs you have running on the server to support your site. Each takes its own little piece.

It’s kind of like asking, “How many pieces of candy can you fit in a jar?” If it’s a big jar and small pieces of candy, then probably a lot… but I’m not really sure, and you can’t really make an educated guess without lots of details. The best and easiest way would be to just count how many pieces you can put in the jar by putting them in the jar.


Also, I’d tell you to stop worrying about RAM usage. You will know when it becomes a problem: the site will grind to a halt, response times will drag to terrible levels, and you’ll see it in any performance-monitoring software you implement (really, anything that can monitor page load times).

From there, you simply use htop or top to view memory and CPU usage in real time during those peak hours and start to research why it is happening. More visitors? Are cron jobs also running? Is some batch processing happening at the same time? Were you importing or exporting data?

Once you sort of know the why, you can start to figure out whether you need to expand the memory you have available or rewrite key pieces of the software to be more mindful of memory usage.
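If you want to start collecting real numbers before that day comes, one cheap trick is to log each request's peak memory use so you can see which pages are the heavy ones. A minimal PHP sketch (the log path and line format are just examples):

```php
<?php
// Log each request's peak memory usage at shutdown so heavy pages stand out.
// Include this early in the request (e.g. from a shared bootstrap file).

register_shutdown_function(function () {
    $peakMB = round(memory_get_peak_usage(true) / 1048576, 2);
    $line   = sprintf("%s %s %.2f MB\n",
        date('c'),
        $_SERVER['REQUEST_URI'] ?? 'cli',
        $peakMB
    );
    // Message type 3 appends the line to the given file.
    error_log($line, 3, '/tmp/memory-usage.log');
});
```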


All very valid points, but there must be a rule of thumb based on the specs I gave.

For example, if you took your best guess, would you say it could handle 5 concurrent users? 50? 500? 5,000?

Even with all of the unknowns you mention, there has to be a generally accepted range of Hardware-to-Concurrent-Users.

For example, if you asked me, “How fast will John’s 2015 Mustang go?” then I wouldn’t come back with, “Well, it depends on the type of camshaft he has…”

Based on experience and Physics, I would guess, “It probably tops out at between 120-140 MPH…”

Right, but it is also about setting expectations for my client who has a shoe-string budget.

My client might expect that for $50/month for a VPS he can handle thousands of concurrent users and make the big bucks, when in reality our new VPS could choke after 5 of his friends start surfing! (I have no clue what the range might be, and I’m trying to set both of our expectations!)

What are htop and top??

True, but if you think you can handle 500 concurrent users when you can only handle 5, then you’ll be a dead man before you ever get to those issues…

Can’t. Too many unknown variables. If I had your code and was given ample time, sure, I probably could give you a good number.

Load test the site. When it starts to crawl, you’ve reached your limit.
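If dedicated tools feel intimidating, the idea behind a load test is simple enough to sketch in plain PHP: fire a batch of concurrent requests at one URL and time how long the server takes to answer them all. This is only a rough illustration (the URL and concurrency below are placeholders, and real tools such as ApacheBench or Siege do this job far better); only ever point it at a site you control.

```php
<?php
// A very rough DIY load test using PHP's curl_multi API: send a batch of
// concurrent requests to one URL and measure the total time taken.

$url         = 'https://example.com/'; // placeholder: the page to test
$concurrency = 20;                     // how many requests to run at once

$multi   = curl_multi_init();
$handles = [];

for ($i = 0; $i < $concurrency; $i++) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 30);
    curl_multi_add_handle($multi, $ch);
    $handles[] = $ch;
}

$start = microtime(true);

// Run all the handles until every request has finished.
do {
    curl_multi_exec($multi, $running);
    curl_multi_select($multi);
} while ($running > 0);

$elapsed = microtime(true) - $start;

foreach ($handles as $ch) {
    curl_multi_remove_handle($multi, $ch);
    curl_close($ch);
}
curl_multi_close($multi);

printf("%d concurrent requests finished in %.2f seconds.\n", $concurrency, $elapsed);
```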

htop and top are command-line tools that show CPU and memory usage for each process on the server in real time.

No, you won’t be. You’ll have discovered that you have a problem, and my advice still stands. You will one day hit your limit (most companies do anyway), and you’ll have to figure out why you hit it by looking around. If it takes 5 users to get there, you’ll know right away on day one.

I looked at the ApacheBench article. It made no sense to me. I have no clue what Laravel is or how it relates to my client’s site. I also saw OOP at the end, and that is enough to make me run.

Are there any more “practical” load-testing solutions out there for free?

Yes, check the site’s error logs.

Without more details, it’s impossible to give any helpful answers. But if you post a link to the site, others more experienced in using dev tools should be able to spot potential problem areas.

I noticed nobody mentioned using a CDN to reduce calls to the site’s host server for static files.
