Super-fast-loading websites

The topic of fast-loading web pages is a popular one these days. It’s pretty impressive when a page loads in a flash. For example, try something like the GitHub Pages site. Boom—it appears almost instantly, no matter where you are in the world, according to the Sucuri load time tester:

There’s been a lot written—in articles and books—on optimizing load times, covering topics like gzipping, minifying files, inlining critical CSS and so on. But I’d be interested in what people have found to be most effective in speeding up their sites.

In the screenshot above, you can see that GitHub Pages serves from different IP addresses around the world—which I assume means that the pages are being delivered by a global CDN of some kind. Is this more important than on-page optimizations? I’m not sure, though I’ve noticed some sites do really well even if served from a single IP. It also doesn’t seem to matter a whole lot whether the site runs on a database or is static. (I tested a few static sites that performed horribly, despite being pretty well optimized code-wise, while some database-driven sites performed really well.)

What are your best tips/thoughts on speeding up page load times?

I am delighted that my joke site gets very good results: no CDN is used, and the $5.00/month server is based in San Francisco:

Location                  Connection    First Byte    Total
Average response time     0.223 secs    0.365 secs    0.491 secs

I have used most of the free performance-testing sites and, most importantly, read and followed their suggestions. Achieving very good results is not easy, especially with a slow web host, although there are many simple improvements that make a tremendous difference. Reducing image, CSS and JavaScript file sizes will quickly make the biggest impact.

All my sites are now based on this article, along with reducing the above-mentioned file sizes.

Edit:
Fast sites are a blessing on slow mobile connections, and mobiles and tablets probably account for about 60% of traffic. It is also important for sites to be responsive to cater for the increase in mobile usage.


Thanks John. Your Eureka moment is over my head, though.

My only experience is with an Apache web server, but the approach should be similar on other web servers.

Normally a browser requests a web page from the server. This involves loading Apache, PHP and MySQL to extract the data, which is then returned and rendered according to the optional CSS file.

Eliminating the PHP and MySQL processing time is quite easy.

To create a “cached-web-page.html”:

  1. Render any web page from your “http://www.your-site.com”
  2. View the page source (right-click and select View Page Source)
  3. Copy the complete source (highlight and copy to the clipboard)
  4. Paste and upload the content to a file on your server (maybe “cached-web-page.html”)
  5. Call “cached-web-page.html”
  6. Monitor the page-speed differences using a free website monitoring service.

The tricky bit is to automate:

  1. Storing a rendered web page to a cache folder
  2. Using the .htaccess file to serve “cached-web-page.html” only if it exists (see the sketch after this list)
  3. Otherwise falling through and creating the web page as usual.
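A minimal sketch of how that automation might look, assuming a cache/ folder and an index.php front controller (both names are just for illustration, not necessarily John’s actual setup):

```php
<?php
// Hypothetical front controller: render the page as usual, then store the
// finished HTML in cache/ so Apache can serve it as a static file next time.
$path      = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);
$path      = ($path === '/') ? '/index' : rtrim($path, '/');
$cacheFile = __DIR__ . '/cache' . $path . '.html';

ob_start();

// ... the usual PHP/MySQL work that builds and echoes the page ...

$html = ob_get_contents();                    // everything echoed so far
if (!is_dir(dirname($cacheFile))) {
    mkdir(dirname($cacheFile), 0755, true);   // create cache sub-folders
}
file_put_contents($cacheFile, $html);         // 1. store the rendered page
ob_end_flush();                               // still send it to this visitor
```

And the .htaccess part, which serves the cached copy when it exists and otherwise falls through to PHP:

```apache
RewriteEngine On

# 2. If a cached copy exists for this URL, serve it and stop.
RewriteCond %{REQUEST_METHOD} GET
RewriteCond %{DOCUMENT_ROOT}/cache/$1.html -f
RewriteRule ^(.*)$ cache/$1.html [L]

# 3. Otherwise fall through and build the page as usual.
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^ index.php [QSA,L]
```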

It is necessary to mention that PHP sessions cannot be used on the .html web page, but JavaScript works OK.


Indeed, GitHub Pages sites are very light, many well under 0.5MB compressed, with few images/JS/HTTP requests—that’s a clue!

Yeah, although I’ve tested incredibly light pages—no images, JS or even CSS—and not gotten nearly the same results. So even acknowledging all the advice on page-weight reduction, minimizing requests and so on, I still wonder if there’s something in the mix that far outweighs all other factors. I was thinking it might be the CDN factor, which can greatly reduce latency in some corners of the world, though it seems some sites served from a single IP can get excellent results, too. So then I ask myself, is one server simply much, much faster than another?

I guess I’m just being lazy, but it seems like an awful lot of work to use all the services that are recommended, such as PageSpeed Insights and all that.

Indeed it is. In fact, serving from a CDN is the second most important thing you can do. Only minimizing HTTP requests is more important than a CDN.


The GitHub Pages homepage that Ralph linked to in the original post weighs in at just under 3MB. And yet, as Ralph’s results show, GitHub Pages loads in a flash from almost anywhere in the world. It’s easy for us to assume that page weight has a significant impact on performance, but measurement has repeatedly shown that assumption to be false.


Indeed. If they’re not using a CDN, then results very much depend on where you’re testing from. John_Betong’s results for his jokes site linked above are actually a great example. When tested from Los Angeles, he gets a ridiculously fast load time – 32ms. But from the other side of the US? New York drops to 273ms. And from London? 846ms.


What threw me a bit is that I tested some sites that did really well all around the world seemingly from one location, though I may have misinterpreted that. They show the same IP address for each location, although digging a bit deeper, their name servers are hosted by Cloudflare, which is a CDN. An example—featured on the Sucuri site—is whocaresarts.com. Stunning load results worldwide.

Huh… At first glance, I don’t have a good explanation for that. :-/

I might have to skim through CloudFlare’s documentation to see if there’s some magic in how they work.

It seems they take a cached version and put it on servers all around the world. Or something like that. They seem to offer a version of this for free, too. Wow.

I see there is a lot of confusion in this thread on the topic of “fast-loading” websites.

First off, it is important to understand that the result provided by Sucuri is only a measure of:

  1. How fast they get a response on the connection request to the website.
  2. How fast they start receiving the data requested.
  3. How fast they receive all of the data they requested.

It is also very important to understand that point 3 ONLY covers the initial HTML content. They do not load any images, CSS files, JS files etc.
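(If you want to reproduce that kind of measurement yourself, here is a rough sketch using PHP’s cURL extension; the URL is only a placeholder.)

```php
<?php
// Fetch just the HTML, the way a Sucuri-style check does, and report the
// same three numbers: connection, first byte and total time.
$ch = curl_init('https://www.example.com/');
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,   // download the HTML only
    CURLOPT_FOLLOWLOCATION => true,   // follow any redirect to the real page
]);
curl_exec($ch);

printf(
    "Connection: %.3f s  First byte: %.3f s  Total: %.3f s\n",
    curl_getinfo($ch, CURLINFO_CONNECT_TIME),        // 1. connection made
    curl_getinfo($ch, CURLINFO_STARTTRANSFER_TIME),  // 2. first byte received
    curl_getinfo($ch, CURLINFO_TOTAL_TIME)           // 3. all HTML received
);
curl_close($ch);
```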

With this in mind, the result provided by their service is not that important. I would not say it is useless, but it is close to it.

Today, what matters is how fast your web page renders for the user. In other words, the total time from the second the user accesses your web page until it is rendered and usable in their browser.

This includes the time it takes to download the required HTML, CSS files and JS files, and to parse the JS code.

When you start looking at this, you can be in for a surprise: the website you thought was loading fast—because services similar to Sucuri say it does—is actually very slow.

This is important because, as we all know, waiting on a website is not something a user wants to do, and if the pages render too slowly and the user has another option, the chance of them leaving your website is quite high.

The problem here is that this is a number that depends on many factors, and it can be difficult to find out exactly how long your average user takes to render your page (it depends on their internet connection, browser, OS and computer speed).

It was only when we started using New Relic a few years back that it became easier to measure the real render speed of the websites we manage, and from that information we could find the real bottlenecks and sort them out.
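If you don’t run a RUM product like New Relic, modern browsers expose similar numbers themselves through the Navigation Timing API. A minimal sketch you could drop into a page to log real user timings:

```js
// Log how long this page actually took for this visitor.
window.addEventListener('load', function () {
  // loadEventEnd is only filled in after the load handlers finish,
  // so read the timings one tick later.
  setTimeout(function () {
    var t = performance.timing;
    console.log('First byte:   ' + (t.responseStart - t.navigationStart) + ' ms');
    console.log('DOM ready:    ' + (t.domContentLoadedEventEnd - t.navigationStart) + ' ms');
    console.log('Fully loaded: ' + (t.loadEventEnd - t.navigationStart) + ' ms');
  }, 0);
});
```

In a real setup you would send these numbers back to your server or analytics instead of logging them to the console.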

In regard to using a CDN service, this is not that important until you reach a critical mass of traffic. What I mean by this is that applying a CDN service when your website becomes popular will in most cases mean you can use your current hardware for longer, since you offload some of the traffic to the CDN, making it a cost-effective alternative to upgrading the hardware. Of course, having a CDN will in most cases reduce the load time for users in different parts of the world, but it is not always certain it will reduce it by that much.

It is also important to note that a CDN helps deliver static content to your users. If you allow users to log in, show unique content per user etc., then that still has to be handled by your server.

This leads us to why larger websites have many different data centers: basically, in addition to a CDN, they also place a version of the website, including the database (syncing with the rest of the data centers), in each region. This greatly increases the load speed for their users, since any request that requires unique content is also sent to a server park in the same region as the user.

A CDN normally has two options for storing and serving content: Pull and Push, where the more common one is Pull.

Pull:
With this one, you use the CDN URL on your website for all the static content you want to serve through them. When they receive a request for content they do not have stored already, they will fetch it from the location you have entered in your account (this relates to the root path of your website etc.) and apply the cache timeout setting you chose.

Then on the next request to the same content, if the content has not expired on their server, they will serve it directly.
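In practice, “use the CDN URL on your website” often just means wrapping asset paths in a small helper. A sketch, with a made-up pull-zone hostname:

```php
<?php
// Hypothetical pull-zone helper: the CDN fetches each file from the origin
// the first time it is requested, then serves it from its own cache.
function cdn_url(string $path): string
{
    $pullZone = 'https://cdn.example-pullzone.net';   // made-up hostname
    return $pullZone . '/' . ltrim($path, '/');
}

// Usage in a template:
// echo '<link rel="stylesheet" href="' . cdn_url('css/site.min.css') . '">';
```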

Push:
With this one, you use the CDN URL on your website as above, but you actually have to upload the content to them, either by providing links to each file or by uploading it yourself each time.

What is important to note is that with the Pull option, the CDN will normally strip away all GET parameters from the request. This means that if your system serves a different image, CSS file etc. depending on the GET parameters, you will get an invalid-cache problem, meaning you will serve the wrong content to many of your users.

Most CDNs have an option to include GET parameters in cache generation, but if you enable this, it is vital to make certain it will not invalidate all cache attempts at the CDN because all of your users receive unique GET parameters from the server (in that case the CDN would never be able to reuse the cache).

In addition, when you do an update, you do not want to invalidate the entire cache at the CDN, only the updated files. This is something that can be difficult with the Pull version (see the sketch below for one way around it).
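One common workaround, sketched here, is to bake a version into each file name instead of relying on GET parameters, so a deploy only changes the names of the files that actually changed and the rest of the CDN cache stays valid. The helper and the rewrite rule are illustrative, not something a particular CDN prescribes.

```php
<?php
// Hypothetical cache-busting helper: embed the file's modification time in
// its name, e.g. css/site.min.css -> css/site.min.1715600000.css.
function versioned_asset(string $path): string
{
    $file  = $_SERVER['DOCUMENT_ROOT'] . '/' . ltrim($path, '/');
    $stamp = is_file($file) ? filemtime($file) : 0;

    return preg_replace('/\.(\w+)$/', '.' . $stamp . '.$1', $path);
}
```

On the origin, a rewrite rule strips the version segment again so the request still maps to the real file:

```apache
# /css/site.min.1715600000.css -> /css/site.min.css
RewriteRule ^(.+)\.\d+\.(css|js|png|jpe?g|gif|svg)$ $1.$2 [L]
```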


Thanks for your great reply, @TheRedDevil. Yep, I definitely admit to being confused. :stuck_out_tongue:

Yeah, I suspected that. However, if you find that it’s taking 4 or more seconds for that initial HTML content to reach certain parts of the world, I’d still assume that’s something to be concerned about—given that some sites reach all parts of the world in milliseconds.

The books and articles I’ve read on fast websites sort of leave the issue of latency to one side, which made me wonder more about this side of things. But I do realize that optimizing a site’s assets etc. is crucial to good load times. I just sometimes wonder if it’s worth the trouble if your server is still slow to deliver to various locations.

That is true, but in a scenario like this it would in most cases be because the user is on a very slow internet connection—for example, accessing the site on their mobile over a bad EDGE connection.

In other words, even if you had a CDN it would not have helped the user in the example above.

On a side note, I forgot to mention this in my previous reply.

On latency tests like Sucuri’s, it is important that you enter the website domain the way it is actually used.

If the website is accessed with www., you need to include it. If it is not, you should leave it off.

The reason is that if you add it when it is not used (or vice versa), an extra DNS lookup will be required, which will greatly increase the time.


Another point to consider, as far as desktops are concerned: I think the majority of sites will be opened in a background tab while the user continues to read the current page. Personally, after searching, I always open about three tabs and then switch to the first, which has hopefully rendered by then.

I have yet to find a mobile browser which handles tabs gracefully, so I usually open a site directly and really notice the rendering time.

Off Topic:

Why is it that some sites refuse to render until all the adverts have been downloaded?

That is due to how they include the JS code.

If you include and initiate the JS code at the top of the page, the browser will execute the code before it continues downloading and rendering the rest of the page. (To clarify, if you add the JS code in the middle of the markup, the browser will render up to that point, then execute the JS code, before it continues with the rest.)

The reason this happens is that the browser needs to make certain whether the JS code makes any changes to the content of the page.
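A quick illustration (the ad script URL is made up); moving the script lower in the page, or marking it defer or async, stops it from blocking the content above it:

```html
<!-- Blocking: parsing stops here until ads.js has downloaded and executed -->
<script src="https://ads.example.com/ads.js"></script>

<!-- Non-blocking alternatives -->
<script src="https://ads.example.com/ads.js" defer></script>  <!-- runs after the HTML is parsed -->
<script src="https://ads.example.com/ads.js" async></script>  <!-- runs as soon as it has downloaded -->
```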


Many thanks for the simple explanation.

[off topic]
I now remember a post from @Black_Max about delaying the rendering of the top menu/logo div to ensure the SEO page content was delivered before the menu text.

His technique was:

  1. create a blank menu/logo placeholder at the top of the page
  2. store the actual menu/logo content just before </body>
  3. use position: fixed; top: 0; left: 0 to fill in the reserved space (rough sketch below).
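If I remember it correctly, a rough sketch of that technique would look something like this (class names and the 60px height are invented for illustration; in practice the CSS would live in the stylesheet):

```html
<body>
  <!-- 1. Blank placeholder reserves the menu/logo space at the top -->
  <div class="menu-placeholder"></div>

  <!-- The SEO page content comes first in the source -->
  <p>Main page content that should be delivered first...</p>

  <!-- 2. The actual menu/logo content is stored just before </body> -->
  <div class="menu">Logo and navigation here</div>

  <style>
    .menu-placeholder { height: 60px; }  /* same height as the menu */
    .menu {
      position: fixed;                   /* 3. fill the reserved space */
      top: 0;
      left: 0;
      width: 100%;
      height: 60px;
    }
  </style>
</body>
```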

This topic was automatically closed 91 days after the last reply. New replies are no longer allowed.