Tackling Render Blocking CSS for a Fast Rendering Website

By Maria Antonietta Perna


This article is part of a series created in partnership with SiteGround. Thank you for supporting the partners who make SitePoint possible.

I’ve always thought deciding how and when to load CSS files is something best left to browsers. They handle this sort of thing by design. Why should developers do anything more than slap a link element with rel="stylesheet" and href="style.css" inside the <head> section of the document and forget about it?
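In other words, the familiar default, a plain render-blocking link in the head, which looks something like this:

<!DOCTYPE html>
<html>
<head>
  <!-- the default, render blocking way of referencing a stylesheet -->
  <link rel="stylesheet" href="style.css">
</head>
<body>
  <!-- page content -->
</body>
</html>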

Apparently, this may not be enough today, at least if you care about website speed and fast-rendering web pages. Given how much both of these factors impact user experience and the success of your website, it’s likely you’ve been looking into ways to gain some control over how browsers download stylesheets by default.

In this article, I’m going to touch on what could be wrong with the way browsers load CSS files and discuss possible ways of approaching the problem, which you can try out in your own website.

The Problem: CSS Is Render Blocking

If you’ve ever used Google PageSpeed Insights to test website performance, you might have come across a message like this:

[Image: message from Google PageSpeed Insights about render-blocking CSS]

Here Google PageSpeed Insights states the problem and offers a strategy to counteract it.

Let’s try to understand the problem a bit better first.

Browsers use the DOM (Document Object Model) and the CSSOM (CSS Object Model) to display web pages. The DOM is a model of the HTML which the browser needs in order to be able to render the web page’s structure and content. The CSSOM is a map of the CSS, which the browser uses to style the web page.

Because CSS is on the critical rendering path, the browser holds off rendering the web page until all the HTML and style information has been downloaded and processed. This is why both HTML and CSS files are considered render blocking. External stylesheets in particular can involve multiple round trips between browser and server, which may cause a delay between the time the HTML has been downloaded and the time the page renders on the screen.

The problem lies in this delay, during which users could be staring at a blank screen for a number of milliseconds. Not the best experience when first landing on a website.

The Concept of Critical CSS

HTML is clearly essential to page rendering (without it there would be nothing to render), but can we say the same about CSS?

Of course, an unstyled web page is not user friendly, and from this point of view it makes sense for the browser to wait until the CSS is downloaded and parsed before rendering the page. On the other hand, you could argue that not all style information is critical to building a usable page. What users are immediately interested in is the above-the-fold portion of the page, that is, the portion they can consume without needing to scroll.

This thought lies behind the dominant approaches available today for solving this problem, including the suggestion contained in the Google PageSpeed Insights message reported above. The bit of interest there reads:

Try to defer or asynchronously load blocking resources, or inline the critical portions of those resources directly in the HTML.

But how do you defer or asynchronously load stylesheets on your website?

Not as straightforward as it might sound. You can’t just toss an async attribute on the link element as if it were a <script> tag.

Below are a few ways in which developers have tackled the problem.

Take Advantage of Media Types and Media Queries to Minimize Render Blocking

Ilya Grigorik illustrates a simple way of minimizing render blocking which involves two steps:

  • Split the content of your external CSS into different files according to media type and media queries, thereby avoiding big CSS documents that take longer to download.
  • Reference each file with the appropriate media type and media query inside the link element. Doing so prevents some stylesheet files from being render blocking if the conditions set out by the media type or media query are not met.

As an example, references to your external CSS files in the <head> section of your document might look something like this:

<link href="style.css" rel="stylesheet">
<link href="print.css" rel="stylesheet" media="print">
<link href="other.css" rel="stylesheet" media="(min-width: 40em)">

The first link in the snippet above doesn’t use any media type or media query. The browser treats it as matching all cases, and therefore always delays page rendering while fetching and processing the resource.

The second link has a media="print" attribute, which is a way of telling the browser: “Howdy browser, make sure you apply this stylesheet only if the page is being printed”. In the context of the page simply being viewed on the screen, the browser knows that stylesheet is not needed, therefore it won’t block page rendering to download it.

Finally, the third link contains media="(min-width: 40em)", which means that the stylesheet being referenced in the link is render-blocking only if the conditions stated by the media query match, i.e., where the viewport’s width is at least 40em. In all other cases the resource is not render blocking.
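To make the split concrete, the three files might contain something along these lines (the selectors and rules below are made up purely for illustration):

/* style.css: base styles, always render blocking */
body {
  font: 100%/1.5 sans-serif;
  color: #333;
}

/* print.css: only applied when the page is printed */
nav, .banner {
  display: none;
}

/* other.css: only render blocking when the viewport is at least 40em wide */
.sidebar {
  float: right;
  width: 30%;
}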

Also, take note that the browser will always download all stylesheets, but will give lower priority to non-blocking ones.

To sum up, this approach doesn’t completely eliminate the problem, but can considerably minimize the time browsers block page rendering.

Using the preload Directive on the Link Element

A cutting-edge feature you could leverage to stop browsers from blocking page rendering while they fetch external stylesheets is the standards-based preload directive.

preload is how we, as developers, can tell browsers to fetch specific resources because we know the page is going to need those resources quite soon.

Scott Jehl, a designer and developer at Filament Group, can be credited as the first to tinker with this idea.

Here’s what you need to do.

Add preload to the link referencing the stylesheet, add a tiny bit of JavaScript magic using the link’s onload event, and the result is like having an asynchronous loading of the stylesheet directly in your markup:

<link rel="preload" as="style" href="style.css" onload="this.rel='stylesheet'">

The snippet above gets the browser to fetch style.css in a non-blocking way. After the resource has finished loading, i.e., the link’s onload event has fired, the script replaces the preload value of the rel attribute with stylesheet and the styles are applied to the page.

At this time browser support for preload is confined to Chrome and Opera. But fear not, a polyfill provided by the Filament Group is at hand.

More on this in the next section.

The Filament Group’s Solution: From Inlining CSS to HTTP/2 Server Push

Solving the problem of delayed page rendering due to fetching of critical resources has long been the highest priority for the Filament Group, an award-winning design and development agency based in Boston:

Our highest priority in the delivery process is to serve HTML that’s ready to begin rendering immediately upon arrival in the browser.

The latest update on optimizing web page delivery outlines their progress from inlining critical CSS to super efficient HTTP/2 Server Push.

The overall strategy looks like this:

  1. Any assets that are critical to rendering the first screenful of our page should be included in the HTML response from the server.
  2. Any assets that are not critical to that first rendering should be delivered in a non-blocking, asynchronous manner.

Before HTTP/2 Server Push, implementing #1 consisted of selecting the critical CSS rules, i.e., those needed for default styles and for a seamless rendering of the above-the-fold portion of the web page, and cramming them into a <style> element inside the page’s <head> section. Filament Group devs devised criticalCSS, a handy JavaScript tool, to automate the task of extracting critical CSS.
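Before Server Push, then, the top of the document would end up looking roughly like this (the rules shown are invented, just to illustrate the idea of inlining the extracted critical styles):

<head>
  <style>
    /* critical, above-the-fold rules extracted by the tool and inlined here */
    body { margin: 0; font: 100%/1.5 sans-serif; }
    .site-header { background: #222; color: #fff; }
    .hero { min-height: 60vh; }
  </style>
  <!-- the remaining, non-critical CSS is loaded asynchronously (see the loadCSS snippet below) -->
</head>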

With the recent adoption of HTTP/2 Server Push, Filament Group has moved beyond inlining critical CSS.

What is Server Push? What is it good for?

Server push allows you to send site assets to the user before they’ve even asked for them. …

Server push lets the server preemptively “push” website assets to the client without the user having explicitly asked for them. When used with care, we can send what we know the user is going to need for the page they’re requesting.

Jeremy Wagner

In other words, when the browser requests a particular page, let’s say index.html, all the assets that page depends upon, e.g., style.css, main.js, etc., can be included in the response, giving page rendering speed a considerable boost.
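How you trigger Server Push depends on your server. As a rough sketch only (the file names, paths and certificate files are assumptions, and a production setup would need error handling and caching headers), a Node.js server using the built-in http2 module could push style.css along with index.html like this:

const http2 = require('http2');
const fs = require('fs');

const server = http2.createSecureServer({
  key: fs.readFileSync('server.key'),   // assumed local TLS key
  cert: fs.readFileSync('server.crt')   // assumed local TLS certificate
});

server.on('stream', (stream, headers) => {
  if (headers[':path'] === '/') {
    // preemptively push the stylesheet the page is going to request
    stream.pushStream({ ':path': '/style.css' }, (err, pushStream) => {
      if (!err) {
        pushStream.respondWithFile('style.css', { 'content-type': 'text/css' });
      }
    });
    // then respond with the HTML itself
    stream.respondWithFile('index.html', { 'content-type': 'text/html' });
  }
});

server.listen(8443);

Many HTTP/2 servers can also be configured to push resources listed in a Link: </style.css>; rel=preload; as=style response header, which is often easier to set from application code.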

Putting #2 into practice, i.e., the asynchronous loading of non-critical CSS, is done using a combination of the preload directive technique discussed in the previous section and loadCSS, a Filament Group JavaScript library that includes a polyfill for browsers that don’t support preload.

Using loadCSS involves adding this snippet to your document’s <head> section:

<link rel="preload" href="mystylesheet.css" as="style" onload="this.rel='stylesheet'">
<!-- for browsers without JavaScript enabled -->
<noscript><link rel="stylesheet" href="mystylesheet.css"></noscript>

Notice the <noscript> block, which references the stylesheet in the usual way to cover those situations where JavaScript fails or has been disabled.

Below the <noscript> tag, include loadCSS and the loadCSS rel=preload polyfill inside a <script> tag:

<script>
/*! loadCSS. [c]2017 Filament Group, Inc. MIT License */
(function(){ ... }());
/*! loadCSS rel=preload polyfill. [c]2017 Filament Group, Inc. MIT License */
(function(){ ... }());
</script>

You can find detailed instructions on using LoadCSS in the project’s GitHub repo and see it in action in this live demo.

I lean towards this approach: it’s got a progressive enhancement outlook, it’s clean and developer friendly.

Placing the Link to the Stylesheet Files in the Document Body

According to Jake Archibald, one drawback of the previous approach is that it relies on a major assumption about what you or I consider to be critical CSS. It should be obvious that all styles applied to the above-the-fold portion of the page are critical. However, what exactly constitutes “above the fold” is anything but obvious, because viewport dimensions change the amount of content users can access without scrolling. Therefore, a one-size-fits-all guess is not always reliable.

Jake Archibald proposes a different solution, which looks like this:

<head>
</head>
<body>
<!-- HTTP/2 push this resource, or inline it -->
<link rel="stylesheet" href="/site-header.css">
<header>…</header>

<link rel="stylesheet" href="/article.css">
<main>…</main>

<link rel="stylesheet" href="/comment.css">
<section class="comments">…</section>

<link rel="stylesheet" href="/about-me.css">
<section class="about-me">…</section>

<link rel="stylesheet" href="/site-footer.css">
<footer>…</footer>
</body>

As you can see, references to external stylesheets are not placed in the <head> section but inside the document’s <body>.

Further, the only styles that are rendered immediately with the page, either inlined or using HTTP/2 Server Push, are those that apply to the upper portion of the page. All other external stylesheets are placed just before the chunk of HTML content to which they apply: article.css before <main>, site-footer.css before <footer>, etc.

Here’s what this technique aims to achieve:

The plan is for each <link rel="stylesheet"/> to block rendering of subsequent content while the stylesheet loads, but allow the rendering of content before it. The stylesheets load in parallel, but they apply in series.

… This gives you a sequential render of the page. You don’t need to decide what’s “above the fold”, just include a page component’s CSS just before the first instance of the component.

All browsers seem to allow the placement of <link> tags in the document’s <body>, although not all of them behave exactly in the same way. That said, for the most part this technique achieves its goal.

You can find all the details in Jake Archibald’s post and check out the result in this live demo.

Conclusion

The problem with the way in which browsers download stylesheets is that page rendering is blocked until all files have been fetched. This means user experience when first landing on a web page might suffer, especially for those on flaky internet connections.

Here I’ve presented a few approaches that aim to solve this problem. I’m not sure if small websites with relatively little and well-structured CSS would be much worse off if they just left the browser to its own devices when downloading stylesheets.

What do you think: does lazy loading stylesheets, or applying them sequentially (Jake Archibald’s proposal), work as a way of improving page rendering speed on your website? Which technique do you find most effective?

Hit the comment box below to share your thoughts.

  • M S i N Lund

    This again?

    So the plan is STILL, for us all to bend over backward until our heads are all the way up inside our own asses to cater to that one broken POS validation-page from google,
    and to rearrange the whole web and hack our way around their tender little princess butt in more and more ridiculously roundabout ways?

    Are we REALLY supposed to split the css up and place bits and pieces of it next to the tags where it’s needed now?
    Where, oh, where have i seen that before?

    Hmm… but maybe if we put the css inside a data-attribute, and then use an on-load to turn it into inline css, and then.., and then..
    Is it still blocking if we encrypt it and fetch it as a binary?
    Lets ask The Old Man in the Cave.
    100/100 !!!

    Whoopee, let’s put toilet-water on ALL our pages!

    Critical CSS?
    How about critical thinking?

    And also;
    Are we seriously going to pretend that the company that made low-contrast design popular, cares about how CSS makes web pages look?

    • COULDN’T GET YOU :-

    • I would say that there’s definitely at least some truth to render-blocking things. For example, you should combine CSS into one file and serve that first, then everything else, so that at least the page “feels” faster.

      A lot of great performance can be automated. Critical inlined CSS reduces 1 HTTP request and can make the DOM faster to load, for sure.

      So you shouldn’t just circlejerk because URGHY DURGHTY GOOGLE IS BAD UGRWEGGNWOGN- something something…

      • M S i N Lund

        Yeah good point.
        Fuck me, right?

  • jcamachott

    overthinking, much? One css file for me please…thank you.

    • Not so much.
      You just need to add a few link tags on the end of the body tag.

  • Woo Hoo
    This is amazing B-) B-)
    Now I can put link tags in the body.

    Just put them in the end of the body tag.
    Simple & Neat

    Instead of Media queries, I will just use the media attribute.
    So Cool

  • Maria Antonietta

    These techniques represent what some top-notch devs have come up with to give website visitors the experience of immediate page load. I don’t think small websites or personal blogs will lose out if they just reference a stylesheet at the top of the document and let the browser do its job the way it is designed to do. However, big companies, especially those with websites people visit to buy stuff, are motivated to set aside a performance budget and hire developers to do some extra work (because these methodologies require more work), not because Google PageSpeed Insights tells them to do so. Rather, it’s the market that tells them what users expect when they land on their websites using different devices with different capabilities: analytics, surveys, competitor analyses, etc., all tell them that if they don’t act fast their bottom line could be at risk.

    • M S i N Lund

      Bullshit!
      “the experience of immediate page load”?

      If it’s fine to wait for the HTML-file to be requested, generated at the server, transferred, and rendered together with all its external resources (except of course that horrible external CSS), and THAT counts as “immediate”
      …then it bloody well also counts as “immediate” to also get a css-file from cache, or even from the server.

      If you are too busy to wait for the css file, then HOW on earth are you going to stand waiting for the HTML-file?

      Of all the things to choose not to cache… the CSS?!

      • Maria Antonietta

        There are ways to handle all those external resources too in view of a performance gain. The CSS file from cache works great except for the first page load, which is what’s in question here. As I said, nothing is mandatory here: if you think it’s BS, that’s fine. I’m just letting you know, some great developers have come up with this to solve a specific issue, not because they didn’t know what to do with their time, but because someone out there considered this a problem and needed a solution.

  • Sujith Gopi Nath

    Great combined article :)

  • Chris Clancy

    A lot of work to simply pass page speed tool validation and negligible improvements to the end user. How about analysis of the amount of time saved and the noticeable difference for users, if any (which I doubt there is)? When you consider that devs will do all of this work, all it takes is one content editor to include a slightly bigger image file and ruin all your hard work because of the time taken for that one extra image to load. This is not to even mention the huge overhead of supporting multiple style sheets in different parts of a page – for many it is simply not practical. I’m coming at this from a CMS point of view and I consider this to be incredibly time consuming unless a large enough % of your users are suffering from really poor download speeds – even then they are used to slow loading and a css file is going to make little difference. I do like the idea but I’m just not convinced that it is worth it. Let’s also remember that download speeds are getting faster for most users.

    • Maria Antonietta

      There’s more work involved, I agree, but whether it’s worth it or not is something to be decided on a case-by-case basis in accordance with user data, the kind of business, and the budget at one’s disposal. I’m not advocating this as something we should all adhere to. If website owners look at the data and decide that initial page load is a problem (not just because they want to pass the Google PageSpeed test), the techniques outlined in the article are an option. Obviously, any of those methods need to be part of an overall strategy to improve performance; they wouldn’t accomplish much by themselves.

  • I can see how this might not be worth the time and effort for sites with simple/small stylesheets, but it does make sense to load things in the order that the user needs them. As far as I understand, this is one of the core principles behind modular front-end frameworks like React, which many sites, including Facebook, run on. A component-based code structure for a website is probably overkill in most cases, but in terms of the development process, sites that re-use sizable UI components could benefit from having the HTML and CSS in one module (like the site’s header for example, which would load first). So yeah, probably not gonna kill your site if you don’t do this, but it won’t hurt (if done correctly) and could possibly reduce your bounce rate if people are leaving the site out of impatience with slow loading (which I, for one, tend to do a lot).

    On a related note, I recently went to a developer conference where someone gave a talk about efficiently streaming a VR/360 video by knowing where the user is looking and only loading the high resolution portion there. The really cool part was that they have to use predictive algorithms to ‘guess’ which direction the user will move their head in next, since there is a lag with requesting and receiving streaming video data (especially the hi-res part). Similar concept to this I suppose.

  • M S i N Lund

    I’d like to see someone write an article where they call google out on their broken POS validation-page, rather than blindly recommend the reader to blindly cater to it.

    I just had another go at that thing, and I noted the following:

    * Even if you have a page with just “Hello” on it, google will still complain that it can’t render that one word of text because there is “render blocking” css.
    REALLY?
    What part of that word did the user have to wait for, and how long was the delay?

    * Even if you put your ENTIRE css inlined in the head, google will still complain that it can’t render it because there is “render blocking” css in an external file.
    REALLY?
    Exactly WHAT part of the CSS that you already had before the page loaded, didn’t you have?

    JFC…

    • It’s not like you HAVE TO please Google…

      • M S i N Lund

        No but I use the web.

        And I suffer the consequences of bad advice when I’m forced to use sites with shitty code every day.

        I don’t want *anyone* teaching people to bloat pages even more on top of what is already done.

        I have been using the web for 20+ years.
        And gone from dial-up to a 1Gig fiber.
        And from a 12″ SVGA screen to a 40″ UHD.
        And my computer now literally has 1000+ times more cpu-power and ram and storage.

        So why then, does the web *still* feel slow, and I still have to scroll on pages with barely any content, etc etc.

        Use view-source on a shitty site, and you’ll know.

        And now I’m seeing more and more huge blocks of crap crammed into the head on these slow POS sites.

        This is *not* progress.

        And who’s fault is it?

        Could it be the people teaching noobs and script-kiddy devs to do stupid things like this?

        A… YUP!

        • Maria Antonietta

          There can be many reasons why a website is slow, and if you think one of them is that the site has too much inline CSS code, then someone is doing something the wrong way. This is not what any of the above techniques is about. Anyway, nothing is written in stone; this is only where things are at now. Already, developers at the Filament Group have left inline CSS behind and moved on to HTTP/2 Server Push. Hopefully, even better techniques will come along, but I personally don’t think all this work is being done to please Google PageSpeed.

        • Or could it be the fact that CSS is actually really fast but their JavaScript slows everything down?

          A . . . YUP!

          Do you even know how annoying you are, btw?

  • Thanks for your article. Recently I’ve set up inline critical CSS for the first time for my most recent project, and together with all the other proper best practices (gzip, leveraging browser caching, proper image compression, etc.) the site scores 87/100. I bet with some more tweaking it might hit the 90s. Other online performance tests also give back positive results (even more positive than PageSpeed Insights).

    • Maria Antonietta

      Thank you for your contribution to this discussion, Niels. You’re integrating this technique into an overall performance strategy and it’s working. I’m happy you’re getting positive results :)
