Saving Bandwidth on Slow Connections with Saveba.js

By Aurelio De Rosa

Performance, accessibility, and security have been among the most discussed topics of recent months, at least in my opinion. I’m very interested in them, and I try to get my head around each subject by reading about the new techniques and best practices unveiled by the gurus of these fields. If you’re a front-end developer, you should too, because these are the hottest subjects right now.

In this article I’ll focus on performance by discussing a JavaScript library I’ve developed, called Saveba.js. It tries to improve the performance of a website, and thus the experience of its users, by avoiding the download of certain resources based on the user’s connection. I’ll also explain why I developed it, detailing the problems with the current approaches used by developers.

The Problem

When people talk about performance, the discussion always ends up including mobile. It’s certainly true that a website should be optimized for any device and connection, but often home and office connections are faster than mobile ones. Some of the most common techniques to optimize a website today are combining and minifying CSS and JavaScript files, loading JavaScript files asynchronously, providing modern font formats (WOFF and WOFF2), optimizing for the critical rendering path, etc.
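For example, loading a JavaScript file asynchronously only requires the async attribute (the file name here is a placeholder):

<script src="path/to/app.js" async></script>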

Another important concept to take into account is the optimization of images. According to the latest HTTPArchive report, images account for more than 60% of a page’s total weight on average. To address this issue, many developers use tools like Grunt or Gulp, or services like TinyPNG or JPEGMini, to reduce their weight. Another practice is to employ the new srcset attribute and the new picture element to provide versions of the images optimized for the size of the viewport, as shown in the markup below. But this is not enough.
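To give you an idea, here’s what those two features typically look like in markup (the file names and breakpoints are placeholders, not taken from a real site):

<!-- Resolution switching with srcset -->
<img src="photo-small.jpg"
     srcset="photo-small.jpg 1x, photo-large.jpg 2x"
     alt="A photo of me">

<!-- Art direction with the picture element -->
<picture>
   <source media="(min-width: 800px)" srcset="photo-large.jpg">
   <source media="(min-width: 400px)" srcset="photo-medium.jpg">
   <img src="photo-small.jpg" alt="A photo of me">
</picture>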

Back in August, I wrote an article about the Network Information API, where I expressed my concern about the limitations of this approach. In particular I wrote:

While this approach works well for serving up images of the right size and resolution, it isn’t ideal in all situations, video content being one example. What we really need in these cases is more information about the device’s network connection.

What I wanted to express is that if a user is on a really, really slow connection, they may not care about decorative images, or resources in general, and may want to focus on what really matters. Consider the following image, which shows the current version of my website as seen on a Samsung Galaxy S3:

A screenshot of Aurelio De Rosa's website as seen on a Samsung Galaxy S3

In this screenshot I have marked two images with a red border: a logo and an image of me. Now, the question is: “would a user with a 2G connection care about those images, even if they’re heavily optimized?” Not surprisingly, the answer is “No!” So, even if I can optimize those images for small devices, what I really need is to avoid downloading them completely for users on a given type of connection, or a set of connections like GPRS, EDGE, and UMTS. On the other hand, I do want to show those images to users visiting my website on a small device who are using a fast connection. My attempt to solve this issue ended up with the creation of Saveba.js.

Introducing Saveba.js

Saveba.js is a JavaScript library that, relying on the Network Information API, tries to save bandwidth for users with a slow connection by removing unnecessary resources (at the moment, images only). By “removing” I mean that Saveba.js replaces the images with a 1×1 px transparent GIF, so that users won’t see broken images while browsing the website. As for what’s considered unnecessary: if the user’s connection is slow, the library treats any non-cached image as unnecessary; on an average connection, only non-content images (those with an empty alt attribute) that aren’t in the browser’s cache are considered unnecessary. If the user has a fast connection, the library won’t perform any operation.

For more information about how connections are classified, refer to the README of the library. Note that because Saveba.js is at a very early stage, I strongly suggest that you don’t use it in production. However, you may want to keep an eye on it.

Key Points of Saveba.js

In this section I’ll highlight some parts of the code to show you how I made the library. First, I set up some default values that will help in classifying the connection in use and in skipping any resources the developer wants the library to ignore:

// Default values.
// Later exposed as saveba.defaults
var defaults = {

   // A NodeList or an Array of elements the library must ignore
   ignoredElements: [],

   // A Number specifying the maximum speed in MB/s after which
   // a connection isn't considered slow anymore
   slowMax: 0.5,

   // A Number specifying the minimum speed in MB/s after which
   // a connection is considered fast
   fastMin: 2
};
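Because these values are later exposed as saveba.defaults, a page that includes the library can tweak them. Here’s a minimal sketch, assuming the new values take effect before the library processes the page (the threshold values and the .logo selector are invented for illustration; the README covers the actual configuration options):

<script>
   // Treat connections below 0.25 MB/s as slow and above 1.5 MB/s as fast
   saveba.defaults.slowMax = 0.25;
   saveba.defaults.fastMin = 1.5;
   // Never touch the site logo
   saveba.defaults.ignoredElements = document.querySelectorAll('.logo');
</script>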

The second step is to detect if the browser in use supports the Network Information API. If the API isn’t implemented, I terminate the execution of the code:

var connection = window.navigator.connection    ||
                 window.navigator.mozConnection ||
                 null;

// API not supported. Can't optimize the website.
// (This snippet runs inside the library's closure, hence the early return.)
if (!connection) {
   return false;
}

The third step is to classify the connection in use based on the current configuration and on the version of the API supported:

// Test whether the API supported is compliant with the old specifications
var oldApi = 'metered' in connection;
var slowConnection = (oldApi && (connection.metered || connection.bandwidth < defaults.slowMax)) ||
   (!oldApi && (connection.type === 'bluetooth' || connection.type === 'cellular'));
var averageConnection = oldApi &&
   !connection.metered &&
   connection.bandwidth >= defaults.slowMax &&
   connection.bandwidth < defaults.fastMin;
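To make the thresholds concrete, here’s how a few example connection objects would be classified by the code above when the default values are used (the property values are invented for illustration):

// Old API: {metered: false, bandwidth: 0.3} -> slow    (0.3 < slowMax)
// Old API: {metered: true,  bandwidth: 5}   -> slow    (metered connections are always slow)
// Old API: {metered: false, bandwidth: 1}   -> average (slowMax <= 1 < fastMin)
// Old API: {metered: false, bandwidth: 8}   -> neither flag set, treated as fast
// New API: {type: 'cellular'}               -> slow
// New API: {type: 'wifi'}                   -> neither flag set, treated as fast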

Next, I retrieve the resources the library can optimize (at the moment, images only) and filter out those that are in the browser’s cache or that the developer wants to be ignored:

var elements;
if (slowConnection) {
   // Select all images (non-content images and content images)
   elements = document.querySelectorAll('img');
} else if (averageConnection) {
   // Select non-content images only
   elements = document.querySelectorAll('img[alt=""]');
} else {
   // Fast connection: nothing to do
   return false;
}

// Turn the NodeList into an Array
elements = [].slice.call(elements);

if (!(defaults.ignoredElements instanceof Array)) {
   defaults.ignoredElements = [].slice.apply(defaults.ignoredElements);
}

// Filter the resources specified in the ignoredElements property and
// those that are in the browser's cache.
// More info: http://stackoverflow.com/questions/7844982/using-image-complete-to-find-if-image-is-cached-on-chrome
elements = elements.filter(function(element) {
   return defaults.ignoredElements.indexOf(element) === -1 ? !element.complete : false;
});

Finally, I replace the remaining resources with the placeholder, keeping a reference to the original source in an attribute called data-saveba. The transparentGif variable holds the placeholder; it’s defined inline below (as the standard 1×1 px transparent GIF data URI) so the snippet is self-contained:

// A 1x1 px, transparent GIF encoded as a data URI
var transparentGif = 'data:image/gif;base64,R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7';

// Replace the targeted resources with the placeholder
for (var i = 0; i < elements.length; i++) {
   elements[i].dataset.saveba = elements[i].src;
   elements[i].src = transparentGif;
}
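Because the original URL survives in the data-saveba attribute, undoing the replacement later is straightforward. The following is not the library’s actual code, just a sketch of the idea behind its destroy() method:

// Restore every image the library has replaced
var replaced = [].slice.call(document.querySelectorAll('img[data-saveba]'));
replaced.forEach(function(img) {
   img.src = img.dataset.saveba;
   delete img.dataset.saveba;
});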

How to Use It in Your Website

To use Saveba.js, download the JavaScript file contained in the “src” folder and include it in your web page.

<script src="path/to/saveba.js"></script>

The library will automatically do its work, so you don’t have to call any method. Saveba.js also exposes a global object called saveba, available as a property of the window object, in case you want to configure it or undo its changes via the destroy() method.

In the next section we’ll briefly discuss how to use the destroy() method; for the configuration options, you can refer to the official documentation (I don’t want to duplicate content).

destroy()

In case you want to remove the changes performed by Saveba.js, you can invoke the destroy() method of the saveba object. For example, let’s say that the page on which the changes have been performed has a button with an ID of show-images-button. You can add a listener to its click event that restores all the resources, as shown below:

<script>
document.getElementById('show-images-button').addEventListener('click', function(event) {
   saveba.destroy();
});
</script>

Supported Browsers

Saveba.js relies completely on the presence of the Network Information API, so it works in the same browsers that support this API, which are:

  • Firefox 30+. Prior to Firefox 31, the browser supported the old version of the API; in Firefox 31 the API was disabled on desktop
  • Chrome 38+, but it’s only available in Chrome for Android, Chrome for iOS, and ChromeOS
  • Opera 25+
  • Browser for Android 2.2+

Demo

To see Saveba.js in action you can take a look at the live demo.

Conclusion

In this article I described some limitations of the current practices used to optimize a website, which led me to create Saveba.js. The latter is a JavaScript library that, relying on the Network Information API, tries to save bandwidth for users on a slow connection by removing unnecessary resources. After introducing it, I explained how the library works and how you can use it on your website, although at the moment you really shouldn’t use it in production.

Once again, I want to reinforce that this is a heavily experimental library and the solution used is not bullet-proof. Whether you liked it or disliked it, I’d really love to know your opinion, so I invite you to comment.

  • Vladimir López

    Do I have to include the library at the top of my HTML document, or can I put it anywhere? If so, how is the library supposed to help me if, by the time it comes into action, the DOM is already loaded (with the images already downloaded as well)?

  • Aurelio De Rosa

    You can place it at the bottom of your page, but you must not use attributes like async and defer. When the DOM is ready, no images have been downloaded yet; once they have been downloaded, the load event fires. If you don’t believe me, you can simply try the demo :)

  • Adam Youngers

    Very interesting project. It seems like something browser vendors should have baked in. I know some browsers let you opt out of loading images, but to my knowledge it’s not based on network conditions. One suggestion I have: instead of loading a transparent GIF, load a div with an option to selectively load images, similar to how some of the ad-block plugins work.

  • Alexander Farkas

    @aurelioderosa
    I don’t think you can reliably do this in most real browsers (think about speculative img parsing, etc.). For example, SitePoint starts to fully download 4 images in Firefox before the domready event has even fired on my connection. Additionally, Safari (mobile and desktop) never aborts/stops an already queued/started download if the img.src is changed, so not even the simple demo works there. Your demo works for me in Chrome and IE (but again, on a real website with more JS the preload parser might pick it up). In FF, sometimes an already pending img request is aborted. Although I don’t think your approach works in general, I’m curious whether you know how Android stock browsers handle these cases. Do you have some tests?

    • Aurelio De Rosa

      Hi.

      As I wrote in the article, this is more of a proof-of-concept at the moment, so it’s really NOT intended to be used on any website. It’s just a 0.1.0, which, as you might know, means it’s pretty much an experiment with very low reliability (like pretty much every 0.1.0 version).

      If I’m able to make it work in more cases (video, audio, and so on), then you could consider it a real library. That said, even if the demo only works in some browsers, for me it’s a good start. If you do nothing, you can optimize for 0% of your users; if you can do something, you could improve the experience for 40-50% of your users. For me this is a non-trivial improvement. Even more so considering that the kind of resources I’m trying to target are what really slow down a website, I’d consider it a success.

      With that in mind, consider that browsers have bugs too, so sometimes it isn’t (or might not be) a problem with the library itself. For example, Firefox has an issue in regard to how it manages resources, so I’ve filed a bug: https://bugzilla.mozilla.org/show_bug.cgi?id=1067541

      I’ve also seen a weird behavior in Chrome, Opera, and Firefox in how they deal with the video element, which I’ve reported here as an issue: https://code.google.com/p/chromium/issues/detail?id=426305

      As for Safari mobile, I don’t have an iPhone, so I haven’t tested it.

      In conclusion, this is a really new and open field with a heavily experimental API, so it’s possible that it’ll fail or succeed, who knows. I haven’t seen other experiments like mine in the wild, so I can’t refer to other performance tests or anything else. I’m excited about its potential, but this story may end with a simple “This library has been discontinued because it’s unable to help most users, sorry about that.”

      • Alexander Farkas

        Aurelio De Rosa

        You haven’t really answered my question here ;-). I just tested with the Android stock browser (version 4.1) and came up with the same result as iOS. Attaching a new src to an image while the old src is downloading doesn’t cancel/abort the download. While I get your point that this behavior can and should be improved by browser vendors, the current reality is different (both iOS and Android do download the images…), and not showing images that have actually been downloaded is a downgraded user experience.

        I’m currently heavily interested in how img downloading and so on works in browsers, because of two of my projects:
        https://github.com/aFarkas/lazysizes
        https://github.com/aFarkas/respimage

        And your article has already had some influence there (I’m handling Chrome and IE differently from other browsers now).

        But I think your approach is the wrong way to see it. This can best be explained with the Chrome issue you filed. All browsers have a speculative pre-parser to preload or queue the loading of (especially) stylesheets, scripts, and images (http://calendar.perfplanet.com/2013/big-bad-preloader/). This happens long before any JS is executed. As long as video isn’t part of this pre-parser algorithm, your Chrome issue could be fixed, but it would simply be the wrong way, because performance-wise the video element should also be considered inside the preload parser. Additionally, this means that on a real website with a lot of stylesheets and blocking and non-blocking JS, your script can never be executed before img downloading actually starts. For example: http://www.smashingmagazine.com/. With my current connection and FF, Smashing Magazine has fully downloaded 10 images and has furthermore started to partially download 9 more **before** domcontentloaded is triggered. Even if you use mutation observers to get all imgs as soon as possible, the preload parser simply runs long before you get them.

        This means: instead of serving a video with preload="auto" and switching to preload="none", you should serve it with preload="none" and then conditionally switch it to preload="auto".

        Or, generally speaking, instead of serving a heavy site and then trying to fix it when a low connection is detected, you should serve a low-connection website and switch to heavier images, scripts and so on when you detect a good bandwidth connection.
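        As a minimal sketch of this last idea, assuming a video element served with preload="none" and a slowConnection flag like the one computed in the article:

        // Upgrade preloading only when the connection is known to be good
        if (!slowConnection) {
           document.querySelector('video').preload = 'auto';
        }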

  • Aurelio De Rosa

    Hi Adam.

    Thank you for your suggestion. I really like your idea and I’m thinking about implementing it.
