Creating Accurate Timers in JavaScript

Asynchronous timers are the cornerstone of all time-based processes in JavaScript. From obvious things like clocks and stopwatches, to visual effects and animation, to the synchronised delays that are vital to the usability of dropdown menus.

But the problem with JavaScript timers is that they’re not very accurate. We couldn’t make a stopwatch simply by incrementing a time counter on each iteration, because it wouldn’t stay in time:

var time = 0,
    elapsed = '0.0';

window.setInterval(function()
{
    // assume exactly 100ms has passed since the last iteration
    time += 100;

    // convert to seconds with one decimal place, padding whole seconds with ".0"
    elapsed = Math.floor(time / 100) / 10;
    if(Math.round(elapsed) == elapsed) { elapsed += '.0'; }

    document.title = elapsed;

}, 100);

Web browsers, like all applications, take turns for a piece of CPU time, and the time they have to wait will vary, depending on the load. This is what causes the latency in asynchronous timers — a 200ms timer may actually take 202ms, or 204, and this will gradually send the stopwatch out of time.

The solution in this case is not to rely on the speed of the timer at all, but rather, to query the system time freshly each loop and derive the output from that:

var start = new Date().getTime(),
    elapsed = '0.0';

window.setInterval(function()
{
    // derive the elapsed time from the system clock rather than the iteration count
    var time = new Date().getTime() - start;

    elapsed = Math.floor(time / 100) / 10;
    if(Math.round(elapsed) == elapsed) { elapsed += '.0'; }

    document.title = elapsed;

}, 100);

But what about less literal applications, like animation — can we use the same approach to make them just as accurate?

Can’t beat the system

Recently I’ve been working on some visual transitions, and I found myself pondering exactly that question: If a user specifies a 5-second animation, what can we do to make that animation really last 5 seconds, and not 5.1 or 5.2? It may be a small difference, but small differences add up. And anyway, making things better is an end in its own right!

So to cut to the chase — we can indeed use the system clock to compensate for timer inaccuracy. If we run an animation as a series of setTimeout calls — each instance calling the next — then all we have to do to keep it accurate is work out exactly how inaccurate it is, and subtract that difference from the next iteration:

var start = new Date().getTime(),
    time = 0,
    elapsed = '0.0';

function instance()
{
    time += 100;

    elapsed = Math.floor(time / 100) / 10;
    if(Math.round(elapsed) == elapsed) { elapsed += '.0'; }

    document.title = elapsed;

    // work out how far we've drifted from the expected time,
    // and subtract that from the next iteration's delay
    var diff = (new Date().getTime() - start) - time;
    window.setTimeout(instance, (100 - diff));
}

window.setTimeout(instance, 100);

To see the practical effect of all this, I’ve prepared a demo that illustrates the difference between normal and adjusted timers — Self-adjusting timer examples.

The demo has three examples:

  1. The first example (left) is just an ordinary timer implemented with setInterval, which shows the cumulative difference between the elapsed time by iteration count and the actual time by the system clock (a rough sketch of this measurement follows the list);
  2. The second example (center) ramps up the amount of work the browser is doing on each iteration, to show how more work means more latency, and therefore much greater inaccuracy;
  3. The third example (right) does just as much work as the second, but now uses the self-adjusting technique, to illustrate what a profound difference it makes to the overall accuracy.
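
To give a rough idea of what that first example is measuring (a simplified sketch, not the demo’s actual code), we can compare the time implied by the iteration count against the time reported by the system clock, and display the cumulative difference:

var start = new Date().getTime(),
    ticks = 0;

window.setInterval(function()
{
    ticks++;

    var assumed = ticks * 100,                   // elapsed time implied by the iteration count
        actual = new Date().getTime() - start,   // elapsed time according to the system clock
        drift = actual - assumed;                // cumulative difference between the two

    document.title = 'Drift: ' + drift + 'ms';

}, 100);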

The great thing about this approach is that it really doesn’t matter how inaccurate the timer would otherwise be: the adjustments will always keep it in time. A small amount of constant latency is easily compensated for, but equally, a sudden large spike of latency caused by a surge of processor use (like starting an application) is just as easily moderated. And we can see from the un-adjusted examples how, even though a single iteration’s worth of inaccuracy is quite small, the cumulative effect can be incredibly large.

Of course even the adjusted timer can’t compensate 100% — it adjusts the speed of the next iteration, and so can’t compensate for the latency of the last iteration. But still, whatever this difference amounts to, it will be tiny compared with the cumulative effect of that discrepancy multiplied by hundreds or thousands of instances.

Go with the flow

I said earlier that I had this idea while working on animation (for an update to my popular Image transitions), and the following is the abstraction I eventually came up with. It implements a self-adjusting timer, calculates the speed and steps from the input length and resolution, and provides callbacks for on-instance and on-complete:

function doTimer(length, resolution, oninstance, oncomplete)
{
    // derive the total number of steps and the ideal delay between them
    // from the overall length (ms) and the resolution (steps per second)
    var steps = (length / 100) * (resolution / 10),
        speed = length / steps,
        count = 0,
        start = new Date().getTime();

    function instance()
    {
        if(count++ == steps)
        {
            oncomplete(steps, count);
        }
        else
        {
            oninstance(steps, count);

            // compare the actual elapsed time with where we should be,
            // and adjust the next timeout to compensate for the difference
            var diff = (new Date().getTime() - start) - (count * speed);
            window.setTimeout(instance, (speed - diff));
        }
    }

    window.setTimeout(instance, speed);
}

Here’s a simplified example of its use, which does an opacity fade on an image (using standard syntax only) over 5 seconds at 20 frames/second — Self-adjusting opacity fade:

var img = document.getElementById('image');

var opacity = 1;
img.style.opacity = opacity;

doTimer(5000, 20, function(steps)
{
    // reduce the opacity by an equal fraction on each step
    opacity = opacity - (1 / steps);
    img.style.opacity = opacity;
},
function()
{
    // make sure the image ends fully transparent, whatever rounding has occurred
    img.style.opacity = 0;
});

And there you have it — self-adjusting timers make for better animation, giving you confidence that when you specify a 5-second effect, you’ll get one that lasts 5 seconds!

Thumbnail credit: Yukon White Light

  • Anonymous

    I haven’t checked this, and when I tried to look it up I couldn’t find an answer: if you send a negative value into setTimeout, will it give an error, use zero, or what? This could be a problem if there is a huge spike in processor usage and/or the interval is short, because that would cause the delay that is sent into setTimeout to be negative.

    If negative numbers would end up being problematic, then you could just add a check for the negative number, and if it is negative, then just skip the setTimeout and run the instance() function immediately.
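
    For example (an untested sketch of that guard, not code from the article), the self-adjusting stopwatch’s scheduling step could be written like this:

    var start = new Date().getTime(),
        time = 0;

    function instance()
    {
        time += 100;

        // ... update the display as in the article's example ...

        var diff = (new Date().getTime() - start) - time,
            next = 100 - diff;

        if(next <= 0)
        {
            instance();   // already behind schedule: run immediately rather than pass a negative delay
        }
        else
        {
            window.setTimeout(instance, next);
        }
    }

    window.setTimeout(instance, 100);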

  • http://bitdepth.wordpress.com/ mmj

    There are a few issues here – the approach in this article may not actually work (at least, on some operating systems).

    There are two things to keep in mind with traditional timers (settimeout, setinterval) in Javascript.

    One is that browsers enforce a minimum delay in milliseconds. This is usually less than 15ms and varies not only across browsers but across versions and ports of browsers. In Firefox on Linux it’s typically 10 or 11ms; on Windows it’s limited by the Windows timer (see below) unless an app (like Chrome) which puts Windows into accurate timer mode is running. On Chrome, the minimum is typically 2 or 4ms depending on version.

    The other is that the browser and operating system may only have access to a timer with a rather coarse resolution. Windows XP’s timer (and this is probably the case with other versions of Windows) has 64 ticks per second, so any Javascript timer in a browser on Windows will only be accurate to the _nearest_ 15.625 milliseconds. This is quite a different problem to the above – it means that not only is the timer going to be inaccurate to that degree, but if you check the time using Date(), it will only return a value accurate to the nearest such tick. In this situation, any attempt to ‘correct’ for inaccuracy by checking Date() will fail, because Date() is equally clueless about what the real, accurate time is. Given that Date() too will only return a time to the nearest 15.625 millisecond block, you can’t use it to “see how off” your timer is or to “compensate” (there’s a rough sketch of how to observe this granularity at the end of this comment).

    Now the interesting thing is that even on Windows, Chrome has accurate timers and Date(), because it switches Windows into a higher resolution timer mode. The even more interesting thing here is that this means that while Chrome is running, Firefox (but not IE) also runs with the better resolution timer! A third thing to keep in mind is that running an operating system inside a virtual machine (in case you’re developing on say, a Mac or Linux) might also affect the timer resolution within that OS.

    Ancient versions of Windows had even coarser timer resolutions, but eliminating anything pre-XP it’s safe to assume that when using Javascript on any modern browser on a modern operating system, settimeout, setinterval, and querying the time using things like Date() will use no coarser a timer than 64 ticks per second (accurate to the nearest 15.625ms tick).

    All of this means that getting smooth animation in Javascript is really, really tricky and also hard to test. One approach I’ve seen is just to set a timer for the minimum timeout (ie, 0ms) and call it as fast as you can. But this is relying on the browser’s “minimum delay” – in a future browser without such a delay you could lock up your browser endlessly. Another approach is to optimistically set a timeout value like 10ms and just hope that it’ll work.

    There are new APIs for Javascript which can set more accurate timers, including using separate worker threads (one possible solution), but these don’t work cross-browser yet. Still, they may improve the state of Javascript animation in future.
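
    As a rough way to observe that granularity for yourself (a quick illustrative sketch, not a rigorous benchmark), you can spin in a loop and record how far the reported time jumps each time it changes:

    var last = new Date().getTime(),
        jumps = [];

    // busy-wait until the reported time has changed ten times,
    // recording the size of each jump
    while(jumps.length < 10)
    {
        var now = new Date().getTime();
        if(now != last)
        {
            jumps.push(now - last);
            last = now;
        }
    }

    // on a coarse timer the jumps cluster around ~15.6ms;
    // on a fine-grained one they'll mostly be 1ms
    alert(jumps.join(', '));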

  • pushinpaul

    There seems to be latency within all digital mediums, such as audio in DAWs, where timing errors are a known issue and ticks per second or per minute can run off and cause errors. The solution there is to remove ticks every 60 or 120 seconds to allow for the errors. With the combination of various browsers and various operating systems, this will always present problems.