Does anybody know how to make an accurate timer?

All the timer scripts I've seen use a system similar to the example below.
The example is an HTML file with a single timer that displays its count in the status bar.
Let it run for one minute and you'll notice the count reads 57 when it should be at 60... (at least on my system)

So 1000 ticks does not equal 1 second; it's actually a little too slow, or more to the point, it's just not accurate...

It seems to be a rounding error, because when I use a single 60,000 ms timeout instead, it does update the count by one at exactly the right second after one minute...
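
One way to check whether it's really rounding, or just each tick firing a little late, is to timestamp every tick and log the real gap. Here's a quick diagnostic sketch (my own addition, not part of the example below; the names are only placeholders):

Code:
var gLastTick = null;

function MeasureTick()
{
	var now = new Date().getTime();
	// The request was 1000 ms; if the real gap is a bit more,
	// that excess is what piles up over a minute.
	if (gLastTick != null)
		window.status = "real gap: " + (now - gLastTick) + " ms";
	gLastTick = now;
	window.setTimeout("MeasureTick()", 1000);
}

Anyway, here's the example I was referring to: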

Code:
<HTML>
<HEAD>
<script LANGUAGE="JavaScript">
var gTimer1 = null;
var gTimer1Count = -1;

function Timer1()
{
	gTimer1Count++;
	window.status = "Timer1 Running - " + gTimer1Count;
	// Each call schedules the next one 1000 ms after this code runs,
	// so any per-tick delay carries over into the following ticks.
	gTimer1 = window.setTimeout("Timer1()",1000);
}

function Timer1BtnStart()
{
	if (gTimer1==null)
		Timer1();
}

function Timer1BtnStop()
{
	if (gTimer1!=null)
	{
		window.clearTimeout(gTimer1);
		gTimer1 = null;
		window.status = "";
	}
}
</script>
</HEAD>
<BODY>
<form id=form1 name=form1>
<p><input type="button" value="Start Timer1" onClick="Timer1BtnStart()" id=button1 name=button1>&nbsp;
<input type="button" value="Stop Timer1" onClick="Timer1BtnStop()" id=button2 name=button2></p>
</BODY>
</HTML>
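
The kind of workaround I've been wondering about (just a sketch, not thoroughly tested, and the AccurateTimer* names are only placeholders): record the start time with new Date() and derive the displayed count from the real elapsed milliseconds on every tick, then aim the next tick at the next full-second boundary, so whatever delay setTimeout adds can't accumulate. It would be wired to start/stop buttons the same way as the example above.

Code:
var gStartTime = null;
var gFixTimer = null;

function AccurateTimer()
{
	// Derive the count from the wall clock instead of counting callbacks,
	// so a late tick can't push the displayed count behind.
	var elapsed = new Date().getTime() - gStartTime;
	window.status = "Elapsed - " + Math.floor(elapsed / 1000);
	// Schedule the next tick to land near the next full second.
	gFixTimer = window.setTimeout("AccurateTimer()", 1000 - (elapsed % 1000));
}

function AccurateTimerStart()
{
	if (gFixTimer == null)
	{
		gStartTime = new Date().getTime();
		AccurateTimer();
	}
}

function AccurateTimerStop()
{
	if (gFixTimer != null)
	{
		window.clearTimeout(gFixTimer);
		gFixTimer = null;
		window.status = "";
	}
}

Does that sound reasonable, or is there a better/standard way to keep a JavaScript timer accurate?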