Does anybody know how to make an accurate timer?
All the timer scripts I've seen use a system similar to the example below.
The example is an HTML file with a single timer that displays its count in the status bar.
Let it run for one minute and you'll notice the count reads 57 when it should read 60 (at least on my system).
So 1000 ticks does not equal 1 second; it runs a little slow, or more precisely, it's just not accurate.
It seems the error accumulates per tick, because with a single 60 000-tick timeout the count does update by one at exactly the right second after one minute.
var gTimer1 = null;
var gTimer1Count = -1;
function Timer1() {
  gTimer1Count++;
  window.status = "Timer1 Running - " + gTimer1Count;
  gTimer1 = window.setTimeout("Timer1()", 1000);
}
function Timer1BtnStart() { Timer1(); }
function Timer1BtnStop() {
  window.clearTimeout(gTimer1);
  gTimer1 = null;
  window.status = "";
}
<form id="form1" name="form1">
<p><input type="button" value="Start Timer1" onClick="Timer1BtnStart()" id="button1" name="button1">
<input type="button" value="Stop Timer1" onClick="Timer1BtnStop()" id="button2" name="button2"></p>
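The drift happens because each `setTimeout` callback fires a little later than requested, and counting ticks adds those delays up. A common fix is to derive the count from the system clock instead of from the number of ticks. Below is a minimal sketch of that idea; the names `nextTickDelay` and `startAccurateTimer` are my own, not part of any library, and the callback stands in for the `window.status` update in the example above.

```javascript
// Delay until the next whole-second boundary, measured from the start time.
// Re-computing this every tick means late callbacks never accumulate error.
function nextTickDelay(startMs, nowMs) {
  return 1000 - ((nowMs - startMs) % 1000);
}

function startAccurateTimer(onTick) {
  var start = Date.now();
  var handle = { id: null };
  function tick() {
    // Derive the count from the clock, not from how many times we've fired.
    onTick(Math.round((Date.now() - start) / 1000));
    handle.id = setTimeout(tick, nextTickDelay(start, Date.now()));
  }
  handle.id = setTimeout(tick, nextTickDelay(start, Date.now()));
  return handle;
}

// In the page this could be wired to the same status-bar display:
// var t = startAccurateTimer(function (n) { window.status = "Timer1 Running - " + n; });
// clearTimeout(t.id); // stops it
```

Even if an individual callback fires 20 ms late, the next delay is shortened to compensate, so after one minute the count reads 60 rather than 57.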