
Hi:
I have built a simple page for a client that calculates a total from a set of items the user enters on the page. The total is calculated with the '+' operator. Most of the time the results are correct; for instance, 1.50 + 2.50 gives 4. However, 1.19 + 6.45 should give 7.64, but the result of the JScript addition is 7.640000000000001. This is causing a problem on the page, because the client wants the total displayed in dollars and cents.

In some cases the addition produces a number such as 6.68999999999999995, and since JScript's 'round' function rounds to the nearest integer, I lose precision by calling 'round' on the total (the number above would be rounded to 7, which is wrong). Simple addition on a calculator gives 6.69.

I've checked the documentation for Microsoft JScript and made sure the numbers I'm adding are indeed being treated as numbers, not strings. Why is the precision different for some addition operations? Am I missing something? Any suggestions would be much appreciated!
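For anyone else hitting this: the behavior comes from IEEE 754 binary floating point, which JScript (like all JavaScript engines) uses for every number. Most decimal fractions, including 1.19 and 6.45, have no exact binary representation, so the engine stores the nearest representable double and the tiny errors surface in the sum. A minimal sketch:

```javascript
// IEEE 754 doubles store values in binary, so decimal fractions
// like 1.19 and 6.45 are only approximated. The approximation
// error becomes visible in the result of the addition.
var sum = 1.19 + 6.45;
console.log(sum);           // 7.640000000000001, not 7.64
console.log(sum === 7.64);  // false

// toFixed() with many digits exposes the stored approximation
// of each operand (exact digits vary only in how far you print):
console.log((1.19).toFixed(20));
console.log((6.45).toFixed(20));
```

This is not specific to JScript; the same results appear in any language using 64-bit binary floats.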

Here is an old programming trick for cutting numbers off at a certain number of decimal places: to keep 2 places, multiply and divide by a 1 followed by 2 zeros, i.e. 100.
num = Math.round(num * 100) / 100;
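Wrapped up as a reusable helper (the names roundTo and the final toFixed() formatting step are my own additions, not part of the trick above), it might look like this:

```javascript
// Round num to the given number of decimal places using the
// multiply-round-divide trick. roundTo is a hypothetical helper
// name, not a built-in.
function roundTo(num, places) {
  var factor = Math.pow(10, places); // 100 for 2 decimal places
  return Math.round(num * factor) / factor;
}

var total = 1.19 + 6.45;           // 7.640000000000001
var rounded = roundTo(total, 2);   // 7.64
// toFixed(2) pads to exactly two decimals for dollar display:
console.log("$" + rounded.toFixed(2)); // "$7.64"
```

For display purposes, toFixed(2) alone is often enough, since it rounds while converting to a string.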

Thanks LuZeR. I got it to round correctly using that trick, but my frustration still lies with trying to understand why 1.19 plus 6.45 is 7.640000000000001 instead of 7.64 using the JScript '+' operator. Oh well. Ours is not to question why, ours is but to do or die.
Thanks again!