Why don't we use a symbol for recurring numbers in code?

I apologise if I have a simplistic view of this but hear me out.

If I take a variable with the value 1 and divide it by 3, JavaScript gives me the result 0.3333333333333333.
If I then multiply that value by 3, it gives me 1 again. Fair enough: if something is divided by 3 and then multiplied by 3, you should get the same value back.
However, why does this not instead give me 0.9999999999999999? (Ignoring any potential floating-point errors.)
Can’t we have a ‘recurrence’ character to say this is a recurring value instead of what it looks like?
I.e. have something like 0.333333· instead of 0.333333.

I just feel like this would be a more explicit way of saying it’s not the exact value shown.

I’ve tried to demonstrate what I’m trying to get at here http://plnkr.co/edit/JmbXN4lIZvquERHWv447?p=preview

Actually, the repeating decimal 0.999999… is exactly equal to 1, and this can be proven a number of ways mathematically. So maybe that’s the basis for why JavaScript behaves the way it does.
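For reference, one standard algebraic argument that the repeating decimal equals 1:

```latex
\begin{align*}
x &= 0.999\ldots \\
10x &= 9.999\ldots \\
10x - x &= 9.999\ldots - 0.999\ldots = 9 \\
9x &= 9 \implies x = 1
\end{align*}
```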

My only personal knowledge of maths in this area was to put a sort of ellipsis above the final digit to show it was a recurring number, as opposed to being specifically that number. (However, I realise this could have been a preference of the teacher at the time and not actually a mathematical notation.)

So it’s actually correct and there’s no need for an explicit way to show it’s recurring?

I don’t think so. For most practical purposes in real life, science, and programming, repeating decimals really aren’t useful. All numbers are rounded off to the proper number of significant digits in calculations.

Many years ago when I was in school we used a vinculum for rational repeating decimals.

But work was done on paper and the use of pocket calculators was forbidden (my, how things have changed!).

Because computers work in base 2 while most human arithmetic is done in base 10, there are precision issues.
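The classic demonstration of that base-2/base-10 mismatch in JavaScript:

```javascript
// 0.1 and 0.2 are both repeating fractions in base 2
// (just as 1/3 is repeating in base 10), so neither is stored exactly.
console.log(0.1 + 0.2);           // 0.30000000000000004
console.log(0.1 + 0.2 === 0.3);   // false

// 0.25, by contrast, is exact in base 2 (binary 0.01).
console.log(0.25 + 0.25 === 0.5); // true
```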

PHP has “precision math”, as I imagine many other languages do, which can be useful when better precision is needed, but it still has a limit.
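JavaScript doesn’t ship an arbitrary-precision decimal type, but as a sketch of the same idea, exact fractions can be modelled with `BigInt` numerators and denominators (the `fraction`/`mul` helpers here are made up for illustration, not built-ins):

```javascript
// Minimal sketch of exact rational arithmetic using BigInt.
function gcd(a, b) {
  return b === 0n ? (a < 0n ? -a : a) : gcd(b, a % b);
}

// Build a fraction reduced to lowest terms.
function fraction(num, den) {
  const g = gcd(num, den);
  return { num: num / g, den: den / g };
}

// Multiply two fractions exactly — no rounding ever happens.
function mul(f, g) {
  return fraction(f.num * g.num, f.den * g.den);
}

// (1/3) * 3 is exactly 1/1.
const result = mul(fraction(1n, 3n), fraction(3n, 1n));
console.log(`${result.num}/${result.den}`); // 1/1
```

Because the value is never forced into a fixed number of binary digits, there is nothing to round and so nothing recurring to display.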

But how often is such precision really absolutely necessary?

As WebMachine posted, repeating decimals aren’t all that useful.

But could something be used to indicate that a number had been rounded?


π ≈ 3.14159265

1 / 3 = 0.3̅3̅3̅3̅3̅
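A rough sketch of how such a rounding marker could be produced in JavaScript today (the helper name is hypothetical, not an existing API):

```javascript
// Hypothetical helper: format a number to a fixed number of decimal
// places and append "…" when the printed form had to be rounded
// (i.e. the string does not round-trip back to the original value).
function formatWithRoundingMark(x, digits) {
  const rounded = x.toFixed(digits);
  return Number(rounded) === x ? rounded : rounded + "…";
}

console.log(formatWithRoundingMark(1 / 3, 6)); // "0.333333…"
console.log(formatWithRoundingMark(0.5, 6));   // "0.500000"
```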

I like the idea, but AFAIK it doesn’t exist. Maybe because there has not been enough of a demand for it?

