I apologise if I have a simplistic view of this but hear me out.
If I take a variable with the value 1 and divide it by 3, JavaScript gives me the result 0.3333333333333333.
If I then multiply that value by 3, it gives me 1 again. Fair enough: if something is divided by 3 and then multiplied by 3, you should get the same value back.
However, why does this not instead give me 0.9999999999999999? (Ignoring any potential floating-point errors.)
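If it helps to see it concretely, here's a minimal sketch (plain JavaScript, runnable in a browser console or Node) that just reproduces the behaviour described above and prints a few extra digits of the stored value:

```js
const third = 1 / 3;

console.log(third);             // 0.3333333333333333
console.log(third * 3);         // 1
console.log(third * 3 === 1);   // true

// The value actually stored is only the closest representable double to 1/3;
// asking for more digits makes the difference visible.
console.log(third.toFixed(20)); // "0.33333333333333331483"
```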
Can’t we have a ‘recurrence’ character to say this is a recurring value rather than exactly the value shown?
I.e. have something like 0.333333· instead of 0.333333.
I just feel like this would be a more explicit way of saying it’s not the exact value shown.
Actually, the repeating decimal 0.999999… is exactly equal to 1, and this can be proven in a number of ways mathematically. So maybe that is the basis for why JavaScript behaves the way it does.
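In case it's useful, one common algebraic argument for that equality (just the standard textbook sketch) goes like this:

$$
\begin{aligned}
x &= 0.999\ldots \\
10x &= 9.999\ldots \\
10x - x &= 9 \\
9x &= 9 \quad\Rightarrow\quad x = 1
\end{aligned}
$$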
My only personal knowledge of maths in this area was to put a sort of ellipsis/dot above the final digit to show it was a recurring number, as opposed to being exactly that number. (However, I realise this could have been a preference of the teacher at the time and not actual mathematical notation.)
So it’s actually correct and there’s no need for an explicit way to show it’s recurring?
I don’t think so. For most practical purposes in real life, science, and programming, repeating decimals really aren’t useful. Numbers are rounded off to the appropriate number of significant digits in calculations.
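For example, in JavaScript you'd typically do that kind of rounding for display with Number.prototype.toPrecision; this is just an illustrative sketch, not anything specific to the original question:

```js
const third = 1 / 3;

// Round to 4 significant digits for display; the underlying double is unchanged.
console.log(third.toPrecision(4));         // "0.3333"
console.log((third * 3).toPrecision(4));   // "1.000"

// Convert back to a number if you want to keep calculating with the rounded value.
console.log(Number(third.toPrecision(4))); // 0.3333
```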