I'm trying to display a number as a percent by using _.round
and then by multiplying the number by 100. For some reason, when I multiply the rounded number, the precision gets messed up. Here's what it looks like:
var num = 0.056789,
    roundingPrecision = 4,
    roundedNum = _.round(num, roundingPrecision),
    percent = (roundedNum * 100) + '%';
console.log(roundedNum); // 0.0568
console.log(percent); // 5.680000000000001%
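The same thing happens without lodash, by the way; here's a minimal reproduction in plain JavaScript, starting from the already-rounded value:
console.log(0.0568 * 100);          // 5.680000000000001
console.log(0.0568 * 100 === 5.68); // false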
Why is the 0.000000000000001 added to the number after multiplying by 100?
This is because numbers are represented internally as binary floating-point values with limited precision.
See also "Is floating point math broken?"
Is floating point math broken?
0.1 + 0.2 == 0.3 -> false
0.1 + 0.2 -> 0.30000000000000004
Any ideas why this happens?
Which got the answer:
Binary floating point math is like this. In most programming languages, it is based on the IEEE 754 standard. JavaScript uses 64-bit floating point representation, which is the same as Java's double. The crux of the problem is that numbers are represented in this format as a whole number times a power of two; rational numbers (such as 0.1, which is 1/10) whose denominator is not a power of two cannot be exactly represented.
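To make that concrete, printing a couple of stored values with extra digits exposes the binary approximations directly (a small sketch; the trailing digits are simply the nearest representable doubles):
console.log((0.1).toFixed(20)); // 0.10000000000000000555
console.log((0.2).toFixed(20)); // 0.20000000000000001110
console.log(0.1 + 0.2);         // 0.30000000000000004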
To get the correct outcome in your case, you need to round after all the arithmetic:
var num = 0.056789,
    roundingPrecision = 4,
    roundedNum = _.round(num * 100, roundingPrecision),
    percent = roundedNum + '%';
console.log(percent); // 5.6789%
<script src="https://cdnjs.cloudflare.com/ajax/libs/lodash.js/3.10.1/lodash.min.js"></script>
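If you'd rather not pull in lodash for this, the same "round last" approach works with plain Math.round (a sketch using the usual scale-by-10^precision trick, nothing lodash-specific):
var num = 0.056789,
    roundingPrecision = 4,
    factor = Math.pow(10, roundingPrecision),
    percent = (Math.round(num * 100 * factor) / factor) + '%';
console.log(percent); // 5.6789%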