1.2 + 1.1 may give the result you expect, but 0.2 + 0.1 may not.
This is a problem in virtually every language in use today. The problem is that 1/10 cannot be accurately represented as a binary fraction, just as 1/3 cannot be accurately represented as a decimal fraction.
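For example, evaluating these expressions in a JavaScript console shows the difference:

0.2 + 0.1         // 0.30000000000000004
0.2 + 0.1 === 0.3 // false
1.2 + 1.1 === 2.3 // true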
The workarounds involve rounding to only the number of decimal places that you need. You can either work with strings, which are accurate:
(0.2 + 0.1).toFixed(4) === 0.3.toFixed(4) // true
or convert the result back to a number afterwards:
+(0.2 + 0.1).toFixed(4) === 0.3 // true
or use Math.round:
Math.round(0.2 * X + 0.1 * X) / X === 0.3 // true
where X is some power of 10, e.g. 100 or 10000, depending on what precision you need.
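For example, with X = 100, i.e. two decimal places of precision:

const X = 100;
Math.round(0.2 * X + 0.1 * X) / X         // 0.3
Math.round(0.2 * X + 0.1 * X) / X === 0.3 // true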
Or you can use cents instead of dollars when counting money:
const cents = 1499; // $14.99
That way you only work with integers and you don't have to worry about decimal and binary fractions at all.
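A minimal sketch of that approach (the variable names here are just for illustration):

const priceCents = 1499;   // $14.99
const shippingCents = 501; // $5.01
const totalCents = priceCents + shippingCents; // 2000 - exact integer arithmetic
(totalCents / 100).toFixed(2)                  // "20.00" - convert to dollars only for display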
2017 Update
The situation of representing numbers in JavaScript may be a little bit more complicated than it used to be. It used to be the case that we had only one numeric type in JavaScript: the 64-bit IEEE 754 double-precision floating point number.
This is no longer the case - not only are there more numeric types in JavaScript today, more are on the way, including a proposal to add arbitrary-precision integers to ECMAScript, and hopefully arbitrary-precision decimals will follow - see this answer for details.
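As a rough sketch, arbitrary-precision integers under that proposal (the BigInt proposal) look like this, assuming an engine that already implements it:

const big = 2n ** 64n;       // 18446744073709551616n - exact, no rounding
typeof big                   // "bigint"
2 ** 64 === 2 ** 64 + 1      // true - regular numbers can't tell these apart
2n ** 64n === 2n ** 64n + 1n // false - BigInts can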
See also
Another relevant answer with some examples of how to handle the calculations: