The JavaScript Number Type

Like most things in JavaScript, the Number type is a mystery wrapped in an enigma. Just kidding. But it is true that some things are not what they appear to be.

I had an interview once at a stinky bank (what was I thinking?). One of the technical questions was something along the lines of "does 2.031 === (1.001 + 1.03) evaluate to true or false?" Of course the answer is false: 1.001 + 1.03 actually evaluates to 2.0309999999999997. So don't trust JavaScript with numbers, or bankers with cold sweaty hands. To be fair, this is an issue in every language that uses binary floating point, not just JavaScript.
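To make that concrete, here is a quick sketch of the comparison along with one pragmatic workaround. The nearlyEqual helper and its tolerance are made up for illustration; for real money you would usually work in integer cents instead.

    // Floating-point addition doesn't always give the exact decimal result.
    console.log(1.001 + 1.03);             // 2.0309999999999997
    console.log(2.031 === (1.001 + 1.03)); // false

    // One workaround: compare within a small tolerance instead of using ===.
    function nearlyEqual(a, b, tolerance) {
      if (tolerance === undefined) tolerance = 1e-9; // arbitrary, tune for your use case
      return Math.abs(a - b) < tolerance;
    }
    console.log(nearlyEqual(2.031, 1.001 + 1.03)); // true

    // For money, the usual advice is to work in the smallest unit (integer cents).
    console.log(1001 + 1030 === 2031); // true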

Unlike other languages like Java and C++, JavaScript has only one type to represent numbers. There are no floats, doubles or longs - just Number. Number (and string) values are actually primitives, but JavaScript will transparently wrap them in Number and String objects when you call a method on them. You could also create such a wrapper yourself by calling the constructor: var num = new Number(5). I wouldn't recommend this though; let JavaScript do the hard work for you.
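Here is a small sketch of the difference between the primitive and the wrapper object; it should behave the same in any modern engine.

    // A number primitive, created from a literal (the normal way).
    var primitive = 5;
    console.log(typeof primitive);      // "number"

    // A Number wrapper object, created with the constructor (not recommended).
    var wrapped = new Number(5);
    console.log(typeof wrapped);        // "object"

    // Loose equality unwraps the object, strict equality does not.
    console.log(primitive == wrapped);  // true
    console.log(primitive === wrapped); // false

    // Methods work on primitives because JavaScript wraps them in a
    // temporary Number object behind the scenes.
    console.log(primitive.toFixed(2));  // "5.00"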

JavaScript uses the IEEE 754 double precision 64-bit format. Integers are only represented exactly between -(2^53 - 1) and 2^53 - 1, that is ±9007199254740991 (Number.MAX_SAFE_INTEGER); go past that and precision is silently lost, not turned into Infinity. Infinity only shows up once a value exceeds the largest representable double, Number.MAX_VALUE, which is roughly 1.8e308.
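You can see both limits in the console. Number.MAX_SAFE_INTEGER and Number.MAX_VALUE are standard constants; the rest is just illustration.

    // Integers are only exact up to 2^53 - 1.
    console.log(Number.MAX_SAFE_INTEGER);               // 9007199254740991
    console.log(9007199254740992 === 9007199254740993); // true - precision lost, not Infinity

    // Infinity only appears past the largest representable double.
    console.log(Number.MAX_VALUE);     // 1.7976931348623157e+308
    console.log(Number.MAX_VALUE * 2); // Infinity
    console.log(1e309);                // Infinity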

Those 64 bits break down into three fields:

Sign: 1 bit, 1 or 0
Exponent: 11 bits to store the exponent, as a power of two
Mantissa: 52 bits to store the fraction (the significant digits of the number)
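If you're curious, you can peek at those bit fields yourself. The sketch below uses a DataView; inspectDouble is just a made-up helper name for this illustration.

    // Decode the sign, exponent, and mantissa bits of a 64-bit double.
    function inspectDouble(value) {
      var buffer = new ArrayBuffer(8);
      var view = new DataView(buffer);
      view.setFloat64(0, value); // big-endian by default

      var high = view.getUint32(0); // sign + exponent + top 20 mantissa bits
      var low = view.getUint32(4);  // bottom 32 mantissa bits

      var sign = high >>> 31;               // 1 bit
      var exponent = (high >>> 20) & 0x7ff; // 11 bits, stored with a bias of 1023
      var mantissa = (high & 0xfffff).toString(2).padStart(20, "0") +
                     low.toString(2).padStart(32, "0"); // 52 bits

      console.log("sign:    ", sign);
      console.log("exponent:", exponent - 1023);
      console.log("mantissa:", mantissa);
    }

    inspectDouble(2.031); // sign 0, exponent 1, plus the 52 mantissa bits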

Resources