JavaScript stores all numbers as double-precision binary IEEE 754 floating-point values. Each number is stored internally in 64 bits: 1 bit for the sign, 52 for the mantissa, and 11 for the exponent. With only a limited number of bits available, it can represent only a limited set of distinct values, and integers are just a small subset of that.
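To make the layout concrete, here is a small sketch (the helper name `toBits` is just for this example, not a library API) that dumps the raw 64-bit encoding of a number using a `DataView`:

```js
// Dump the raw IEEE 754 encoding of a number as sign | exponent | mantissa.
function toBits(x) {
  const view = new DataView(new ArrayBuffer(8));
  view.setFloat64(0, x); // big-endian byte order by default
  let bits = "";
  for (let i = 0; i < 8; i++) {
    bits += view.getUint8(i).toString(2).padStart(8, "0");
  }
  // 1 sign bit, 11 exponent bits, 52 mantissa bits
  return `${bits[0]} ${bits.slice(1, 12)} ${bits.slice(12)}`;
}

console.log(toBits(1));  // 0 01111111111 0000...0000
console.log(toBits(-2)); // 1 10000000000 0000...0000
```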
The largest integer you can safely represent (without losing precision) is Number.MAX_SAFE_INTEGER, whose value is 9007199254740991 (2^53 − 1). You are trying to represent 36028797018963973, which is larger than that, so precision is lost. That is expected.
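You can observe the rounding directly in Node.js or a browser console:

```js
console.log(Number.MAX_SAFE_INTEGER);                 // 9007199254740991
// The literal below is rounded to the nearest representable double;
// at this magnitude, adjacent doubles are 8 apart.
console.log(36028797018963973);                       // 36028797018963976
console.log(Number.isSafeInteger(36028797018963973)); // false
```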
To solve the problem, you can use BigInt (simply append n to the end of your number literal).
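For example:

```js
// A BigInt literal (note the trailing n) keeps full integer precision.
const big = 36028797018963973n;
console.log(big + 1n); // 36028797018963974n

// When the value arrives as a string, convert with BigInt() —
// converting an already-rounded Number would keep the rounding error.
console.log(BigInt("36028797018963973")); // 36028797018963973n
```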
As @Hakerh400 explained, this is a known limitation of the number type in JavaScript. Representing your number in binary requires 56 bits, but the JavaScript number type can represent integers of at most 53 bits exactly (52 stored mantissa bits plus an implicit leading bit). Sadly, the only true integer type provided by JavaScript is BigInt.
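A quick check with BigInt (which is exact) confirms the bit width:

```js
console.log(36028797018963973n.toString(2).length); // 56 bits needed
console.log(2n ** 53n - 1n === BigInt(Number.MAX_SAFE_INTEGER)); // true
```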
This is a common problem and appears to have been answered sufficiently.
The issue is as simple as that.
How is that?