I am trying to understand how bitwise operations in JavaScript work, more specifically how the 32-bit number resulting from a bitwise operation is converted back to a 64-bit JavaScript number. I am getting some strange results when setting the leftmost bit of a 32-bit number and when the operation overflows.
For example, with the following operation:

0x01 << 31

the result would normally be 0x80000000 if the number were 32 bits long. But when JavaScript converts this number back to a 64-bit value, it pads the upper 32 bits with 1s, resulting in the value 0xFFFFFFFF80000000.
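A small snippet reproduces this; the BigInt conversion at the end is just my way of inspecting the 64-bit two's-complement bit pattern (BigInt.asUintN reinterprets the signed value as unsigned):

```javascript
const x = 0x01 << 31;        // bitwise operators work on 32-bit signed integers
console.log(x);              // -2147483648, not 2147483648
console.log(x.toString(16)); // "-80000000"

// View the same value as a 64-bit two's-complement bit pattern.
const bits64 = BigInt.asUintN(64, BigInt(x));
console.log(bits64.toString(16)); // "ffffffff80000000"
```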
Similarly, when left-shifting by 32 bits, thus overflowing a 32-bit integer, with the operation:

0x02 << 32

I would expect the number to overflow and the result to be 0x00. But the resulting JavaScript number is 0x02.
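Again, this is easy to reproduce; the last line is my guess at what is happening, namely that the shift count is masked to its low 5 bits as in C-family languages:

```javascript
console.log(0x02 << 32);        // 2 -- not 0, as I expected
console.log(0x02 << 33);        // 4 -- behaves like a shift by 1
console.log(0x02 << (32 & 31)); // 2 -- consistent with the count being taken modulo 32
```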
Are there any specific rules that JavaScript uses for bitwise operations that I am not aware of? I understand that all bitwise operations are performed on 32-bit integers, and that JavaScript numbers are 64-bit double-precision floating-point numbers, but I cannot understand where the extra padding comes from when converting between the two.
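For reference, this is my current mental model of the 64-bit-to-32-bit direction, demonstrated with the | 0 trick (my understanding is that | 0 forces the operand through the same 32-bit conversion, though that is an assumption on my part):

```javascript
// Numbers are 64-bit doubles, but bitwise operands are truncated to 32 bits:
console.log((2 ** 32 + 5) | 0); // 5 -- the value is taken modulo 2^32
console.log((2 ** 31) | 0);     // -2147483648 -- the top bit becomes the sign bit
console.log(2 ** 31);           // 2147483648 -- the plain double is unchanged
```

It is the opposite direction, from the 32-bit result back to a 64-bit number, that I cannot work out.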