I am trying to rewrite the following JavaScript function in Python:
exports.hash = u => {
    u += 0xe91aaa35;
    u ^= u >>> 16;
    u += u << 8;
    u ^= u >>> 4;
    let a = ( u + ( u << 2 ) ) >>> 19;
    return a ^ hashAdjust[ ( u >>> 8 ) & 0x1ff ];
};
Initially I came up with the following:
def hash(u):
    u += 0xe91aaa35
    u ^= u >> 16
    u += u << 8
    u ^= u >> 4
    a = (u + (u << 2)) >> 19
    return a ^ hashAdjust[(u >> 8) & 0x1ff]
However, I noticed that the two functions return different results for large integer inputs. I did some debugging and realised it's because JavaScript and Python handle integers differently. While Python has unlimited-precision integers, JavaScript's bitwise operators coerce their operands to 32-bit signed integers (range -2^31 to 2^31 - 1; the zero-fill shift `>>>` produces an unsigned 32-bit result instead). If an operation produces a value outside that range, the high bits are simply discarded. I will be honest and admit I don't understand this completely, so I would appreciate it if someone could explain it a bit more for me.
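To show concretely where I think the two languages diverge (assuming my understanding of the coercion is right), here is a small experiment with a value wider than 32 bits. Python shifts the full value, while JavaScript's `u >>> 16` would first truncate `u` to its low 32 bits:

```python
u = 0x123456789  # a 33-bit value, no problem for Python

print(u >> 16)                  # Python uses all 33 bits: 74565 (0x12345)
print((u & 0xFFFFFFFF) >> 16)   # what JS's u >>> 16 would see: 9029 (0x2345)
```

So each bitwise step in the JavaScript version silently drops everything above bit 31, which my Python version never does.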
My main question is: how can I write a Python function for each of the bitwise operations used above (XOR, left shift, and unsigned right shift) so that the results match JavaScript's? I already found one for XOR that works perfectly, and I kind of understand it (again, an explanation would be appreciated).
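Based on my reading so far, I was imagining something along the following lines, where every operation masks its result down to 32 bits; the names (`u32`, `rshift`, etc.) are just my own placeholders, and I would like confirmation that this masking approach is actually correct:

```python
MASK32 = 0xFFFFFFFF  # low 32 bits

def u32(n):
    """Truncate n to its low 32 bits, as JS bitwise ops do to their operands."""
    return n & MASK32

def rshift(n, bits):
    """Emulate JavaScript's zero-fill right shift (>>>)."""
    return u32(n) >> bits

def lshift(n, bits):
    """Emulate JavaScript's left shift (<<), discarding bits above bit 31."""
    return u32(u32(n) << bits)

def xor(a, b):
    """Emulate JavaScript's ^ on 32-bit operands."""
    return u32(a) ^ u32(b)
```

One thing I am unsure about: these helpers always return the *unsigned* interpretation of the 32-bit pattern (e.g. `lshift(1, 31)` gives `0x80000000` rather than JS's `-2147483648`), though I believe the bit patterns themselves match.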