Why does console.log(65 == 'A') output false in JavaScript?

console.log(65 == 'A'); // => false

In the ASCII character set, the value of 'A' is 65. When I compare 65 and 'A' in C, the expression evaluates to 1, meaning "true." But in JavaScript the result is different, which I suspect is because JavaScript uses the Unicode character set, where every character is identified by a numeric code point (usually written in hexadecimal). Please guide me.
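
For reference, here is a minimal snippet of what I observe in the browser console; the charCodeAt line is my attempt to compare the character's numeric code explicitly:

console.log(65 == 'A');                 // => false
console.log(Number('A'));               // => NaN (is this what 'A' gets converted to?)
console.log(65 === 'A'.charCodeAt(0));  // => true when I compare the code unit directly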