Difference between codePointAt and charCodeAt

From the MDN page on charCodeAt:

The charCodeAt() method returns an integer between 0 and 65535 representing the UTF-16 code unit at the given index.

The UTF-16 code unit matches the Unicode code point for code points which can be represented in a single UTF-16 code unit. If the Unicode code point cannot be represented in a single UTF-16 code unit (because its value is greater than 0xFFFF) then the code unit returned will be the first part of a surrogate pair for the code point. If you want the entire code point value, use codePointAt().

TL;DR:

  • charCodeAt() returns a single UTF-16 code unit (0–65535).
  • codePointAt() returns the full Unicode code point, even above 0xFFFF.