On Windows, there's some basic emoji support in the console, so I can get a monochrome glyph if I type, e.g., ☕ or 📜. I can output a string from PowerShell, a C# console application, or Python, and they all show those characters just fine.
However, from Node.js I can only get a couple of emoji to display (e.g. ☕), but not others (instead of 📜 I see �). Yet if I throw a string with those characters, they display correctly.
console.log(' 📜 ☕ ');
throw ' 📜 ☕ ';
If I run the above script, the output is:
 � ☕ 
C:\Code\emojitest\emojitest.js:2
throw ' 📜 ☕ ';
^
 📜 ☕ 
Is there any way that I can output those emojis correctly without throwing an error? Or is that exception happening outside of what's available to me through the standard Node.js APIs?
What you want may not be possible without a change to libuv. When you (or the console) write to stdout or stderr on Windows and the stream is a TTY, libuv does its own conversion from UTF‑8 to UTF‑16. In doing so, it explicitly refuses to output surrogate pairs, emitting instead the replacement character U+FFFD (�) for any codepoint beyond the BMP.
Here’s the culprit in uv/src/win/tty.c:
/* We wouldn't mind emitting utf-16 surrogate pairs. Too bad, the */
/* windows console doesn't really support UTF-16, so just emit the */
/* replacement character. */
if (utf8_codepoint > 0xffff) {
  utf8_codepoint = UNICODE_REPLACEMENT_CHARACTER;
}
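That explains why one of your two emoji comes through: ☕ is U+2615, which sits inside the BMP, while 📜 is U+1F4DC, which is above 0xffff and so gets replaced. If it helps, here's a small illustrative Node snippet (not part of any fix, just a way to see which side of that check a character lands on):
// Illustrative only: show which characters exceed the BMP (codepoint > 0xffff)
// and would therefore be replaced by libuv's TTY writer on Windows.
for (const ch of ['☕', '📜']) {
  const code = ch.codePointAt(0);
  console.log(
    'U+' + code.toString(16).toUpperCase(),
    code > 0xffff ? '-> beyond BMP, printed as U+FFFD' : '-> in the BMP, printed as-is'
  );
}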
The throw message appears correctly because Node lets Windows do the conversion from UTF‑8 to UTF‑16 with MultiByteToWideChar() (which does emit surrogate pairs) before writing the message to the console. (See PrintErrorString() in src/node.cc.)
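One more aside, and this part is my own extrapolation from the TTY condition above: the lossy conversion only happens when the stream is a TTY, so if you redirect or pipe the output (e.g. node emojitest.js > out.txt), the raw UTF-8 bytes should be passed through untouched and 📜 should survive in the file. A rough sketch for checking which case you're in:
// Rough sketch: libuv's replacement only applies when stdout is a TTY;
// redirected output (file or pipe) should receive the original UTF-8 bytes.
if (process.stdout.isTTY) {
  console.log('stdout is a TTY: 📜 will likely show up as \uFFFD here');
} else {
  console.log('stdout is redirected: 📜 is written through as UTF-8');
}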
Note: A pull request has been submitted to resolve this issue.