Let's say I have this:

char registered = '®';

or an umlaut, or whatever Unicode character. How could I get its code?
Just convert it to int:
char registered = '®';
int code = (int) registered;
In fact there's an implicit conversion from char to int, so you don't have to specify it explicitly as I've done above, but I would do so in this case to make it obvious what you're trying to do.
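For example, this compiles without the cast (a small sketch; the printed value is just the decimal code):

char registered = '®';
int code = registered;        // implicit widening conversion from char to int
System.out.println(code);     // 174, i.e. U+00AE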
This will give the UTF-16 code unit, which is the same as the Unicode code point for any character defined in the Basic Multilingual Plane. (And only BMP characters can be represented as char values in Java.) As Andrzej Doyle's answer says, if you want the Unicode code point from an arbitrary string, use Character.codePointAt().
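For example, here's a quick sketch with a made-up string containing a supplementary character (stored as a surrogate pair), so you can see the difference between a char value and a code point:

String s = "a\uD83D\uDE00";                  // 'a' followed by U+1F600, which needs two char values
int a = Character.codePointAt(s, 0);         // 97 (U+0061)
int emoji = Character.codePointAt(s, 1);     // 128512 (U+1F600), combined from the surrogate pair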
Once you've got the UTF-16 code unit or Unicode code point, both of which are integers, it's up to you what you do with them. If you want a string representation, you need to decide exactly what kind of representation you want. (For example, if you know the value will always be in the BMP, you might want a fixed 4-digit hex representation prefixed with U+, e.g. "U+0020" for space.) That's beyond the scope of this question though, as we don't know what the requirements are.
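If that did turn out to be the requirement, one possible sketch (the format string here is just an assumption about the desired output) would be:

char space = ' ';
String display = String.format("U+%04X", (int) space);   // "U+0020"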