When you call ToByteArray() on a Guid in .NET, the ordering of the bytes in the resulting array is not what you'd expect compared to the string representation of the GUID. For example, for the following GUID represented as a string:
11223344-5566-7788-9900-aabbccddeeff
The result of ToByteArray() is this:
44, 33, 22, 11, 66, 55, 88, 77, 99, 00, AA, BB, CC, DD, EE, FF
Note that the order of the first four bytes is reversed, bytes 4 and 5 are swapped, and bytes 6 and 7 are swapped, but the final 8 bytes are in the same order they appear in the string.
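As an illustration (in Python rather than C#, since the layout itself is language-independent), the standard uuid module exposes both orderings: bytes gives the big-endian order used by the string form, while bytes_le matches the mixed-endian layout that .NET's ToByteArray() produces:

```python
import uuid

g = uuid.UUID('11223344-5566-7788-9900-aabbccddeeff')

# Big-endian byte order, matching the string representation.
print(g.bytes.hex())     # 11223344556677889900aabbccddeeff

# Mixed-endian layout: the first three fields are byte-swapped, the
# final 8 bytes are untouched. This is the same order that .NET's
# Guid.ToByteArray() returns.
print(g.bytes_le.hex())  # 44332211665588779900aabbccddeeff
```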
I understand that this is occurring; what I would like to know is why .NET handles it this way.
For reference, you can see some discussion and confusion about this (incorrectly attributed to Oracle databases) here and here.
If you read the Examples section of the Guid constructor documentation, you'll find your answer:
Guid(1, 2, 3, new byte[] { 0, 1, 2, 3, 4, 5, 6, 7 }) creates a Guid that corresponds to "00000001-0002-0003-0001-020304050607".
a is a 32-bit integer, b is a 16-bit integer, c is a 16-bit integer, and d is simply 8 bytes.
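To make that field layout concrete, here is a small Python sketch (my own illustration, not .NET code; the names a, b, c, d follow the constructor parameters) that rebuilds the canonical string from a ToByteArray()-style array by reading the first three fields as little-endian integers and copying the last 8 bytes through unchanged:

```python
import struct

def guid_string_from_net_bytes(data: bytes) -> str:
    """Format a .NET Guid.ToByteArray() result as the canonical string."""
    # a (32-bit), b (16-bit), and c (16-bit) are stored little-endian.
    a, b, c = struct.unpack_from('<IHH', data)
    d = data[8:]  # the final 8 bytes are raw bytes, kept in order
    return f'{a:08x}-{b:04x}-{c:04x}-{d[:2].hex()}-{d[2:].hex()}'

# The byte array from the question round-trips back to its string form:
print(guid_string_from_net_bytes(
    bytes.fromhex('44332211665588779900aabbccddeeff')))
# 11223344-5566-7788-9900-aabbccddeeff
```

Feeding it the bytes that Guid(1, 2, 3, new byte[] { 0, 1, 2, 3, 4, 5, 6, 7 }) would produce yields the documentation's "00000001-0002-0003-0001-020304050607" as well.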
Because a, b, and c are integer types rather than raw bytes, they are subject to endianness when deciding how to lay them out: ToByteArray() emits them in little-endian order, matching their in-memory layout on x86. The RFC for GUIDs (RFC 4122), by contrast, states that they should be presented in big-endian format, which is what the string representation follows; d is raw bytes, so its order is the same in both.
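In other words, converting the array ToByteArray() gives you into the big-endian RFC 4122 presentation order is just a matter of reversing the bytes of the first three fields. A hedged Python sketch (the function name is my own):

```python
import uuid

def net_to_rfc4122(data: bytes) -> bytes:
    """Reorder a .NET Guid.ToByteArray() result into RFC 4122 big-endian order."""
    # Reverse the 4-byte a field, then the 2-byte b and c fields;
    # the trailing 8 bytes (d) are already in presentation order.
    return data[3::-1] + data[5:3:-1] + data[7:5:-1] + data[8:]

net_order = bytes.fromhex('44332211665588779900aabbccddeeff')
print(net_to_rfc4122(net_order).hex())
# 11223344556677889900aabbccddeeff
```

This is the same transformation Python's uuid module performs between its bytes_le and bytes views of a UUID.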