Should a buffer of bytes be signed char or unsigned char or simply a char buffer? Any differences between C and C++?
Thanks.
If you intend to store arbitrary binary data, you should use unsigned char. It is the only data type that the C Standard guarantees to have no padding bits. Every other data type may contain padding bits in its object representation (the representation that contains all bits of an object, rather than only those that determine its value). The state of the padding bits is unspecified, and they are not used to store values. So if you read binary data through char, values would be cut down to the value range of char (only the value bits are interpreted), but there may still be bits that are ignored yet present, and still copied by memcpy, much like padding bits in real struct objects. Type unsigned char is guaranteed not to contain those. That follows from 5.2.4.2.1/2 (C99 TC2, n1124 here):
    If the value of an object of type char is treated as a signed integer when used in an expression, the value of CHAR_MIN shall be the same as that of SCHAR_MIN and the value of CHAR_MAX shall be the same as that of SCHAR_MAX. Otherwise, the value of CHAR_MIN shall be 0 and the value of CHAR_MAX shall be the same as that of UCHAR_MAX. The value UCHAR_MAX shall equal 2^CHAR_BIT - 1.
From the last sentence it follows that there is no space left for any padding bits.

If you use char as the type of your buffer, you also have a conversion problem: a byte value may fit in 8 bits, so you may expect assigning it to be OK, yet lie outside the range of char, which is CHAR_MIN..CHAR_MAX. Converting such a value to char yields an implementation-defined result or raises an implementation-defined signal. Even if these problems would probably not show up in real implementations (that would be a very poor quality of implementation), you are best to use the right type from the beginning onwards, which is unsigned char.
For strings, however, the data type of choice is char, which is what the string and print functions understand. Using signed char for these purposes looks like a wrong decision to me.
For further information, read this proposal, which contains a fix for the next version of the C Standard that will eventually require signed char not to have any padding bits either. It has already been incorporated into the working paper.