I'm trying to convert BCD to ASCII and vice versa, and I saw a solution similar to this while browsing, but I don't fully understand it. Could someone explain it?
void BCD_Ascii(unsigned char src, char *dest) {
    const char *outputs = "0123456789"; /* lookup table: digit value -> ASCII digit */
    *dest++ = outputs[src>>4];          /* upper nibble */
    *dest++ = outputs[src&0xf];         /* lower nibble */
    *dest = '\0';                       /* terminate the string */
}
The function converts a byte holding a packed binary-coded-decimal value (two decimal digits, one per 4-bit nibble) into a two-character string.
First the upper 4 bits of src are obtained:
src>>4
The function assumes the value those bits represent is in the range 0-9. That value is then used as an index into the string literal outputs:
outputs[src>>4];
The resulting character is written to the address pointed to by dest, and the pointer is then incremented (post-increment, so the write uses the old address):
*dest++ = outputs[src>>4];
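For instance, assuming an input byte of 0x42 (an example value, not from the original code), which is packed BCD for decimal 42:

src == 0x42    /* binary 0100 0010 */
src>>4 == 0x4  /* so outputs[4], the character '4', is written first */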
Then the lower 4 bits of src are used:
src&0xf
Again, those bits are assumed to represent a value in the range 0-9, and the rest works the same as before:
*dest++ = outputs[src&0xf];
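Continuing the 0x42 example:

src&0xf == 0x2 /* so outputs[2], the character '2', is written next */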
Finally, a '\0' (the NUL terminator) is written into dest to terminate the string.
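Putting it together, a minimal usage sketch (the buffer name and the test value 0x42 are my own illustration):

char buf[3];          /* two digit characters plus the terminator */
BCD_Ascii(0x42, buf); /* buf now holds the string "42" */

For the "vice versa" direction mentioned in the question, here is a minimal sketch going from a two-character ASCII string back to packed BCD; the function name Ascii_BCD and the assumption that the input holds exactly two ASCII digits are mine:

unsigned char Ascii_BCD(const char *src) {
    /* '4' - '0' == 4, '2' - '0' == 2: recover each digit's value,
       then pack the first digit into the upper nibble */
    return (unsigned char)(((src[0] - '0') << 4) | (src[1] - '0'));
}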