How can I convert a binary-coded decimal (BCD) number into a decimal number in terms of its representation? I don't want to convert the value but rather the representation; here is what I mean.
I want to convert 0x11 to decimal 11 (not 17) and 0x20 to 20 (not 32).
unsigned char day = 0x11;
unsigned char month = 0x12;
int dayDecimal, monthDecimal;
I want dayDecimal to be 11 and monthDecimal to be 12. I will be working with values in the range 0x00 to 0x60, so it should be possible. There won't be any 'A', 'B', 'C', 'D', 'E', or 'F' digits.
Update:
I am actually reading the time from an RTCC chip as part of an embedded project I am working on. The hours, minutes, day, and month are returned in that form. For example, if the minutes are 0x40 then that means 40 minutes, not 64, so I need to be able to keep the interpretation correct. I need to somehow convert 0x40 into 40, not 64. I hope that's possible.
Thanks!
You need to work with the two nybbles, multiplying the more significant nybble by ten and adding the less significant:
uint8_t hex = 0x11;
assert(((hex & 0xF0) >> 4) < 10); // More significant nybble is valid
assert((hex & 0x0F) < 10); // Less significant nybble is valid
int dec = ((hex & 0xF0) >> 4) * 10 + (hex & 0x0F);
If the assertions are disabled but the input is bogus (e.g. 0xFF), you get what you deserve: GIGO — garbage in, garbage out. You can easily wrap that into an (inline) function:
#include <assert.h>
#include <stdint.h>

static inline int bcd_decimal(uint8_t hex)
{
    assert(((hex & 0xF0) >> 4) < 10); // More significant nybble is valid
    assert((hex & 0x0F) < 10);        // Less significant nybble is valid
    int dec = ((hex & 0xF0) >> 4) * 10 + (hex & 0x0F);
    return dec;
}
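For instance, fed the values from the question, it produces the expected results. Here is a quick sanity check (it assumes bcd_decimal is defined in the same file, as above):
#include <stdio.h>

int main(void)
{
    unsigned char day = 0x11;     // RTCC value meaning day 11
    unsigned char month = 0x12;   // Meaning month 12
    unsigned char minutes = 0x40; // Meaning 40 minutes, not 64

    printf("day = %d, month = %d, minutes = %d\n",
           bcd_decimal(day), bcd_decimal(month), bcd_decimal(minutes));
    // Prints: day = 11, month = 12, minutes = 40
    return 0;
}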
This conversion is, of course, just decoding BCD (Binary-Coded Decimal), which is exactly the format the question describes.
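If you'd rather diagnose bogus input at run time than accept GIGO when assertions are disabled, a checked variant is straightforward. This is only a sketch (the name bcd_decimal_checked is mine, not an established API); it returns -1 for any byte that is not valid BCD:
#include <stdint.h>

// Returns the decimal value of a BCD byte, or -1 if either nybble
// is greater than 9 (so the byte is not valid BCD, e.g. 0xFF).
static inline int bcd_decimal_checked(uint8_t hex)
{
    if ((hex & 0xF0) > 0x90 || (hex & 0x0F) > 0x09)
        return -1; // Not valid BCD
    return ((hex & 0xF0) >> 4) * 10 + (hex & 0x0F);
}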