I am interested in the optimal way of calculating the number of bits set in a byte, done like this:
template< unsigned char byte > class BITS_SET
{
public:
    enum {
        B0 = (byte & 0x01) ? 1 : 0,
        B1 = (byte & 0x02) ? 1 : 0,
        B2 = (byte & 0x04) ? 1 : 0,
        B3 = (byte & 0x08) ? 1 : 0,
        B4 = (byte & 0x10) ? 1 : 0,
        B5 = (byte & 0x20) ? 1 : 0,
        B6 = (byte & 0x40) ? 1 : 0,
        B7 = (byte & 0x80) ? 1 : 0
    };
public:
    enum { RESULT = B0 + B1 + B2 + B3 + B4 + B5 + B6 + B7 };
};
Is it optimal when the value of byte is only known at run time? Is it recommended to use this in code?
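For reference, a minimal sketch of how the template would be used, assuming the BITS_SET template above is in scope; the argument must be a compile-time constant, and the values here are only illustrative:

#include <cstdio>

int main()
{
    // The template argument has to be a compile-time constant;
    // BITS_SET<some_runtime_value>::RESULT will not compile.
    std::printf("%d\n", BITS_SET<0xAA>::RESULT);  // prints 4
    return 0;
}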
For one byte of data, the optimal way, considering both speed and memory consumption, is a small nibble lookup table:
uint8_t count_ones (uint8_t byte)
{
    // Number of set bits in each possible 4-bit value 0..15.
    static const uint8_t NIBBLE_LOOKUP [16] =
    {
        0, 1, 1, 2, 1, 2, 2, 3,
        1, 2, 2, 3, 2, 3, 3, 4
    };
    // Add the counts for the low and high nibbles.
    return NIBBLE_LOOKUP[byte & 0x0F] + NIBBLE_LOOKUP[byte >> 4];
}
Calling this function from a for loop should yield quite an efficient program on most systems, and it is very generic.
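For example, a sketch of counting the set bits of a whole buffer with such a loop, assuming count_ones() as defined above and a C++11 compiler; the buffer contents are only illustrative:

#include <cstdint>
#include <cstdio>

// count_ones() as defined above

int main()
{
    const uint8_t data[] = { 0x00, 0xFF, 0xAA, 0x0F };
    unsigned total = 0;

    // Sum the per-byte counts over the whole buffer.
    for (uint8_t b : data)
        total += count_ones(b);

    std::printf("%u bits set\n", total);  // prints "16 bits set"
    return 0;
}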