Can anyone explain this bitwise operation syntax?
#define Bitset(var,bitno) ((var) |= 1UL << (bitno))
I know it sets a bit of var, but I can't understand the syntax.
Let's break it down, piece by piece:
1UL is an unsigned long int with a value of 1, represented at the bit level (32 bits shown here) as:
00000000000000000000000000000001
The << is a "bit shift" operator, which moves all the bits in that value to the left by bitno places. If it's 1UL << 5, you'll end up with:
00000000000000000000000000100000
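If you want to see this for yourself, here's a minimal standalone snippet (my own demo, not part of the question) that prints a few shifted values:

#include <stdio.h>

int main(void)
{
    /* 1UL shifted left by 0, 1 and 5 places */
    printf("%lu\n", 1UL << 0);   /* 1  */
    printf("%lu\n", 1UL << 1);   /* 2  */
    printf("%lu\n", 1UL << 5);   /* 32 */
    return 0;
}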
Once you have this value, the |= (a bitwise OR combined with assignment) essentially forces the bit of var that lines up with that 1 to be a 1, and won't touch any other bits, because X | 0 == X.
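In other words, var |= mask is just shorthand for:

var = var | mask;   /* read var, OR in the mask, write it back */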
Let's say var is 37 and bitno is 7. Then everything at the bit level will look like this:
00000000000000000000000000100101 // var
00000000000000000000000010000000 // 1UL<<7
00000000000000000000000010100101 // var | (1UL<<7)
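Putting it all together, here's a complete program you can compile and run to check that result (the names match the question; the printed 165 is just that last bit pattern in decimal):

#include <stdio.h>

#define Bitset(var,bitno) ((var) |= 1UL << (bitno))

int main(void)
{
    unsigned long var = 37;  /* ...00100101 */
    Bitset(var, 7);          /* sets bit 7: ...10100101 */
    printf("%lu\n", var);    /* prints 165 */
    return 0;
}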
Finally, in case it isn't clear, the #define marks Bitset as a function-like macro, so the preprocessor replaces each use of it with the expanded expression before the code is compiled.
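For example, assuming a variable called flags (a name made up purely for illustration), Bitset(flags, 3) expands to:

((flags) |= 1UL << (3))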