Why use the bitwise shift operator for values in a C enum definition?

Dave · Oct 22, 2010

Apple sometimes uses the bitwise shift operator in their enum definitions. For example, in CGDirectDisplay.h, which is part of Core Graphics:

enum {
  kCGDisplayBeginConfigurationFlag  = (1 << 0),
  kCGDisplayMovedFlag               = (1 << 1),
  kCGDisplaySetMainFlag             = (1 << 2),
  kCGDisplaySetModeFlag             = (1 << 3),
  kCGDisplayAddFlag                 = (1 << 4),
  kCGDisplayRemoveFlag              = (1 << 5),
  kCGDisplayEnabledFlag             = (1 << 8),
  kCGDisplayDisabledFlag            = (1 << 9),
  kCGDisplayMirrorFlag              = (1 << 10),
  kCGDisplayUnMirrorFlag            = (1 << 11),
  kCGDisplayDesktopShapeChangedFlag = (1 << 12)
};
typedef uint32_t CGDisplayChangeSummaryFlags;

Why not simply use incrementing ints, as in a "normal" enum?

Answer

pmg · Oct 22, 2010

Maybe writing the values in hexadecimal (or binary) helps :-)

enum {
  kCGDisplayBeginConfigurationFlag  = (1 << 0), /* 0b0000000000000001 */
  kCGDisplayMovedFlag               = (1 << 1), /* 0b0000000000000010 */
  kCGDisplaySetMainFlag             = (1 << 2), /* 0b0000000000000100 */
  kCGDisplaySetModeFlag             = (1 << 3), /* 0b0000000000001000 */
  kCGDisplayAddFlag                 = (1 << 4), /* 0b0000000000010000 */
  kCGDisplayRemoveFlag              = (1 << 5), /* 0b0000000000100000 */
  kCGDisplayEnabledFlag             = (1 << 8), /* 0b0000000100000000 */
  kCGDisplayDisabledFlag            = (1 << 9), /* 0b0000001000000000 */
  kCGDisplayMirrorFlag              = (1 << 10),/* 0b0000010000000000 */
  kCGDisplayUnMirrorFlag            = (1 << 11),/* 0b0000100000000000 */
  kCGDisplayDesktopShapeChangedFlag = (1 << 12) /* 0b0001000000000000 */
};

Now you can add them (or "or" them together), and every combination yields a distinct value:

kCGDisplayAddFlag | kCGDisplayDisabledFlag /* 0b0000001000010000 */
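
Here's a minimal, self-contained sketch of that idea (re-declaring just the two flags used above so it compiles on its own): because each flag occupies its own bit, a single variable can carry several of them at once, and each one can be tested independently with bitwise AND.

#include <stdint.h>
#include <stdio.h>

/* Two of the flags from above, re-declared so this compiles standalone. */
enum {
  kCGDisplayAddFlag      = (1 << 4),
  kCGDisplayDisabledFlag = (1 << 9)
};
typedef uint32_t CGDisplayChangeSummaryFlags;

int main(void)
{
  /* Pack both events into one value: 0b0000001000010000. */
  CGDisplayChangeSummaryFlags flags =
      kCGDisplayAddFlag | kCGDisplayDisabledFlag;

  /* Each flag owns its own bit, so each can be tested independently. */
  if (flags & kCGDisplayAddFlag)
    puts("display added");
  if (flags & kCGDisplayDisabledFlag)
    puts("display disabled");

  return 0;
}

With plain incrementing values (1, 2, 3, ...) this breaks down: 1 | 2 equals 3, so a combination of the first two constants would be indistinguishable from the third.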