For a bit mask it helps to look at the values in binary, since that is the level at which a bit mask operates. Each enum value typically sets only a single bit.
So the enum values would be set (in binary) to 00001, 00010, 00100, 01000, 10000, etc.
Those same values in decimal would be 1, 2, 4, 8, 16, etc.
And in hex they would be 0x01, 0x02, 0x04, 0x08, 0x10, etc.
It’s really just a matter of preference, but since hexadecimal’s base (16) is a power of 2, each hex digit maps to exactly four bits, so hex relates to binary more directly than decimal does. This makes it slightly clearer that the values represent bit mask values.
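For example, here is a minimal sketch of such an enum in C (the flag names like FLAG_READ are illustrative, not from any particular API), along with the usual bitwise operations on it:

```c
#include <stdio.h>

/* Each enumerator sets a single bit; hex makes the bit pattern obvious. */
typedef enum {
    FLAG_READ    = 0x01,  /* binary 00001, decimal 1  */
    FLAG_WRITE   = 0x02,  /* binary 00010, decimal 2  */
    FLAG_EXECUTE = 0x04,  /* binary 00100, decimal 4  */
    FLAG_HIDDEN  = 0x08,  /* binary 01000, decimal 8  */
    FLAG_SYSTEM  = 0x10   /* binary 10000, decimal 16 */
} FileFlags;

int main(void)
{
    /* Combine flags with bitwise OR. */
    unsigned int flags = FLAG_READ | FLAG_WRITE;

    /* Test for a flag with bitwise AND. */
    if (flags & FLAG_READ)
        printf("readable\n");

    /* Clear a flag by ANDing with its complement. */
    flags &= ~(unsigned int)FLAG_WRITE;

    return 0;
}
```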