[Solved] Bit / Byte Confusion on Binary Digits [closed]


These days, the number of bits per character is not so simple, but there are most definitely 8 bits per byte.

The number of bits/bytes per character will vary depending on your system and the character set.

Original ASCII was 7 bits per character.

That was extended to 8 bits per character (e.g., Latin-1 and the various code pages) and stayed there for a very long time.

These days, with Unicode, you have encodings whose code units are 8, 16, and 32 bits (1, 2, and 4 bytes) wide: UTF-8, UTF-16, and UTF-32. The 8-bit variety, UTF-8, matches ASCII exactly for the first 128 characters; characters outside that range take 2 to 4 bytes.
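To see this concretely, here is a small Python sketch (Python picked purely for illustration) that encodes a few characters in each Unicode encoding and prints how many bytes each one uses. The little-endian variants are used so no byte-order mark inflates the counts:

    # Encode the same character in the three common Unicode encodings
    # and show how many bytes each encoding needs for it.
    for text in ("1", "é", "€"):
        for encoding in ("utf-8", "utf-16-le", "utf-32-le"):
            data = text.encode(encoding)
            print(f"{text!r} in {encoding}: {len(data)} byte(s) -> {data.hex()}")

Running it shows '1' taking 1, 2, and 4 bytes respectively, while '€' takes 3 bytes even in UTF-8, which is what "variable width" means in practice.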

The character '1' is represented by binary 00110001b (decimal 49, hex 0x31), which is 8 bits and thus 1 byte. The actual binary number 1, stored as a byte, would be 00000001b, which in ASCII is the SOH control character, not the character '1'.
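You can verify this distinction between the character '1' and the numeric value 1 with another short Python sketch:

    # The character '1' has ASCII/Unicode code point 49 (0x31),
    # whose binary form is 00110001.
    print(ord("1"))                  # 49
    print(format(ord("1"), "08b"))   # 00110001

    # The byte value 1 (00000001) maps to the SOH control character,
    # not to the printable digit '1'.
    print(repr(chr(1)))              # '\x01' (SOH, a control character)
    print(chr(1) == "1")             # False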

Hopefully this helps.
