bits, bytes and words


A bit (a portmanteau of “binary digit”) is the smallest unit of information: a single 0 or 1. In hardware, a bit encodes an electrical signal as true (1) or false (0).

When enough current flows through a transistor, the bit is read as 1; when the current falls below that threshold, the bit is 0.

Imagine the USB port on your laptop. When you plug in a USB drive, an electrical signal flows through the port’s circuitry and the corresponding bit is set to 1 (usb_inserted = “true”). When the port is empty there is no signal, so the bit is 0 (usb_inserted = “false”).
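The threshold idea above can be sketched in Python. The 0.5-volt threshold here is purely illustrative, not a real hardware value:

```python
# Hypothetical voltage threshold above which a signal reads as 1.
THRESHOLD_VOLTS = 0.5

def read_bit(voltage: float) -> int:
    """Interpret an analog voltage level as a single bit."""
    return 1 if voltage > THRESHOLD_VOLTS else 0

usb_inserted = bool(read_bit(3.3))  # signal present -> True
port_empty = bool(read_bit(0.0))    # no signal -> False
print(usb_inserted, port_empty)     # True False
```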


A byte is a digital unit that is generally a grouping of 8 bits. Networking standards such as the Internet Protocol refer to an 8-bit byte as an octet.

Bits are used to encode information; above we saw that a single bit can encode a boolean value, True or False. With 8 bits there are 2^8 = 256 possible combinations of 0s and 1s, i.e. 256 distinct bit strings.

1 2 3 4 5 6 7 8
0 1 0 1 0 1 0 1 # an 8-bit string (one byte)
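You can check the number of combinations, and what the bit string above means as a number, directly in Python:

```python
# 8 bits give 2**8 distinct patterns.
print(2 ** 8)                # 256

# The bit string above, interpreted as a base-2 integer.
value = int("01010101", 2)
print(value)                 # 85

# And back again, padded to 8 binary digits.
print(format(value, "08b"))  # 01010101
```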

A single byte was historically enough to represent a character: ASCII used 7 bits for 128 characters (letters, digits and punctuation), and extended 8-bit character sets filled out all 256. Nowadays, with emojis and characters from the world’s many scripts, 256 is nowhere near enough, which is why encodings like UTF-8 use multiple bytes per character. Generally we associate the word ‘byte’ with data size, à la megabyte or gigabyte.
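Python makes it easy to see how many bytes UTF-8 needs for different characters, and why one byte stopped being enough:

```python
# ASCII characters fit in a single byte...
print(len("A".encode("utf-8")))   # 1

# ...but characters beyond the 256-value range need more.
print(len("é".encode("utf-8")))   # 2
print(len("😀".encode("utf-8")))  # 4

# ord() gives the Unicode code point; far more than 256 exist.
print(ord("😀"))                  # 128512
```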


In the context of CPUs, a "word" typically refers to the fixed-size unit of data that a CPU can process in a single operation. The size of a CPU word can vary depending on the computer architecture and the specific CPU being used, but it is usually a power of 2, such as 8, 16, 32, or 64 bits.

In many computer architectures, a word is the natural unit of data that the CPU operates on, and it is typically the largest size of data that can be moved between memory and the CPU in a single operation. The size of the CPU word can have a significant impact on the performance of the computer, as larger word sizes can allow for more efficient processing of certain types of data and instructions.
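One quick way to check the word (pointer) size of the machine running your code, sketched in Python using the standard struct and sys modules:

```python
import struct
import sys

# Size of a native pointer ("P") in bytes: 4 on 32-bit, 8 on 64-bit systems.
word_bytes = struct.calcsize("P")
print(word_bytes * 8, "bit word size")

# sys.maxsize is the largest native signed integer:
# 2**31 - 1 on 32-bit systems, 2**63 - 1 on 64-bit systems.
print(sys.maxsize == 2 ** (word_bytes * 8 - 1) - 1)  # True
```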

For example, if a computer has a 32-bit CPU, then the word size is 32 bits, and the CPU can perform arithmetic and logical operations on 32-bit values in a single instruction. Data that is larger than a word, or not aligned to a word boundary, must be handled in multiple operations, which is slower.
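Python integers are arbitrary-precision, but we can simulate a fixed 32-bit word by masking results to 32 bits; this mirrors how a 32-bit register wraps around when a result overflows:

```python
MASK32 = 0xFFFFFFFF  # 32 ones: the largest unsigned 32-bit value

def add32(a: int, b: int) -> int:
    """Add two values as a 32-bit CPU would, discarding overflow bits."""
    return (a + b) & MASK32

print(add32(1, 2))       # 3
print(add32(MASK32, 1))  # 0 -- wraps around, like a car odometer rolling over
```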