Binary Digits

Learn about binary digits, their representation, and their implementation and processing by computers.

To understand what quantum computing is all about, we need to dig a bit deeper into the inner workings of a computer. The code you write in a high-level language is translated by another program, called a compiler, into code that the machinery of the computer can understand directly. This “machine language” consists of strings of 1s and 0s known as bits (“bit” is an abbreviation of “binary digit”). The binary number system has just two symbols, traditionally labeled 0 and 1. Our everyday number system is a decimal system, with 10 symbols: 0, 1, 2, 3, 4, 5, 6, 7, 8, and 9. It turns out that using just 0s and 1s is sufficient for almost all computational work, and it greatly simplifies the design and construction of computers.
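To see the correspondence between the two number systems for yourself, here is a minimal Python sketch that prints the binary form of a few decimal numbers using Python's built-in `b` format specifier. The sample values are arbitrary; any non-negative integers would do.

```python
# A quick look at how decimal numbers appear in binary.
# The sample values below are arbitrary illustrative choices.
for n in [0, 1, 2, 5, 13, 255]:
    # Python's "b" format specifier renders an integer in base 2.
    print(f"{n:3d} in decimal = {n:b} in binary")
```

Running this prints lines such as `5 in decimal = 101 in binary`, showing that the same quantity is written with different symbol strings in the two systems.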

So, if you looked at the machine language version of a computer program, you would see something like 1001111000111001111110010011110001110011111100… Not very informative to the typical human. Fortunately, the ordinary computer user (and that includes you when you use your smartphone) never has to worry about the binary digits, though you often see them scrolling across a computer screen in tech movies. You only need the binary digits if you are trying to save the world from evil hackers or if you want to understand the fundamental systems that govern all traditional computers.
