Traditional Computing vs. Quantum Computing

Understand the difference between traditional and quantum computing, as well as the challenges associated with quantum computing.

Before we get into quantum computing, we need to say a few words about traditional computing and traditional computers. Some physicists like to call this “classical” computing. As we mentioned in the A Brief Preview lesson, the history of physics is often divided into a pre-quantum era called “classical physics” and the contemporary era of “quantum physics.” Classical physics is the realm of Isaac Newton and his study of motion and forces; James Clerk Maxwell, working on electricity and magnetism; and Albert Einstein, who extended those studies to objects traveling close to the speed of light. Of course, hundreds of others contributed to these studies, but scientists, like people in many fields, prefer to honor a small pantheon of celebrated figures.

Quantum physics, not surprisingly, focuses on those physical phenomena where the concepts and theories of quantum mechanics are needed to understand what is going on. When people talk about quantum physics, they usually mean the structure of atoms and molecules, nuclear physics, elementary particle physics, semiconductors, lasers, and so on.

The important point is that the world of classical physics was turned on its head in the 1920s with the development of quantum theory, the theory describing the world at the atomic and subatomic levels. That theory became known as quantum mechanics, and you can think of quantum computing (QC) and quantum information science (QIS) more generally as applications of quantum mechanics to computational and information-processing settings.
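
To make that contrast concrete, here is a minimal sketch (our own illustration, not part of the lesson) of the basic object quantum mechanics contributes to computing. Where a classical bit is always 0 or 1, a qubit’s state is a unit vector of complex amplitudes that can place both values in superposition. The example uses Python with NumPy, and the variable names are ours.

```python
import numpy as np

# A classical bit holds exactly one of two values.
classical_bit = 0  # or 1; nothing in between

# A qubit's state is a unit vector in a 2-dimensional complex space.
# The basis states |0> and |1> play the role of the classical 0 and 1.
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# Unlike a classical bit, a qubit can be in a superposition of both:
# here, the equal-weight superposition (|0> + |1>) / sqrt(2).
psi = (ket0 + ket1) / np.sqrt(2)

# Measuring the qubit yields 0 or 1 with probabilities given by the
# squared magnitudes of the amplitudes (the Born rule): 50/50 here.
probabilities = np.abs(psi) ** 2
print(probabilities)  # [0.5 0.5]
```

Nothing about this sketch requires special hardware; it only shows the mathematical object a quantum computer manipulates, which is where later lessons pick up.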
