Introduction
Get a brief overview of this chapter.
In the previous chapter, we introduced bits as the representations of the states of circuit elements in classical computers. The states of those elements may be (and usually are) associated with the numerical binary digits 0 and 1.
Before we jump into quantum computing, let’s look at several other ways of describing classical computing. In quantum computing we use symbolic representations based on the way physicists describe quantum mechanics, blended with some notation and concepts from computer science. At first, those representations look rather strange because you probably have not encountered them in your math, physics, or computer science courses, even though the underlying mathematics (mostly algebra, as we shall see) is not all that complicated. Bob and I thought it would be helpful to introduce this notation first within the framework of classical computing. This is not often done (one exception is [Mermin, 2007]) because the resulting representations are not particularly useful in traditional computing. In quantum computing, however, this notation is important and almost universally used, so we think it is good to get used to it in a familiar context first. This approach will also help us understand what appears to be a deep gulf between classical and quantum computing.
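To give a rough preview of the kind of notation we mean (the details come later, so treat this as a sketch rather than a definition): physicists write the two bit values as the symbols |0⟩ and |1⟩, called "kets," which can be identified with two-component column vectors:

$$
\lvert 0 \rangle = \begin{pmatrix} 1 \\ 0 \end{pmatrix}, \qquad
\lvert 1 \rangle = \begin{pmatrix} 0 \\ 1 \end{pmatrix}
$$

Nothing quantum has happened yet; these are simply alternative labels for the familiar values 0 and 1, and that is exactly why practicing the notation in a classical setting first is worthwhile.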
Cardy, you should know that this way of talking about classical computers is hardly ever used by anyone else, so if you mention this language to a computer science major, you will probably get a puzzled expression in return.