What’s the Big Deal About Quantum Computing?

Find out why you should take this course, its general layout, and the concepts of quantum computing and quantum information processing explored in it.

Welcome, Cardy.

Hi, Alice and Bob. Thanks for inviting me over. I heard that you two have figured out some simple ways of explaining why quantum computing (QC) and quantum information science (QIS) are important, even to a non-expert like me. I saw your note in the student newspaper about wanting someone to critique your new book manuscript; that’s why I called. As a business and English major, I want to find out more about these subjects. I keep reading that they are going to be the next “big thing” in the business world, which means opportunities for building new businesses and developing new innovations. Even our School of Business has hosted a few talks about quantum computing. Besides, it would be cool to wow my friends with my knowledge of quantum computing, entanglement, and all the other stuff I’ve read about online.

In fact, I read about you and Bob in a blog post about quantum computing, and it is a great pleasure to meet you in person.

I hate to disappoint you, Cardy, but Alice and I are not the actual characters whose message-sending is described in both popular books and technical papers about quantum information, cryptography, and quantum computing. In fact, “Alice and Bob” first appeared in print in 1978 in a cryptography article in the technical journal Communications of the Association for Computing Machinery.

I remember my parents talking about a 1969 movie, Bob & Carol & Ted & Alice. Was that you guys?

Alas, no again. But the common mention of our names in quantum information and quantum computing books was part of the motivation for our putting together these materials. What better way to learn about quantum computing than from Alice and Bob!

Bob and I decided that we ought to find a way to introduce people who aren’t experts in quantum mechanics, linear algebra, and computer science to the key ideas and vocabulary of QC and QIS. There are lots of books that take a fairly technical approach, which is fine for math and physics majors but leaves most folks baffled. There are also several books that avoid math entirely and focus on general concepts, often with idiosyncratic models and weird terminology that don’t make contact with how most scientists, engineers, and mathematicians talk about quantum computers. We worked hard to find the “sweet spot” between insider jargon wrapped in unfamiliar mathematics and fluffy treatments that seem strange and inadequate to both experts and novices. Since both of us struggled to penetrate the many layers of jargon that expert authors have brought to the field, we felt we were in a good position to find that sweet spot. Jargon is fine, of course, for experts, but many authors use it mainly to show that they are part of the “tribe” and to separate the tribe from outsiders.

Speaking of confusing jargon, what’s the difference between quantum computing and quantum information science? To me, they sound the same.

Thanks for calling us out! We need to practice what we preach. Please do that whenever we get carried away like that. Quantum information science has become the general term for the use of quantum principles and technology for the transfer of information (what you might call quantum communications), for the processing of information (QC), and for the sensitive detection of various kinds of physical properties (quantum sensing). People are already talking about building a quantum internet to enhance the security of information exchange. In this course, we focus on the first two: quantum communications and QC. To appreciate quantum sensing, you need some background in physics and in how quantum systems interact with their environment.

As we said in our email messages, we’ll be delighted to spend several days with you. We hope that your background in English will help you tell us where our presentation is muddled or confusing.

Of course, quantum computing is built on the basic principles of quantum mechanics, and many parts of quantum mechanics are conceptually quite challenging, both for people new to the field and for experts. Even though scientists and mathematicians know how to do quantum calculations, there is still a lot of controversy about what those calculations tell us about the nature of reality. We have tried to develop quantum computing ideas in a way that avoids, at least for a while, those conceptual challenges, but we will need to deal with them eventually. And, in their own way, they are rather cool challenges.

I am a bit worried about my math preparation for quantum computing. I keep hearing about linear algebra, Hilbert spaces, operators, state vectors, complex numbers, and the like—none of which I have studied in my math courses.

I hope you will find that Alice and I have been careful to minimize the formal mathematics in our story of quantum computing by choosing our examples thoughtfully and avoiding unnecessary mathematical jargon. Of course, the math is important. Every community has its own language, and for the community of physicists, engineers, and computer scientists, mathematics is the lingua franca.

To understand how quantum computing works, we are going to need some math to help us with the reasoning. Fortunately, all you will need is some simple algebra, the basic concepts of vectors (which we will help you with pictorially), and a few sines and cosines; in fact, the latter appear in only a few places in this course. However, since we want you to learn to “read” the equations of quantum computing, we have used the symbols that are commonly deployed in the field. Those symbols were borrowed from quantum mechanics, and they will look strange when you first meet them. But if you are patient and practice writing and reading those symbols, you will get used to them.
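Just so those symbols don’t catch you by surprise later, here is a small preview of the kind of expression you will learn to read. The “ket” symbols |0⟩ and |1⟩ are the standard quantum-mechanical labels for the two basic states of a quantum bit, and the angle θ is simply an illustrative placeholder of ours; nothing here is something you need to understand yet.

```latex
% A preview of the notation, written with real numbers only.
% |0> and |1> label the two basic states of a quantum bit, and the
% angle theta sets how much of each basic state appears in the mix.
\[
  |\psi\rangle \;=\; \cos\theta\,|0\rangle + \sin\theta\,|1\rangle ,
  \qquad \cos^2\theta + \sin^2\theta = 1 .
\]
```

Treat this only as a glimpse of where we are headed; every symbol will be introduced carefully before we ask you to work with it.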

Let me warn you that the experts may be annoyed by our approach because our examples do simplify the math. For example, ordinary quantum mechanics makes use of complex numbers, and we have found a way to avoid them for most of this course. Our approach is to help you understand the basic ideas and to prepare you to move into the more complex mathematics later if you so choose.

Now that you mention it, I do recall one of my math teachers in high school saying something about real and imaginary numbers. I never did figure out what an imaginary number means.

It turns out that we can introduce the basics of quantum computing with only ordinary (“real”) numbers. In this course, we will show you why complex numbers are needed in some aspects of quantum information science, but they are not essential for the main ideas. There is also some mathematical jargon we will mention only in passing, in case you run across those terms in the quantum computing literature. Beyond that, we’ll do our best to keep jargon to a minimum.
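And since you asked, here is the one fact behind imaginary numbers, offered as an aside rather than something this course relies on. The symbol i is the standard name for the imaginary unit, and a and b below are just placeholder real numbers of ours.

```latex
% The imaginary unit i is defined by the property that its square is -1.
% A complex number z combines a real part a and an imaginary part b:
\[
  i^2 = -1 , \qquad z = a + b\,i ,
\]
% and its size (magnitude) works out to an ordinary real number:
\[
  |z| = \sqrt{a^2 + b^2} .
\]
```

That is really all there is to the definition: an imaginary number is a real number multiplied by i, and a complex number is a real number plus an imaginary one.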