A computer that runs on weirdness
Recently, improbably, quantum physics — and particularly quantum computing — have been in the news.
Not so long ago, quantum computers seemed about as futuristic as warp drive or cold fusion. But now they're getting closer and closer to becoming a practical reality. The B.C. company D-Wave recently got a large investment from some big players, including Google and NASA, for their quantum computing technology.
The words "quantum physics" can be intimidating, and even the basics can be a little mind-bending. The Danish physicist Niels Bohr said, "If you can fathom quantum mechanics without getting dizzy, you don't get it."
So maybe you should take a seat, because as CBC Radio's tech show, we thought we'd better roll up our sleeves and see if we could clear this up a little.
To help us, we spoke to some people who spend their time figuring out how to make quantum computers work. Scott Aaronson is an associate professor of electrical engineering and computer science at MIT. We also went to visit the University of Toronto's Centre for Quantum Information and Quantum Control, where we spoke to Aephraim Steinberg, an experimentalist and professor of physics.
To start from the beginning, in its simplest form, what is a quantum computer?
Scott says that "...a quantum computer is a proposed machine that would exploit the laws of quantum physics." The particular quirk of quantum physics that quantum computers exploit is superposition: the ability of an object to exist in several states at the same time. "Instead of describing the state of the world in terms of where every object is," Aephraim says, "we instead realize that things might not have definite positions. They might be in an uncertain position, spread out over some probability distribution."
For the computers we use ("classical computers"), the basic piece of information is a bit. Really, these are just the results of switches, called transistors, being turned off or on. These end up as the zeros and ones that make up binary code. Scott explains that "...each bit in your computer's memory is at any given time either definitely a one or definitely a zero." But if a bit could be both a one and a zero, or any degree of somewhere in between, that hugely increases the computer's power. In quantum computing, these "quantum bits" are called "qubits".
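To make the contrast concrete, here is a minimal sketch in Python (a simplified model for illustration, not how real quantum hardware works): a qubit's state can be described by two amplitudes, one for "zero" and one for "one", and the squared magnitudes of those amplitudes give the probabilities of measuring each value.

```python
import math

# A classical bit is definitely one value: all of its "weight" is on 0 or on 1.
classical_zero = (1.0, 0.0)  # amplitude for |0>, amplitude for |1>

# A qubit in an equal superposition carries weight on BOTH values at once.
alpha = 1 / math.sqrt(2)  # amplitude for |0>
beta = 1 / math.sqrt(2)   # amplitude for |1>

# Measuring the qubit collapses it: the squared magnitude of each
# amplitude is the probability of reading that value.
prob_zero = abs(alpha) ** 2
prob_one = abs(beta) ** 2

print(prob_zero, prob_one)  # each is 0.5: a fifty-fifty chance of 0 or 1
```

Until it is measured, the qubit genuinely occupies both states; the amplitudes must always satisfy `prob_zero + prob_one == 1`, since a measurement has to give some answer.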
While a qubit is the equivalent of a classical bit, the quantum version of a transistor is harder to pin down. Aephraim says that "...there are probably half a dozen serious contenders for how you could build a quantum computer." One major one is the ion trap, where the particular spin state of an ion produces the quantum bit of information. Another involves supercooled circuits, and another uses the polarization states of photons of light. But despite these prototypes and possibilities, Scott says that "...the quantum computing analogue of the transistor has probably not been discovered yet."
The question now is: what are these quantum computers good for? Are we looking at a future where we will toss out our old laptops to make space for these new, better quantum computers, or are they really just specialized pieces of equipment?
Scott says that "...a quantum computer is not going to help you compose an email or play Angry Birds." But they do have a variety of uses, like breaking encryption, special kinds of searches, and simulating natural phenomena, among others.
Despite how specialized and quirky quantum computers may be, Aephraim suggests they could still have a major impact on our lives. "One of the reasons computers are so important to our lives is not that you and I are using our computers all the time, but that everything that we use was designed using other computers. If not for computers, we would not be able to design the planes and the cars that we have today. We would not be able to design or build the computers we use today. So even if the technology did nothing but simulate quantum mechanical systems, it would be invisible to you and me day to day, but it would underlie all the technologies we use in another twenty or thirty years."