Quantum computers exploit a bizarre yet well-verified property of the universe: until a particle interacts with something else, its position, speed, color, spin and other quantum properties coexist as a probability distribution over all possibilities, in a state known as superposition. Quantum computers use isolated particles as their most basic building blocks, relying on one of these quantum properties to represent the state of a quantum bit (or “qubit”). So while a classical bit always exists in exactly one of two mutually exclusive states, 0 (low voltage) or 1 (high voltage), a qubit in superposition exists in both states, 0 and 1, at once.
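To make the contrast concrete, here is a minimal sketch in plain NumPy (not any vendor’s SDK): a qubit’s state is a length-2 vector of complex amplitudes, a Hadamard gate places it in an even superposition, and measurement samples 0 or 1 with the squared-amplitude probabilities.

```python
import numpy as np

# A qubit's state is a length-2 complex vector of amplitudes;
# measurement probabilities are the squared magnitudes of those amplitudes.
ket0 = np.array([1, 0], dtype=complex)          # the classical-like |0> state
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)    # Hadamard gate

state = H @ ket0                  # equal superposition of |0> and |1>
probs = np.abs(state) ** 2        # -> [0.5, 0.5]

# Measuring collapses the superposition: each shot yields 0 or 1.
rng = np.random.default_rng(seed=7)
samples = rng.choice([0, 1], size=1000, p=probs)
print(probs)                  # [0.5 0.5]
print(np.bincount(samples))   # roughly 500 zeros and 500 ones
```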
Things get interesting at a larger scale: QC systems can isolate a group of entangled particles that all share a single state of superposition. While a single qubit coexists in two states, a set of eight entangled qubits (or “8Q”), for example, simultaneously occupies all 2^8 (or 256) possible states, effectively processing all of them in parallel. It would take 57Q (representing 2^57 parallel states) for a QC to outperform even the world’s strongest classical supercomputer. A 64Q computer would surpass it roughly 100-fold (clearly achieving quantum advantage), and a 128Q computer would surpass it more than a quintillion-fold.
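The scaling claims above follow from simple arithmetic, which a few lines of Python can check; the numbers below are just 2^n state counts and their ratios, not benchmarks of any real machine.

```python
# n entangled qubits span 2**n basis states; the ratios quoted in the
# text follow directly from these counts.
for n in (8, 57, 64, 128):
    print(f"{n} qubits -> {2**n:,} simultaneous states")

print(2**64 // 2**57)   # 128: a 64Q machine spans ~100x more states than 57Q
print(2**128 / 2**57)   # ~2.4e21: over a quintillion times more than 57Q
```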
In the race to develop these computers, nature has inserted two major speed bumps. First, isolated quantum particles are highly unstable, so quantum circuits must finish executing within extremely short windows of coherence. Second, measuring the output energy level of subatomic qubits demands extreme precision, and tiny deviations commonly thwart the measurement. Informed by university research, leading QC companies such as IBM, Google, Honeywell and Rigetti are developing quantum engineering and error-correction methods to overcome these challenges as they scale up the number of qubits they can process.
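Quantum error correction itself is far more subtle (unknown quantum states cannot simply be copied), but the underlying intuition can be sketched with its classical ancestor, the 3-bit repetition code: store each bit three times and recover it by majority vote, which suppresses errors as long as at most one copy flips.

```python
import numpy as np

# Toy illustration of the error-correction idea (not any vendor's actual
# scheme): the classical 3-bit repetition code with random bit-flip noise.
rng = np.random.default_rng(seed=3)

def encode(bit):
    return np.array([bit, bit, bit])        # store the bit three times

def noisy_channel(codeword, p_flip=0.1):
    flips = rng.random(codeword.shape) < p_flip
    return codeword ^ flips.astype(int)     # each copy flips with prob. p_flip

def decode(codeword):
    return int(codeword.sum() >= 2)         # majority vote

trials = 10_000
errors = sum(decode(noisy_channel(encode(1))) != 1 for _ in range(trials))
print(errors / trials)   # ~0.028, well below the raw 10% flip rate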
Beyond the challenge of building working hardware, software must be developed to harvest the benefits of this parallelism, even though we cannot observe what happens inside a quantum circuit without destroying the superposition. When we measure the output value of a quantum circuit’s entangled qubits, the superposition collapses into just one of the many possible outcomes. Sometimes, though, the output carries clues that the qubits weirdly interfered with themselves (that is, with their probabilistic counterparts) inside the circuit.
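That self-interference can be reproduced in a few lines of NumPy: applying a Hadamard gate once yields a 50/50 superposition, while applying it twice makes the amplitude paths leading to 1 cancel and the paths leading to 0 reinforce, so the qubit deterministically returns to 0.

```python
import numpy as np

# Interference in miniature: after one Hadamard the qubit is in an even
# superposition; after a second Hadamard the amplitudes for |1> cancel
# (destructive interference) and those for |0> reinforce.
ket0 = np.array([1, 0], dtype=complex)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

after_one_H = H @ ket0
after_two_H = H @ after_one_H

print(np.abs(after_one_H) ** 2)                  # [0.5 0.5] -- even superposition
print(np.round(np.abs(after_two_H) ** 2, 12))    # [1. 0.] -- interference restores |0>
```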
QC scientists at UC Berkeley, the University of Toronto, the University of Waterloo, UT Sydney and elsewhere are now developing a fundamentally new class of algorithms that detect the presence or absence of interference patterns in a QC’s output to cleverly glean information about what happened inside.
A fully functional QC must, therefore, combine several layers of a novel technology stack, spanning both hardware and software components. At the top of the stack sits the application software for solving problems in chemistry, logistics and so on. The application typically makes API calls to a software layer beneath it (loosely referred to as a “compiler”) that translates those calls into the circuits that implement them. Beneath the compiler sits a classical computer that feeds circuit changes and inputs to the Quantum Processing Unit (QPU) below. The QPU typically comprises an error-correction layer; an analog processing unit that transmits analog inputs to the quantum circuit and measures its analog outputs; and the quantum processor itself, which houses the isolated, entangled particles.
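To make the layering concrete, here is a minimal structural sketch in Python; every class and method name is hypothetical (it mirrors no vendor’s actual API), and each layer simply hands work to the one beneath it, as described above.

```python
# Hypothetical sketch of the stack: application -> compiler ->
# classical controller -> QPU (error correction, analog I/O, processor).
class QuantumProcessor:              # the isolated, entangled particles
    def run(self, pulses):
        return "analog measurement results"

class QPU:                           # error correction + analog I/O + processor
    def __init__(self):
        self.processor = QuantumProcessor()
    def execute(self, circuit):
        pulses = f"analog pulses for {circuit}"   # analog processing unit
        raw = self.processor.run(pulses)
        return f"error-corrected({raw})"          # error-correction layer

class ClassicalController:           # feeds circuit changes and inputs downward
    def __init__(self):
        self.qpu = QPU()
    def submit(self, circuit):
        return self.qpu.execute(circuit)

class Compiler:                      # translates API calls into circuits
    def __init__(self):
        self.controller = ClassicalController()
    def call(self, function, *args):
        circuit = f"circuit({function}{args})"
        return self.controller.submit(circuit)

# Application layer: e.g., a chemistry problem expressed as an API call.
compiler = Compiler()
print(compiler.call("ground_state_energy", "H2"))
```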
As we approach the age of quantum computing, the question is no longer one of ‘if,’ but rather of ‘when’ this technology will finally mature and ‘who’ will lead this emerging industry.