One major hurdle to the widespread adoption of large-scale quantum computing is the fragility and unreliability of today's hardware. Google has now unveiled a breakthrough in quantum error correction that could enable robust quantum computers capable of tackling complex, real-world problems.
Quantum computing promises to tackle problems that are intractable for classical computers by exploiting the unique principles of quantum mechanics. But to do anything truly useful, we will need processors comprising hundreds of thousands, if not tens of millions, of qubits, the quantum equivalent of classical bits.
A major challenge, however, is that qubits are extraordinarily unreliable. They are so prone to errors that calculations can be derailed before an algorithm has even finished running.
That is why quantum computing companies have recently put so much effort into correcting errors. Google's newly unveiled Willow quantum processor has now reached a crucial milestone, demonstrating that as the company's processors scale up, their ability to suppress errors improves exponentially.
“This is the most compelling prototype to date for a scalable logical qubit,” says Hartmut Neven, founder and lead of Google Quantum AI. “It’s a significant indication that large-scale, highly advanced quantum computers are feasible to build.”
Quantum error correction techniques typically work by spreading the information to be processed across multiple qubits. This redundant encoding ensures that even if a single qubit fails, the encoded information can still be recovered. In this way, many “physical qubits” are entangled to form a single “logical qubit”.
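To see the principle at work, here is a minimal Python sketch of the simplest error-correcting scheme, a repetition code, simulated classically rather than on real quantum hardware (it is an analogue of the idea, not Google's actual surface code): one logical bit is copied across several unreliable "physical" bits and recovered by majority vote.

```python
import random

def encode(bit, n):
    """Redundantly encode one logical bit as n physical copies."""
    return [bit] * n

def apply_noise(bits, p):
    """Flip each physical copy independently with probability p."""
    return [b ^ (random.random() < p) for b in bits]

def decode(bits):
    """Recover the logical bit by majority vote."""
    return int(sum(bits) > len(bits) / 2)

# The logical bit survives as long as fewer than half its copies flip.
noisy = apply_noise(encode(1, n=5), p=0.05)
print(decode(noisy))  # almost always recovers 1
```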
The more physical qubits used to encode each logical qubit, the more resistant it becomes to errors. But this only works if the error rate of the individual physical qubits is below a critical threshold. Otherwise, the increased risk of errors from adding more faulty qubits outweighs the benefits of redundancy.
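That trade-off can be seen numerically. The toy model below reuses the majority-vote repetition code from the sketch above and computes its logical error rate exactly; extra redundancy only pays off when the physical error rate sits below the code's break-even point (real quantum codes have far lower thresholds than this classical toy, but the qualitative behaviour is the same):

```python
from math import comb

def logical_error_rate(p, n):
    """Probability that a majority of the n copies flip, so decoding fails."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

for p in (0.01, 0.6):  # one rate below, one above the break-even point
    for n in (3, 5, 7):
        print(f"p={p}, n={n}: logical error rate ≈ {logical_error_rate(p, n):.2e}")
# At p=0.01, every extra copy shrinks the logical error rate;
# at p=0.6, adding more faulty copies actively makes things worse.
```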
While various teams have demonstrated error-correction schemes before, Google's results are the most definitive yet. In a series of experiments, researchers encoded logical qubits in progressively larger grids of physical qubits, starting with a modest 3×3 array, and found that each increase in grid size cut the logical error rate in half. Notably, the logical qubits survived more than twice as long as the best of the physical qubits that made them up.
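To get a feel for how quickly that halving compounds, here is a back-of-the-envelope projection. The 2× suppression per step reflects the reported result; the starting error rate is an assumed figure for illustration only, not a number from Google:

```python
# Hypothetical projection: if each increase in grid size halves the
# logical error rate, the suppression compounds exponentially.
suppression_per_step = 2.0  # roughly the reported factor per step
error_rate = 3e-3           # assumed starting rate for the 3x3 grid

for size in (3, 5, 7, 9, 11):
    print(f"{size}x{size} grid: logical error rate ≈ {error_rate:.1e}")
    error_rate /= suppression_per_step
```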
As Neven noted, the more qubits Willow uses, the more errors are reduced, which is exactly the behaviour a scalable quantum computer needs.
The breakthrough was made possible by significant improvements to the underlying superconducting qubit technology in Google's processors. Each physical qubit in the company's previous Sycamore processor could maintain its state for an average of roughly 20 microseconds. Thanks to improved fabrication techniques and circuit optimisations, Willow's qubits now last roughly 68 microseconds, more than tripling that lifetime.
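A qubit's stored state decays roughly exponentially over its characteristic lifetime, so tripling that lifetime sharply improves the odds that a qubit survives any given operation. A quick comparison using the standard exponential-decay model (the one-microsecond operation time is an assumed illustrative value):

```python
from math import exp

def survival_probability(t_us, lifetime_us):
    """Chance a qubit still holds its state after t_us microseconds."""
    return exp(-t_us / lifetime_us)

t = 1.0  # assumed duration of one operation, in microseconds
print(f"Sycamore (~20 µs lifetime): {survival_probability(t, 20):.4f}")
print(f"Willow   (~68 µs lifetime): {survival_probability(t, 68):.4f}")
```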
Google's researchers also showcased the chip's raw speed alongside its error-correction capabilities. In under five minutes, Willow completed a calculation that would take the world's second-fastest supercomputer, Frontier, an estimated 10 septillion years. The benchmark is artificial, however, with no practical application: the quantum computer simply executes random circuits with no useful output, and the classical machine must then attempt to reproduce the result.
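The gulf between the two machines reflects how classical simulation scales. The simplest approach, tracking the full quantum statevector, must store 2^n complex amplitudes for n qubits, so memory alone explodes exponentially; supercomputer estimates like the one above rely on far cleverer algorithms, but they run into the same wall. A rough illustration, assuming the usual 16 bytes per complex amplitude:

```python
def statevector_bytes(n_qubits, bytes_per_amplitude=16):
    """Memory needed to store all 2**n complex amplitudes of an n-qubit state."""
    return 2 ** n_qubits * bytes_per_amplitude

for n in (30, 50, 70):
    print(f"{n} qubits: {statevector_bytes(n) / 1e9:.3g} GB")
# ~17 GB at 30 qubits, ~18 million GB at 50,
# and far beyond any conceivable machine at 70.
```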
The key challenge for companies like Google is to move from such proofs of concept to commercially useful problems. The new error-correction results are a significant step in the right direction, but there is still a long way to go.
According to Julian Kelly, who leads Google's quantum hardware effort, tackling practical problems may require logical error rates of roughly one in every 10 million steps. Achieving that could mean building logical qubits from around 1,000 physical qubits each, although future advances might reduce that to several hundred.
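Kelly's figures imply a rough sizing exercise. If each step up in code distance keeps halving the logical error rate, a short calculation estimates the code distance, and hence the physical-qubit overhead, needed to reach one error in 10 million. Every starting value below is an illustrative assumption rather than a Google projection, and the qubit count uses the common surface-code approximation of roughly 2d² physical qubits at distance d; even so, it lands in the same ballpark as Kelly's estimate:

```python
# Rough sizing sketch; starting error rate and 2x-per-step suppression
# are assumptions, not Google figures.
target = 1e-7        # Kelly's roughly one-in-10-million error rate
error_rate = 1e-3    # assumed logical error rate at distance 3
distance = 3
while error_rate > target:
    distance += 2    # surface-code distances step through odd values: 3, 5, 7, ...
    error_rate /= 2  # assume each step halves the logical error rate

print(f"distance {distance}: roughly {2 * distance**2} "
      "physical qubits per logical qubit")
```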
Just as importantly, Google's demonstration only showed that quantum information could be stored in its logical qubits, not that computations could be performed with them. When a preprint of the results was posted on arXiv in August, Ken Brown at Duke University pointed out that useful calculations could require a quantum computer capable of performing around a billion logical operations.
Despite the impressive results, there is still a long road ahead before large-scale quantum computers can deliver useful applications. But Google appears to have passed a pivotal milestone, one that suggests this ambitious vision is finally within reach.