At CES 2025, Nvidia CEO Jensen Huang predicted that practical quantum computing was still two decades away. Just weeks later, Hartmut Neven, head of Google Quantum AI, told Reuters it could be a reality within five years. The discrepancy raises a crucial question: who’s right about the timeline for practical quantum computing?
Huang’s argument centers on qubit counts. Qubits, the fundamental units of quantum information, are in short supply: he estimates we’re five or six orders of magnitude below the number a useful machine would need. The shortage matters because of the relationship between qubit quantity and computational accuracy: current research shows that devoting more physical qubits to error correction yields fewer errors, a prerequisite for reliable quantum computation.
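To make the scale of that gap concrete, here is a rough back-of-envelope sketch in Python. Both numbers are illustrative assumptions, not figures from Nvidia or Google:

```python
# Back-of-envelope sketch of Huang's "orders of magnitude" gap.
# Both inputs below are illustrative assumptions.

current_physical_qubits = 1_000          # today's largest chips are O(100-1,000)
shortfall_orders_of_magnitude = 5        # Huang's low-end estimate

required = current_physical_qubits * 10 ** shortfall_orders_of_magnitude
print(f"Implied requirement: ~{required:,} physical qubits")
# -> Implied requirement: ~100,000,000 physical qubits
```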
A qubit, or quantum bit, unlike the binary bit of a classical computer, can exist in a superposition, holding a blend of 0 and 1 simultaneously. This quantum nature is what enables vastly more complex computations, but it also makes qubits fragile: the slightest disturbance from their environment can corrupt a calculation. On current hardware, roughly one operation in every thousand goes wrong.
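The superposition idea fits in a few lines of Python. This is a minimal state-vector sketch; the one-in-a-thousand error figure is the rough per-operation rate assumed above, and real hardware varies widely:

```python
import numpy as np

# A qubit as a two-amplitude state vector: alpha|0> + beta|1>.
# Unlike a classical bit, both amplitudes can be nonzero at once.
alpha = beta = 1 / np.sqrt(2)            # an equal superposition
state = np.array([alpha, beta])

# Measurement collapses the superposition; each outcome occurs with
# probability equal to its squared amplitude.
print(np.abs(state) ** 2)                # -> [0.5 0.5]

# Physical qubits are noisy. Assuming the rough 1-in-1,000 rate above:
p_error, ops = 1e-3, 10_000
print(f"~{p_error * ops:.0f} errors expected over {ops:,} operations")
```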
This challenge mirrors early computing’s struggles. ENIAC, with its nearly 18,000 vacuum tubes, failed frequently as tubes burned out. The solution then was straightforward: replace the unreliable component. Transistors eventually supplanted vacuum tubes entirely, providing a far more stable foundation and dramatically lower failure rates.
Unfortunately, that approach doesn’t translate to quantum computing. A qubit is a quantum object by definition; there is no more stable component to swap in. Instead, we must find ways to mitigate the instability we’re stuck with.
This is where the quantity of qubits becomes critical. Google’s research with its Willow chip demonstrated that grouping many physical qubits into a single error-corrected “logical qubit” improves resilience. The physical qubits encode the same information redundantly, so if one fails, the others preserve it, creating an effective failsafe. The more physical qubits inside a logical qubit, the greater the tolerance for errors and the higher the likelihood of an accurate result.
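Willow uses the far more sophisticated surface code, but the redundancy principle can be sketched with a toy repetition code and majority voting. Everything below, including the exaggerated 5% error probability, is an illustrative assumption, not Google’s construction:

```python
import random

def logical_error_rate(n_physical: int, p: float, trials: int = 100_000) -> float:
    """Toy repetition code: n_physical qubits hold the same bit, each
    flipping independently with probability p; a majority vote decodes.
    The logical bit is lost only when most copies fail at once."""
    failures = 0
    for _ in range(trials):
        flips = sum(random.random() < p for _ in range(n_physical))
        if flips > n_physical // 2:
            failures += 1
    return failures / trials

p = 0.05  # per-qubit error probability, exaggerated for illustration
for n in (1, 3, 5, 7):
    print(f"{n} physical qubits -> logical error rate ~{logical_error_rate(n, p):.1e}")
# The rate falls roughly as p, 3p^2, 10p^3, 35p^4, ... as redundancy grows.
```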
Given the error rate of individual qubits and the accuracy that practical applications demand, an enormous number of physical qubits is required; commonly cited estimates for fault-tolerant machines run into the millions. Huang’s 20-year prediction reflects the scale of that challenge, while Neven’s five-year projection suggests Google sees a much shorter path.
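A rough sizing exercise shows why the totals balloon. The sketch below uses the textbook surface-code scaling for the logical error rate; the physical error rate, the threshold, and the target are all assumed values for illustration:

```python
# Rough sizing sketch using the textbook surface-code scaling
# p_logical ~ (p / p_threshold) ** ((d + 1) / 2), where d is the code
# distance and each logical qubit needs on the order of 2 * d**2
# physical qubits. All three inputs are assumed values.

p_physical = 1e-3    # assumed error rate per physical operation
p_threshold = 1e-2   # assumed error-correction threshold
p_target = 1e-12     # logical error rate needed for long algorithms

d = 3
while (p_physical / p_threshold) ** ((d + 1) / 2) > p_target:
    d += 2           # surface-code distances are odd

per_logical = 2 * d ** 2
print(f"distance {d}: ~{per_logical:,} physical qubits per logical qubit")
# -> distance 23: ~1,058 physical qubits per logical qubit.
# A useful algorithm might need ~1,000 logical qubits, putting the
# total in the millions, consistent with Huang's pessimism.
```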
This difference in projections raises several questions. Does Google possess some undisclosed advantage? Or is this simply competitive posturing? Following Huang’s comments, quantum computing stocks saw a significant drop. Neven’s more optimistic view could be an attempt to revitalize the market.
Regardless of the timeline, Google envisions quantum computing revolutionizing diverse fields, from better batteries for electric vehicles and faster drug discovery to groundbreaking energy solutions. Whether any of that arrives within five years is the open question; time will tell whether Neven’s bold prediction holds.