
The Race to Quantum Supremacy: IBM’s 1,121-Qubit Condor Chip

In late 2023, IBM unveiled its most powerful quantum processor to date: the 1,121-qubit Condor chip. Built using silicon-based superconducting circuits, Condor marks a significant leap in qubit count — nearly tripling the 433-qubit Osprey processor released just a year earlier. This milestone wasn’t reached in isolation. It’s part of IBM’s clearly defined quantum roadmap, which began in 2020 with the 65-qubit Hummingbird chip and aims to deliver a 100,000-qubit system by 2033. The Condor processor, while impressive in scale, highlights a central tension in quantum computing: raw qubit numbers don’t necessarily equate to practical utility. Many of these qubits are still prone to errors, and without sufficient error correction, large-scale quantum advantage remains out of reach.

Alongside Condor, IBM introduced Heron, a 133-qubit processor with a markedly lower error rate. Heron’s design prioritizes quality over quantity. Its two-qubit gate error rate sits at approximately 0.6%, a notable improvement over previous generations. More importantly, Heron features tunable couplers — circuit elements that allow qubits to be selectively connected or isolated. This reduces crosstalk, a major source of noise in superconducting systems, and improves gate fidelity. IBM also debuted a new modular architecture, linking multiple Heron chips via superconducting bridges. This approach moves away from monolithic chips toward a scalable, interconnected model — a necessary evolution as physical limits constrain single-die performance.
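To see why a lower two-qubit error rate matters so much, consider how errors compound: if each gate fails independently with probability p, a circuit of n gates succeeds with probability roughly (1 - p)^n. A minimal sketch of that decay, using illustrative error rates rather than measured IBM figures:

```python
# Rough model of error compounding: a circuit's success probability
# decays as (1 - p)^n for per-gate error p and gate count n.
# The rates below are illustrative, not measured IBM data.

def circuit_success(gate_error: float, num_gates: int) -> float:
    """Probability that no gate fails, assuming independent errors."""
    return (1.0 - gate_error) ** num_gates

# A Heron-class two-qubit error (~0.6%) vs. an older-generation rate (~1.5%)
for p in (0.006, 0.015):
    for n in (100, 500, 1000):
        print(f"p={p:.3%}  gates={n:4d}  success={circuit_success(p, n):.3f}")
```

Even at 0.6%, a thousand-gate circuit almost never finishes cleanly, which is why error correction, not just better fabrication, is the long-term answer.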

Why Qubit Count Alone Misleads

It’s tempting to view IBM’s jump from 433 to 1,121 qubits as a straightforward advancement. But quantum computing isn’t like classical computing, where more transistors reliably deliver better performance. In quantum systems, the value of a qubit depends on coherence time, gate fidelity, connectivity, and error rates. Condor’s 1,121 qubits operate with relatively high error rates, limiting their usefulness for complex algorithms. Without quantum error correction (QEC), most of these qubits can’t maintain stable states long enough to complete meaningful computations.

Practical quantum advantage — the point at which a quantum computer outperforms the best classical supercomputers on real-world problems — requires not just thousands of physical qubits, but millions when accounting for error correction. Current QEC schemes, like the surface code, demand around 1,000 physical qubits to create a single, stable “logical qubit.” By that math, Condor’s entire chip might yield just one or two usable logical qubits. That’s why IBM’s simultaneous focus on Heron’s improved fidelity matters. High-fidelity qubits reduce the overhead needed for error correction, making the path to logical qubits more efficient. Companies like Google and Quantinuum are already demonstrating small logical qubit arrays with lower error rates than their physical components — a sign that the field is inching toward scalable QEC. IBM’s strategy now hinges on balancing scale and quality, knowing that neither alone will deliver practical quantum computers.
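The overhead arithmetic above can be sketched directly. The constants below (an error threshold near 1%, roughly 2d² physical qubits per distance-d surface-code patch, and the standard logical-error scaling) are textbook approximations, not IBM's published parameters:

```python
# Back-of-envelope surface-code arithmetic. The constants (threshold
# ~1%, ~2*d^2 physical qubits per logical qubit) are common textbook
# approximations, not IBM specifications.

def physical_per_logical(d: int) -> int:
    """Data plus ancilla qubits in one distance-d surface-code patch."""
    return 2 * d * d - 1

def logical_error_rate(p_phys: float, d: int, p_th: float = 0.01) -> float:
    """Rough logical error per cycle: ~0.1 * (p/p_th)^((d+1)/2)."""
    return 0.1 * (p_phys / p_th) ** ((d + 1) // 2)

d = 23  # a code distance whose patch needs roughly 1,000 physical qubits
print("physical qubits per logical qubit:", physical_per_logical(d))
print("logical qubits from a 1,121-qubit chip:", 1121 // physical_per_logical(d))

# Lower physical error rates pay off exponentially in logical quality:
for p in (0.006, 0.001):
    print(f"p_phys={p}: logical error at d={d} ~ {logical_error_rate(p, d):.1e}")
```

The last loop shows why Heron-style fidelity gains matter: shrinking the physical error rate improves the logical error rate exponentially, or equivalently lets a smaller, cheaper code distance hit the same target.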

Competing Visions: Trapped Ions, Photonics, and Superconducting Alternatives

IBM isn’t the only player pushing quantum hardware forward. Competing technologies are advancing rapidly, each with distinct trade-offs. Quantinuum, formed from the merger of Honeywell Quantum Solutions and Cambridge Quantum, uses trapped ion technology. Their H2 processor, released in 2023, has only 32 qubits but achieves two-qubit gate fidelities above 99.8% — significantly higher than IBM’s or Google’s superconducting systems. Trapped ions benefit from long coherence times and natural qubit connectivity, but they operate at slower gate speeds and are harder to scale due to the complexity of laser control systems.

Meanwhile, PsiQuantum, a Silicon Valley startup backed by over $700 million in venture funding, is betting on photonic quantum computing. Their approach uses particles of light (photons) as qubits, manipulated through integrated optical circuits. Photonic qubits offer inherent advantages: they operate at room temperature and are less susceptible to electromagnetic interference. PsiQuantum claims it can build a million-qubit machine using semiconductor manufacturing techniques similar to those used for classical chips. However, the company has yet to release a public demonstration of even a small-scale processor, making their timeline uncertain. Amazon’s AWS is also investing heavily, funding research across multiple platforms through its Amazon Braket service, which provides cloud access to quantum hardware from IonQ, Rigetti, and Oxford Quantum Circuits.

Among superconducting rivals, Google remains IBM’s closest competitor. Their 53-qubit Sycamore processor achieved quantum supremacy in 2019 by performing a specific sampling task in 200 seconds — a calculation they estimated would take a classical supercomputer 10,000 years. Since then, Google has focused on error correction and modular designs, much like IBM. In 2023, they demonstrated a logical qubit with an error rate lower than that of its constituent physical qubits, a key milestone. Rigetti Computing, another U.S.-based firm, is pursuing smaller-scale, customizable quantum chips for niche applications in defense and finance, but their largest system, the 80-qubit Aspen-M-3, lags behind IBM and Google in scale. The global race isn’t limited to the U.S.; China has invested heavily in quantum research, with teams at the University of Science and Technology of China (USTC) demonstrating photonic quantum advantage with their Jiuzhang machines and advancing superconducting systems like Zuchongzhi-2, a 66-qubit processor.

The Infrastructure and Cooling Challenge

Quantum processors don’t operate on desktops. They require extreme environments to function. Superconducting qubits, like those in IBM’s Condor and Heron, must be cooled to temperatures near absolute zero — typically around 15 millikelvin, colder than deep space. This is achieved using dilution refrigerators, complex cryogenic systems that can cost over $500,000 and require specialized facilities. As qubit counts increase, so do the demands on cooling, wiring, and signal control. Each qubit needs multiple control lines for microwave pulses and readout, creating a “wiring bottleneck” as chips scale.
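A rough sense of the wiring bottleneck: with a dedicated drive line and flux line per qubit, plus frequency-multiplexed readout shared across groups of qubits, the cable count grows linearly with qubit number. The per-qubit line counts below are illustrative assumptions, not any vendor's actual layout:

```python
# Rough tally of coaxial lines entering the refrigerator. Per-qubit
# line counts vary by design; these numbers are illustrative
# assumptions, not IBM specifications.

def control_lines(num_qubits: int, drive: int = 1, flux: int = 1,
                  readout_mux: int = 10) -> int:
    """One microwave drive line and one flux-bias line per qubit, plus
    readout lines shared across `readout_mux` qubits via frequency
    multiplexing."""
    readout = -(-num_qubits // readout_mux)  # ceiling division
    return num_qubits * (drive + flux) + readout

for n in (133, 1121, 10_000):
    print(f"{n:6d} qubits -> roughly {control_lines(n):6d} lines")
```

Every one of those lines carries heat into the coldest stage, which is why multiplexing and cryogenic control electronics become unavoidable at scale.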

IBM is addressing this with its Quantum System Two architecture, introduced in 2023. This modular design stacks multiple processors inside a single refrigerator, connected via superconducting bridges that allow quantum states to be transferred between chips. This approach reduces the need for external cabling and improves qubit connectivity. Other companies are exploring alternatives. Google uses similar multi-chip modules in its Sycamore systems, while Rigetti integrates control electronics closer to the quantum chip to reduce noise. Oxford Quantum Circuits in the UK has developed a 3D cavity design that improves qubit isolation, allowing for higher coherence times within compact refrigeration units.

But scaling beyond a few thousand qubits will require breakthroughs in cryogenics and packaging. Current dilution refrigerators max out at around 10,000 qubits due to heat load and physical space constraints. IBM and Microsoft are exploring cryo-CMOS control chips that operate at higher temperatures (around 4 kelvin) to reduce the number of wires penetrating the coldest stages, while IBM’s Goldeneye project, an experimental “super-fridge,” tests much larger cryogenic volumes. These cryo-CMOS controllers could drastically simplify system architecture. In parallel, companies like Bluefors and Oxford Instruments are developing larger, more powerful refrigerators capable of housing multiple quantum processors. Without these infrastructure advances, even the most advanced chips will remain laboratory curiosities.

The Bigger Picture: Where Practical Applications Stand

Despite rapid hardware progress, practical quantum applications remain limited. No quantum computer today can crack modern encryption, simulate large molecules for drug discovery, or optimize complex supply chains at scale. Most current use cases are hybrid: quantum processors assist classical computers in specific subroutines. For example, IBM has partnered with Mercedes-Benz to explore quantum algorithms for battery material simulation, focusing on lithium-sulfur chemistry. While these efforts haven’t yet produced commercially viable results, they help refine software tools and identify where quantum could eventually add value.
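The hybrid pattern these subroutines follow can be illustrated with a toy variational loop in the VQE style often used for chemistry problems: a classical optimizer tunes the parameters of a quantum circuit to minimize an energy. Here the "quantum" expectation value is simulated classically for a single-qubit Hamiltonian H = Z; this is a pedagogical sketch, not IBM's or Mercedes-Benz's actual workflow:

```python
import math

# Toy variational loop in the VQE style: a classical optimizer tunes the
# parameter of a (here, classically simulated) quantum circuit to
# minimize an energy. Single qubit, H = Z, |psi(theta)> = Ry(theta)|0>,
# so <Z> = cos(theta), minimized at theta = pi with energy -1.

def energy(theta: float) -> float:
    """Simulated measurement of <psi(theta)| Z |psi(theta)>."""
    return math.cos(theta)

def minimize(theta: float = 0.3, lr: float = 0.2, steps: int = 200) -> float:
    """Gradient descent via the parameter-shift rule, which on real
    hardware needs only two extra circuit evaluations per parameter."""
    for _ in range(steps):
        grad = 0.5 * (energy(theta + math.pi / 2) - energy(theta - math.pi / 2))
        theta -= lr * grad
    return theta

theta = minimize()
print(f"theta = {theta:.3f}, energy = {energy(theta):.4f}")  # approaches pi, -1
```

On real hardware, `energy` would be a noisy circuit execution rather than a cosine, and the classical computer does everything else, which is why today's devices can contribute even with few, imperfect qubits.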

Industries like finance and logistics are running small-scale experiments. JPMorgan Chase and Goldman Sachs have tested quantum algorithms for portfolio optimization and option pricing, though classical methods still outperform. In aerospace, Airbus and Boeing are investigating quantum computing for aerodynamic simulation and composite material design. These partnerships are less about immediate returns and more about building expertise. The real economic impact of quantum computing likely won’t arrive before 2030, according to estimates from McKinsey and the Boston Consulting Group. Even then, early applications will probably be narrow: specialized simulations in chemistry, quantum-aware cryptography, and optimization problems with constrained variables.

Meanwhile, national security agencies are preparing for a post-quantum future. The U.S. National Institute of Standards and Technology (NIST) has finalized its first set of post-quantum cryptography (PQC) standards, selecting algorithms like CRYSTALS-Kyber (standardized as ML-KEM) for key encapsulation and CRYSTALS-Dilithium (standardized as ML-DSA) for digital signatures. These are designed to resist attacks from future quantum computers capable of running Shor’s algorithm. Companies across sectors are beginning to inventory their cryptographic systems and plan for PQC migration, a process that could take a decade. IBM has already integrated PQC into its Z mainframe systems and is working with clients to audit their encryption protocols. The urgency isn’t about today’s quantum machines — it’s about preparing for the day when they can break current standards.
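The threat from Shor's algorithm comes down to one number-theoretic step: finding the multiplicative order of a random base modulo n, from which factors of n follow. The sketch below brute-forces that order classically for a toy modulus; it is exactly this step that a large fault-tolerant quantum computer would accelerate exponentially:

```python
import math

# The number theory behind Shor's algorithm, with the quantum step
# (order finding) brute-forced classically. At RSA key sizes this
# brute force is infeasible; Shor's quantum period finding is not.

def find_order(a: int, n: int) -> int:
    """Smallest r > 0 with a**r == 1 (mod n): the 'quantum' step."""
    x, r = a % n, 1
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n: int, a: int):
    """Try to split composite n using base a; some bases fail, and a
    real run would simply retry with a fresh random a."""
    g = math.gcd(a, n)
    if g != 1:
        return (g, n // g)           # lucky: a shares a factor with n
    r = find_order(a, n)
    if r % 2:
        return None                  # odd order: retry with another a
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None                  # trivial square root: retry
    return (math.gcd(y - 1, n), math.gcd(y + 1, n))

print(shor_factor(15, 7))  # splits 15 into 3 and 5
```

Everything here except `find_order` is cheap classically; the PQC algorithms NIST selected are built on problems with no known reduction to period finding.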

Quantum computing remains a long-term endeavor. IBM’s Condor chip is a milestone in scale, but the road to utility is paved with challenges in error correction, infrastructure, and software. The companies leading today are not just building faster processors — they’re constructing entire ecosystems, from cryogenics to cloud platforms. The winner of the quantum race won’t be the first to a million qubits, but the first to deliver a reliable, error-corrected system that solves a problem classical computers cannot. That moment may still be years away, but the foundation is being laid, one qubit at a time.

About AI Post Daily

Independent coverage of artificial intelligence, machine learning, cybersecurity, and the technology shaping our future.
