KINJA
[Image: Interior of a quantum computer with golden processors]
Science & Space

Quantum Computing, Explained for People Who Stopped Reading at "Qubit"

Google's quantum chip just solved a problem 13,000 times faster than the world's most powerful supercomputer. Microsoft built a processor using a particle that may not even exist. IBM says practical quantum advantage arrives this year. Here's what any of that actually means.

Alex Chen · 8 min read

Quantum computing has been "five years away" for roughly twenty years, which is the kind of track record that earns permanent residency in the "I'll believe it when I see it" category. But something shifted between 2024 and 2026. Google's Willow chip solved a physics simulation 13,000 times faster than the Frontier supercomputer, the fastest classical machine on Earth. Not a synthetic benchmark designed to make quantum hardware look good, but a verifiable algorithm whose results could be independently confirmed. Microsoft unveiled a processor built on an entirely new type of qubit that, if it works as claimed, could make today's quantum machines look like prototypes. And venture capitalists poured nearly $4 billion into quantum startups in the first three quarters of 2025, almost triple the previous year's total.

The technology is no longer theoretical. But understanding what quantum computers actually do, why they're different from regular computers, and what they can and can't solve requires starting from the basics. Here's an honest attempt at explaining it without requiring a physics degree.

Regular computers and their fundamental limitation

A regular computer (your laptop, your phone, a data center server) stores and processes information using bits. Each bit is either a 0 or a 1. Every file, image, program, and calculation is ultimately a string of zeros and ones. The computer manipulates these bits through logic gates (AND, OR, NOT) at extraordinary speed.

This system works brilliantly for almost everything: running apps, streaming video, training AI models, crunching spreadsheets. But it hits a wall with a specific class of problems where the number of possible solutions grows exponentially. Consider finding the best route through 20 cities: there are roughly 2.4 quintillion possible orderings, and a classical computer has to evaluate them one at a time (or in clever batches, but still fundamentally in sequence). Add five more cities and the number of routes multiplies by more than six million. This explosive scaling is the brick wall.
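The arithmetic behind that explosion is easy to check. A minimal sketch (counting each possible ordering of the cities as a distinct route):

```python
from math import factorial

# Number of distinct orderings (routes) when visiting n cities:
# the first stop can be any of n, the next any of n-1, and so on.
def route_count(n: int) -> int:
    return factorial(n)

print(f"{route_count(20):.2e}")            # 2.43e+18, about 2.4 quintillion
print(route_count(25) // route_count(20))  # 6375600: five more cities, ~6.4 million times more routes
```

Real solvers prune most of those routes rather than enumerating them, but the underlying growth is what makes the problem hard.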

Drug discovery hits this wall when simulating molecular interactions. Cryptography relies on it (RSA encryption works because factoring enormous numbers takes classical computers an impractically long time). Materials science, logistics optimization, financial modeling: all fields where the most interesting problems involve searching through possibilities that grow exponentially.

What makes quantum computers different (the actual physics, simplified)

A quantum computer replaces bits with qubits. A qubit can be 0, 1, or both simultaneously through a property called superposition. If that sounds impossible, you're in good company; it contradicts everyday intuition because quantum mechanics describes behavior at the subatomic scale, where particles genuinely exist in multiple states until measured.

Imagine flipping a coin. A classical bit is the coin after it lands: heads or tails. A qubit is the coin while it's still spinning in the air, simultaneously representing both outcomes. The moment you measure it (catch the coin), it collapses to one state. But while it's spinning, it's both.

The second quantum property that matters is entanglement. When two qubits are entangled, the state of one correlates with the state of the other, regardless of distance: measuring one qubit immediately tells you the state of its entangled partner (though this can't be used to send information faster than light). Entanglement lets quantum computers process correlated information across many qubits simultaneously.

With 3 classical bits, you can represent one of 8 states (000, 001, 010, 011, 100, 101, 110, 111). With 3 qubits in superposition, you can encode all 8 states at once. With 300 qubits, you can encode more states simultaneously than there are atoms in the observable universe. The catch is that measuring the system yields just one of those states, so quantum computers don't simply try every answer in parallel. The power comes from quantum interference: choreographing the computation so that paths leading to wrong answers cancel out and paths leading to correct answers are amplified.
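Those state counts are simple to verify (using the standard order-of-magnitude estimate of 10^80 atoms in the observable universe):

```python
# A register of n classical bits holds exactly one of 2**n states at a time;
# n qubits in superposition are described by amplitudes over all 2**n states.
def state_count(n_qubits: int) -> int:
    return 2 ** n_qubits

ATOMS_IN_OBSERVABLE_UNIVERSE = 10 ** 80  # standard order-of-magnitude estimate

print(state_count(3))                                   # 8, matching the list above
print(state_count(300) > ATOMS_IN_OBSERVABLE_UNIVERSE)  # True
```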

The error problem that defined the last 30 years

Here's the catch that kept quantum computing in the lab for decades: qubits are absurdly fragile. Any interaction with the environment (heat, electromagnetic radiation, vibration, a stray photon) causes a qubit to lose its quantum state in a process called decoherence. Most quantum computers operate near absolute zero (about -460 degrees Fahrenheit) because even room temperature is too "noisy" for qubits to maintain coherence.

This fragility means qubits make errors. Lots of them. A single error in a quantum calculation can corrupt the entire result. For 30 years, the central challenge was quantum error correction: finding ways to detect and fix errors without collapsing the quantum states you're trying to protect (which is trickier than it sounds, because measuring a qubit destroys the superposition you need).

The standard approach uses many "physical" qubits to create one reliable "logical" qubit. Current estimates suggest roughly 1,000 physical qubits per error-corrected logical qubit using standard techniques. That means a quantum computer with 1,000 logical qubits (enough for many practical applications) would need about a million physical qubits. Today's largest machines have on the order of a thousand physical qubits, depending on the platform. The math didn't work.

What changed in 2024-2026

Three breakthroughs shifted the trajectory.

Google's Willow chip achieved "below threshold" error correction. This is the milestone the field has chased for nearly 30 years. "Below threshold" means that adding more physical qubits actually reduces the error rate rather than amplifying it. Previous quantum processors got noisier as they got bigger; Willow got more reliable. Google then used Willow to run the Quantum Echoes algorithm, simulating molecular structures 13,000 times faster than the Frontier supercomputer. This was the first time a quantum computer achieved verifiable quantum advantage on a real algorithm (not just a synthetic benchmark), and the results were published in Nature.
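Willow uses a quantum error-correcting code (far subtler than anything classical), but the "below threshold" idea can be illustrated with a classical analogy of my own: a repetition code with majority voting. When the per-copy error rate is low enough, adding more copies drives the combined error rate down instead of up:

```python
from math import comb

# Classical analogy for "below threshold": store one bit as n copies and
# take a majority vote. The vote fails only if more than half the copies flip.
def logical_error_rate(p: float, n: int) -> float:
    """Probability that a majority of n copies flip, given per-copy flip rate p."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

p = 0.01  # a 1% per-copy error rate, safely below this code's threshold
for n in (1, 3, 5, 7):
    print(n, logical_error_rate(p, n))  # shrinks as redundancy grows
```

With a noisy enough physical bit (p above 50% here), more copies would make things worse; "below threshold" is the regime where redundancy helps. Quantum codes face the extra constraint that you can't copy or directly measure the protected state, which is why reaching this regime took decades.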

Microsoft unveiled Majorana 1, a processor using topological qubits. Topological qubits store information in the overall pattern of a quantum system rather than in individual particles, making them inherently more resistant to errors. If topological qubits work at scale (a significant "if"), they could reduce the physical-to-logical qubit ratio by a factor of ten, making million-qubit systems feasible far sooner. Microsoft claims this could enable practical quantum computers "in years, not decades."

IBM announced a clear path to quantum advantage by late 2026. IBM's 120-qubit Nighthawk processor achieved a 10x improvement in quantum error correction, one year ahead of their public roadmap. Their upcoming Kookaburra processor will be the first to integrate logical qubit processing with quantum memory, another essential building block for practical computation. IBM's roadmap targets a 200-logical-qubit system called Starling by 2028.

The industry also saw Oxford Ionics achieve 99.99% fidelity for two-qubit gates (essentially, 99.99% accuracy per operation) and a Harvard-led team successfully run algorithms on 96 logical qubits, the largest such demonstration to date.
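To see why that last digit of fidelity matters so much, note that gate errors compound multiplicatively over a circuit. A rough back-of-the-envelope sketch (treating each gate as an independent coin flip, which real hardware only approximates):

```python
# Probability that an n-gate circuit runs with no errors, assuming each gate
# independently succeeds with the given fidelity.
def circuit_success(fidelity: float, n_gates: int) -> float:
    return fidelity ** n_gates

print(circuit_success(0.999, 10_000))   # ~0.000045: 99.9% fidelity almost always fails
print(circuit_success(0.9999, 10_000))  # ~0.37: 99.99% succeeds about a third of the time
```

One extra nine turns a 10,000-gate circuit from hopeless into merely unreliable, which is why gate fidelity records are front-page news in this field.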

What quantum computers will actually be used for (and what they won't)

Quantum computers are not faster general-purpose machines. They won't make your web browser load faster, render video games better, or run Excel more quickly. They're purpose-built for specific types of problems where quantum algorithms provide an exponential advantage over classical approaches.

Drug discovery and materials science: Simulating molecular interactions is the most promising near-term application. Understanding how a drug molecule binds to a protein receptor requires modeling quantum mechanical interactions between atoms, which is something classical computers can only approximate for small molecules. Quantum computers can model these interactions natively. Google's Quantum Echoes experiment already demonstrated this by simulating molecular structures that matched results from physical laboratory measurements.

Cryptography (and the threat of "Q-Day"): Quantum computers running Shor's algorithm could factor the large numbers that underpin RSA and ECC encryption, the systems protecting everything from your bank account to military communications. "Q-Day," the day a quantum computer can break current encryption, is widely estimated at 2030-2035. Governments and corporations are already transitioning to "post-quantum" cryptography standards published by NIST in 2024.
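A toy illustration of why factoring protects RSA against classical machines (this is the naive trial-division attack, not Shor's algorithm): the work grows with the square root of the number, so each extra pair of digits multiplies the effort by ten.

```python
# Naive classical factoring: test divisors up to sqrt(n).
def trial_division(n: int) -> int:
    """Return the smallest prime factor of n (or n itself if prime)."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return n

# A 13-digit semiprime cracks in a blink; a 617-digit RSA-2048 modulus would
# take this approach longer than the age of the universe.
print(trial_division(1000003 * 999983))  # 999983
```

Better classical algorithms exist than trial division, but all known ones still scale badly enough that 2048-bit keys are out of reach. Shor's algorithm changes that scaling fundamentally, which is the whole Q-Day concern.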

Optimization: Finding the best solution among astronomical numbers of possibilities (supply chain routing, portfolio optimization, airline scheduling) is a natural fit for quantum advantage. These applications are likely among the first to reach commercial deployment.

What they won't do: Replace classical computers for everyday tasks. Run AI models (ChatGPT and similar systems run on classical GPU clusters, not quantum processors). Solve every hard problem (many NP-hard problems don't benefit from quantum algorithms). Become consumer devices (they operate near absolute zero and require specialized infrastructure).

The three hardware bets competing to win

Not all qubits are built the same, and the major quantum computing companies are pursuing fundamentally different physical approaches. Understanding the distinction matters because it determines when (and whether) each company's technology reaches practical scale.

Superconducting qubits (Google, IBM): These use tiny loops of superconducting wire cooled to near absolute zero. They're the most mature technology with the most demonstrated results, but they require massive cooling infrastructure (dilution refrigerators the size of a room) and have relatively short coherence times (roughly 100 microseconds for the best systems). Google's Willow and IBM's Nighthawk are both superconducting. The advantage: well-understood physics and rapid iteration. The disadvantage: each qubit needs to be individually fabricated and calibrated, and the cooling requirements make scaling to millions of qubits an engineering nightmare.

Trapped ion qubits (IonQ, Quantinuum): These use individual atoms suspended in electromagnetic fields. They offer longer coherence times and higher gate fidelity (Oxford Ionics hit 99.99% accuracy for two-qubit operations) than superconducting qubits. But they're slower per operation and harder to scale because physically trapping and controlling individual atoms gets increasingly complex. IonQ is pushing toward 256 qubits in 2026 using barium atoms with microwave control that could make the technology more manufacturable.

Topological qubits (Microsoft): The wildcard. Topological qubits would store information in the collective behavior of exotic quantum states called Majorana fermions, making them inherently error-resistant. After years of false starts and a retracted Nature paper in 2021, Microsoft announced in 2023 that they'd finally created the prerequisite quantum states, and in 2025 unveiled their Majorana 1 processor. If topological qubits scale as promised, they could leapfrog both superconducting and trapped ion approaches. That's a significant "if," and the physics community remains cautiously optimistic rather than fully convinced.

Neutral atom qubits (Pasqal, Atom Computing) represent a fourth approach gaining momentum. Atom Computing demonstrated a 1,225-qubit system in 2024, and Pasqal plans a 10,000-qubit system by 2026. Neutral atoms can be arranged in flexible 2D and 3D arrays using optical tweezers, which makes scaling more straightforward than individually wiring superconducting circuits. Harvard's 96-logical-qubit demonstration used neutral atoms, suggesting this approach may mature faster than expected.

The honest assessment: nobody knows which approach wins. It's possible that different hardware platforms dominate different applications (superconducting for speed-critical computations, trapped ions for precision, neutral atoms for scale). The quantum computing industry attracted roughly $17.3 billion in cumulative investment by 2026, up from $2.1 billion in 2022. That level of capital is betting on multiple horses, not a single winner.

When this matters to you (the honest timeline)

The industry converges on a rough consensus: fault-tolerant quantum computers capable of solving commercially relevant problems arrive between 2029 and 2035. Most roadmaps from Google, IBM, and Microsoft target useful systems by 2029-2030, with broad commercial impact following in the early 2030s.

Today, quantum computers are accessible through cloud platforms (IBM Quantum, Amazon Braket, Google's quantum cloud, Microsoft Azure Quantum). Researchers and businesses can run experiments on real quantum hardware remotely. But the current generation is in what physicists call the NISQ era (Noisy Intermediate-Scale Quantum): useful for research, not yet reliable enough for production workloads.

The practical implications for most people boil down to two things. First, if your job involves cybersecurity, start paying attention to post-quantum cryptography now; the transition will take years and the deadline is approaching. Second, if you work in pharmaceuticals, materials science, logistics, or finance, quantum computing will eventually change what's possible in your field, and the companies investing in quantum readiness today will have an advantage when the technology matures.

For everyone else, quantum computing in 2026 is the most important technology you don't need to worry about yet, but probably should understand. The coin is still spinning. It hasn't landed. But for the first time, the physics says it's going to.

Written by Alex Chen

Technology journalist who has spent over a decade covering AI, cybersecurity, and software development. Former contributor to major tech publications. Writes about the tools, systems, and policies shaping the technology landscape, from machine learning breakthroughs to defense applications of emerging tech.
