For years, quantum computing has promised to revolutionise fields from drug and materials discovery to AI, finance and cybersecurity. But turning those ambitions into reality first requires solving a fundamental problem. Qubits, the basic units of quantum computation, are inherently unstable, losing the information they store at the slightest hint of heat, vibration or electromagnetic interference.

That fragility has made quantum error correction (QEC) the field’s defining challenge. QEC stabilises quantum information by linking many physical qubits so that together they represent a single logical qubit that can survive random faults. If one of the linked qubits suffers an error, the system detects and corrects it. But those constant checks add complexity and slow performance.
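The underlying idea can be shown with a classical toy. The sketch below is illustrative only: it encodes one logical bit in three physical bits and uses parity checks to find and fix a single flip, whereas real quantum codes must also handle phase errors and cannot read qubits directly without disturbing them.

```python
# A minimal classical sketch of the detect-and-correct loop described
# above: one logical bit encoded in three physical bits, with parity
# checks locating a single flip. Real quantum codes must also handle
# phase errors, which this toy ignores.
import random

def encode(bit: int) -> list[int]:
    return [bit, bit, bit]            # one logical bit -> three physical bits

def add_noise(block: list[int], p: float = 0.1) -> list[int]:
    return [b ^ (random.random() < p) for b in block]   # independent flips

def decode(block: list[int]) -> int:
    # Parity checks (the "constant checks" in the text) pinpoint a single
    # flipped bit without reading the logical value directly.
    s1, s2 = block[0] ^ block[1], block[1] ^ block[2]
    if s1 and not s2:
        block[0] ^= 1
    elif s1 and s2:
        block[1] ^= 1
    elif s2:
        block[2] ^= 1
    return block[0]

print(decode(add_noise(encode(1))))   # usually recovers 1 despite the noise
```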

That’s beginning to change. A series of announcements shows QEC shifting from abstract theory to practical engineering, collectively signalling steady progress towards quantum computers capable of running useful tasks.

Algorithmic fault tolerance accelerates error correction

Boston-based QuEra and collaborators at Harvard and Yale have developed a new framework for quantum error correction called Algorithmic Fault Tolerance (AFT), unveiled in September. The approach tackles one of QEC’s biggest performance bottlenecks: the repeated checks needed to preserve a logical qubit’s integrity.

Traditional error-correction codes become more robust by increasing their code distance – a measure of how many physical errors they can tolerate – but each step up in distance multiplies the number of correction cycles required.
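To see why, consider the standard textbook scaling (illustrative only, not QuEra’s AFT figures): a distance-d code corrects up to (d - 1)/2 physical errors, rounded down, but a conventional fault-tolerance schedule repeats syndrome extraction for roughly d rounds per logical operation, so total runtime climbs with distance.

```python
# Illustrative textbook scaling, not QuEra's AFT figures: a distance-d
# code corrects up to (d - 1) // 2 physical errors, and a conventional
# fault-tolerance schedule repeats syndrome extraction for roughly d
# rounds per logical operation, so runtime climbs with distance.
def correctable_errors(d: int) -> int:
    return (d - 1) // 2

def syndrome_rounds(d: int, logical_ops: int) -> int:
    # Standard schedule: ~d rounds of checks per logical operation.
    return d * logical_ops

for d in (3, 5, 7, 9):
    print(d, correctable_errors(d), syndrome_rounds(d, logical_ops=1_000))
```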

“Because of the unique properties of neutral atoms, we can make substantially fewer rounds of error correction,” says Yuval Boger, QuEra’s chief commercial officer. “That reduction in time can turn a calculation that used to take a year into one that takes five days.”

AFT combines two techniques: transversal operations, which apply a single control pulse in parallel across matched sets of qubits so that errors cannot spread within a code block, and correlated decoding, which interprets patterns of related faults rather than treating them as isolated events.

“If you understand the correlation between errors, you can make smarter decoding,” Boger says. “Together, these enable large reductions in runtime without sacrificing accuracy.”
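A toy version of that combination, echoing the three-bit repetition code sketched earlier (and emphatically not QuEra’s actual decoder): a transversal CNOT acts bitwise between two code blocks, so a fault on the control block before the gate reappears at the matching position on the target block, and a decoder that knows this can attribute both syndromes to a single underlying fault.

```python
# A toy of correlated decoding, not QuEra's actual decoder: a transversal
# CNOT between two three-bit repetition blocks acts bitwise, so a flip on
# control bit i before the gate reappears on target bit i after it.
def transversal_cnot(control: list[int], target: list[int]):
    return control, [t ^ c for c, t in zip(control, target)]

def syndrome(block: list[int]) -> tuple[int, int]:
    return (block[0] ^ block[1], block[1] ^ block[2])

control, target = [0, 0, 0], [0, 0, 0]
control[1] ^= 1                                  # one fault before the gate
control, target = transversal_cnot(control, target)
print(syndrome(control), syndrome(target))       # (1, 1) (1, 1): same pattern
# An independent decoder would count two separate faults; a correlated
# decoder, seeing matching syndromes, infers a single pre-gate fault.
```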

The method plays to the strengths of neutral-atom quantum computers, where atoms can be rearranged freely and operated on in parallel.

“If I can move the atoms, I can talk directly to the qubit I need,” Boger explains. “If I can’t, I have to pass the message along, and by the time it reaches the end, it’s corrupted.”

He likens the principle to error correction in classical networking. “When you download a file, TCP/IP ensures every packet arrives correctly. Without that, the file would be corrupted,” Boger says. “Quantum error correction does the same for quantum information, and we’re finally learning how to do it fast enough.”
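The classical safeguard Boger describes is easy to demonstrate. The sketch below uses a CRC-32 checksum – the Ethernet-style integrity check, rather than TCP’s own 16-bit checksum – to spot a single flipped bit in a payload:

```python
# The classical safeguard in miniature (illustrative; Ethernet-style
# CRC-32 rather than TCP's own 16-bit checksum): detect a corrupted
# payload so it can be retransmitted.
import zlib

payload = b"quantum error correction"
checksum = zlib.crc32(payload)

corrupted = bytes([payload[0] ^ 1]) + payload[1:]   # flip one bit
print(zlib.crc32(payload) == checksum)              # True: intact
print(zlib.crc32(corrupted) == checksum)            # False: corruption caught
```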

QuEra’s Aquila magneto-optical trap. (Photo: QuEra)

Logical qubits show early hardware gains

While QuEra is focused on speeding up the maths behind error correction, other researchers are tackling the challenge in hardware. Infleqtion’s recent demonstration of logical qubits aims to prove that fault tolerance can be achieved on real devices, not just in simulation.

Infleqtion has taken a significant step towards practical error correction by showing that logical qubits can now outperform the physical qubits they’re built from. The company says this is the first time the advantage has been demonstrated on its hardware, marking a critical milestone on the road to fault-tolerant quantum computing.

“Over the last few years, we’ve had relatively good physical qubits,” says Professor Fred Chong, Infleqtion’s chief scientist for quantum software. “But what we need is higher accuracy, and that means moving computation from the physical realm to the logical realm.”

Infleqtion’s system stores quantum information in caesium atoms suspended in laser light; the atoms themselves can be repositioned with remarkable precision. That reconfigurability, paired with the company’s Superstaq software layer, lets engineers create and test the entanglement patterns needed for scalable error correction.

By layering error-corrected qubits onto this system, Infleqtion has demonstrated error rates roughly half those of unprotected qubits. Company researchers have even run a version of Shor’s algorithm on error-protected hardware, which Infleqtion describes as a world first and a clear signal that fault tolerance is moving from concept to practice.

The company plans to grow from dozens to hundreds of error-protected qubits by 2030, a step it says will bring utility-scale quantum machines within reach. Its latest results also point to a shift in tone across the field; error correction is no longer just theoretical but is now being tested on today’s machines.

Building tools and talent for quantum error correction

Infleqtion’s progress shows how far the hardware has come. The next challenge is giving engineers the tools and skills they need to work with error correction at scale. Even small logical systems will generate large volumes of error data that must be decoded and acted on in real time.

That’s where Cambridge-based Riverlane comes in. Its open-source Deltakit platform is designed to bridge that gap by accelerating development and standardising approaches across the industry.

“Quantum error correction is critical. We simply won’t get to useful quantum computing without it,” says Abe Asfaw, Riverlane’s head of QEC enablement. “The best qubits today still have error rates of one in 1,000 to one in 10,000. Useful applications, like new material design or cryptanalysis, require billions of error-free operations. Quantum error correction is the only way to bridge that gap.”
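The arithmetic behind that gap is stark. Under the simplifying assumption of independent errors at the quoted physical rate, the odds of a billion operations all succeeding without correction are effectively zero:

```python
# Back-of-envelope version of Asfaw's point, assuming independent errors
# at the quoted physical rate (an illustrative simplification).
import math

p = 1e-3            # "one in 1,000" physical error rate
n_ops = 1e9         # "billions of error-free operations"
log_survive = n_ops * math.log1p(-p)    # log of (1 - p) ** n_ops
print(log_survive)  # about -1.0e6: survival odds of e**-1,000,000,
                    # effectively zero without error correction
```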

Deltakit lets researchers simulate QEC behaviour and test different approaches rapidly, without committing to full hardware integration upfront. For companies that move to real-time decoding, Riverlane offers Deltaflow, a decoder designed to keep pace with real-time qubit operations.
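Deltakit’s own API isn’t shown here, but the kind of experiment such platforms automate can be sketched with the open-source Stim simulator: generate a noisy error-correction memory circuit and sample how often its parity checks fire.

```python
# Deltakit's own API isn't shown in this article; as a stand-in, here is
# the kind of experiment such platforms automate, sketched with the
# open-source Stim simulator: build a noisy distance-3 repetition-code
# memory circuit and sample how often its parity checks fire.
import stim

circuit = stim.Circuit.generated(
    "repetition_code:memory",
    distance=3,
    rounds=5,
    before_round_data_depolarization=0.01,
)
detection_events = circuit.compile_detector_sampler().sample(shots=1000)
print(detection_events.mean())   # fraction of checks that flagged an error
```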

“Deltakit is part of the way we enable the broader community to do quantum error correction successfully,” Asfaw says. “It helps partners estimate how the system will behave before they invest in integrating a real-time decoder like our Deltaflow product.”

The platform is hardware-agnostic, abstracting away device-specific control complexity.

“Different qubit platforms have different strengths. Our goal is to make experimenting with QEC as easy as possible, regardless of hardware,” Asfaw explains. “That means intuitive tooling, visualisation and orchestration between decoders, control systems and the processor.”

Riverlane also wants to tackle the QEC talent gap, now considered one of the most significant barriers to scaling quantum systems. The company supplies QEC solutions to around two-thirds of the world’s quantum computing companies and sees training as part of its remit.

“We take upskilling seriously. Deltakit comes with documentation and a textbook so anyone can start experimenting, not just learning theory,” Asfaw says, adding that where others offer individual tools, Riverlane’s roadmap is evolving towards a full solution stack.

“The role of Riverlane is shifting from individual products to quantum error correction solutions,” he says. “We want to accelerate every company on the journey to fault tolerance.”

Tools like Deltakit show that the software and skills ecosystem around error correction is beginning to take shape.

Riverlane’s Deltakit. (Photo: Riverlane)

Engineering challenge offers reality check

The growing momentum behind these breakthroughs is unmistakable, but not everyone is convinced that progress is fast enough to make quantum computing viable in the near term.

Dr Carl Williams, principal at CJW Quantum Consulting and former deputy director at the US National Institute of Standards and Technology, cautions that despite the progress, the field still faces hard limits on resources, verification and scale.

“For all the excitement around faster, more reliable qubits, QEC remains a formidable engineering problem,” he says. “Hardware has finally caught up enough to support real explorations of quantum error correction. But implementing these codes isn’t straightforward. They depend heavily on the architecture. Neutral atoms make it easier, ions a bit less so, and superconducting systems are much harder to wire for the interconnectedness these codes need.”

Decoding errors in real time adds another layer of complexity, Williams notes, as more intricate codes require proportionally greater classical computing power running alongside them. Even if overheads fall, the technical challenge continues to grow. He expects progress to be steady rather than sudden.

“People talk about big breakthroughs coming next year,” says Williams. “I don’t buy that. We’ll see real applications emerging, but only where the hardware supports them. And that’s still the bottleneck.”

Despite his scepticism, he remains cautiously optimistic. “It’s getting easier to see how we’ll reach economic value or cryptanalysis, but not on dramatically shorter timescales,” says Williams. “Quantum error correction is becoming real, but it’s still an engineering problem; one we’ll solve only when hardware, software and algorithms mature together.”

Towards a fault-tolerant quantum reality

After years rooted in theory and simulation, quantum error correction is taking shape across hardware, software and control systems. The latest advances highlight three clear areas of focus: improving algorithmic efficiency, demonstrating fault tolerance in hardware and building the supporting software and skills to make it scalable.

As Williams notes, these breakthroughs remain part of a long engineering journey, one where every advance exposes the next bottleneck. Fully fault-tolerant or cryptographically relevant quantum computers may still be some years away, but the trajectory is clear. Quantum error correction is no longer a distant ambition; it is fast becoming the discipline that will determine when and how quantum computing becomes truly reliable.
