
IonQ plans to launch a rack-mounted quantum computer for data centers in 2023

Frederic Lardinois
Editor
December 9, 2020

Quantum computing company IonQ today laid out an ambitious technical roadmap for the next few years, much as IBM did in September.

During our Disrupt event earlier this year, IonQ's CEO and president, Peter Chapman, said that functional desktop quantum computers could arrive within just five years. That prediction sets the company apart from many of its rivals, which often use different quantum technologies, and IonQ now says it will offer modular, rack-mounted quantum computers for data centers in 2023. The company also expects its systems to reach broad quantum advantage across a range of applications by 2025.

In an interview ahead of today's announcement, Chapman showed off a prototype of the hardware the company is building for 2021, which is small enough to fit on a workbench. The quantum chip itself is currently about the size of a half-dollar, and the company is working to integrate the core of its technology, including all of the necessary optics, onto a single chip.

Image Credits: IonQ

“That is the primary objective,” he stated regarding the chip. “Once 2023 arrives, we can then scale in a different manner, simply instructing a manufacturer in Taiwan to produce 10,000 of these units. Scaling then becomes a matter of manufacturing, as none of the hardware components are inherently quantum,” he explained, although IonQ co-founder and chief scientist Chris Monroe quickly added that this is true “with the exception of the atoms.”

This is a crucial point, as IonQ’s decision to utilize trapped ion quantum computing as the foundation for its machines allows it to avoid the extremely low temperatures required by IBM and other companies to operate their systems. While some experts have questioned the scalability of IonQ’s technology, Chapman and Monroe readily dismiss these concerns. IonQ’s new roadmap projects systems with thousands of algorithmic qubits—which require a factor of 10 to 20 more physical qubits for error correction—by 2028.

“Once we reach approximately 40 algorithmic qubits in early 2024, we expect to see quantum advantage emerge in the field of machine learning,” Chapman elaborated. “And it is generally accepted that 72 qubits is the threshold for achieving quantum advantage more broadly. This would occur in 2025. As we progress into 2027, we anticipate having hundreds, potentially even over 1,000 qubits, by 2028. This will mark the beginning of full-scale fault tolerance.”

The number of algorithmic qubits—defined by IonQ as qubits usable in running a quantum algorithm—will see gradual increases. Other industry participants often refer to “logical qubits,” but IonQ employs a slightly different definition.

When discussing the comparison of different quantum systems, Chapman emphasized that “fidelity alone is insufficient.” He argued that the number of qubits, whether 72 or 72 million, is irrelevant if only a small fraction are functional. “A roadmap promising a vast number of qubits, such as ‘umpteen thousand,’ is of little concern to us,” he said. “Because we utilize individual atoms, I could present a small container of gas and claim to have a trillion qubits ready for computation! However, they would not be particularly useful. Therefore, our roadmap focuses on the number of useful qubits.”

He also contended that quantum volume, a metric favored by IBM and others in the field, is not particularly insightful because the numbers quickly become unwieldy. Even so, IonQ still uses quantum volume under the hood, defining its number of algorithmic qubits as the base-2 logarithm of the quantum volume.

Upon reaching 32 algorithmic qubits, compared to the 22 in its current systems, IonQ expects to achieve a quantum volume of roughly 4.3 billion (2^32), a significant jump from the roughly 4 million (2^22) it claims today.
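For readers who want to check the arithmetic, the relationship IonQ describes is easy to reproduce. The short Python sketch below is a back-of-the-envelope illustration under that log-base-2 definition, not IonQ's own calculator:

    import math

    def quantum_volume(algorithmic_qubits: int) -> int:
        # Under IonQ's definition, algorithmic qubits = log2(quantum volume),
        # so quantum volume is 2 raised to the number of algorithmic qubits.
        return 2 ** algorithmic_qubits

    def algorithmic_qubits(qv: float) -> float:
        # Inverse direction: take the base-2 logarithm of the quantum volume.
        return math.log2(qv)

    print(quantum_volume(22))  # 4,194,304 -- the ~4 million quoted for today's systems
    print(quantum_volume(32))  # 4,294,967,296 -- the ~4.3 billion expected at 32 algorithmic qubits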

As Monroe pointed out, the company’s definition of algorithmic qubits also incorporates variable error correction. While error correction remains a key area of quantum computing research, IonQ believes its high gate fidelity currently mitigates the need for it, and it has already demonstrated fault-tolerant error-corrected operations with a 13:1 overhead.

“Due to our inherently low error rates, we do not require error correction at this time with our 22 algorithmic qubits. However, to achieve 99.99% fidelity, we will incorporate a small amount of error correction—almost as a minor adjustment. The level of error correction is adjustable; it’s not an all-or-nothing proposition,” Monroe explained.

IonQ asserts that “other technologies, due to their lower gate fidelity and qubit connectivity, may require 1,000, 10,000, or even 1,000,000 qubits to create a single error-corrected qubit.”

To illustrate these concepts, IonQ today launched an Algorithmic Qubit Calculator designed to simplify the comparison of different systems.

In the near term, IonQ anticipates a 16:1 overhead for error correction—meaning 16 physical qubits will be used to create a single high-fidelity algorithmic qubit. Once it reaches approximately 1,000 logical qubits, it expects to employ a 32:1 overhead. “As you increase the number of qubits, you must also enhance fidelity,” Chapman explained, meaning IonQ will need to control 32,000 physical qubits for its 1,000-qubit machine in 2028.
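The physical-qubit math behind those figures is equally simple. As a rough sketch that uses the overheads quoted in this article (not any IonQ tooling), the number of physical qubits the company has to control is the algorithmic-qubit count multiplied by the error-correction overhead:

    def physical_qubits(algorithmic_qubits: int, overhead: int) -> int:
        # Each algorithmic qubit is encoded across `overhead` physical qubits.
        return algorithmic_qubits * overhead

    # Hypothetical near-term machine at the quoted 16:1 overhead.
    print(physical_qubits(100, 16))    # 1,600 physical qubits for 100 algorithmic qubits
    # The ~1,000-qubit machine planned for 2028 at a 32:1 overhead.
    print(physical_qubits(1000, 32))   # 32,000 physical qubits, matching the figure above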

IonQ has consistently maintained that scaling its technology does not require any fundamental breakthroughs. The company argues that consolidating much of its technology onto a single chip will make its systems inherently more stable (noise being the main enemy of qubits), in part because the laser beams will have shorter distances to travel.

Chapman, who is never shy about a bit of publicity, even mentioned the company's plan to test the stability of its quantum computer by flying it in a small aircraft. It is worth noting, however, that IonQ is considerably more optimistic about short-term scaling than its competitors. Monroe acknowledged this, but attributes it to the underlying physics.

“While those working with solid-state platforms are achieving impressive physics,” Monroe said, “they are making incremental progress each year. However, a 10-year roadmap based on a solid-state qubit relies on advancements in materials science. They may succeed, but it’s uncertain. The physics of atoms, however, is well-established, and we are confident in our engineering path forward because it is based on proven protocols and devices.”

“We don’t face a manufacturing challenge. If you require a million qubits? It’s readily achievable,” Chapman concluded.

