
Quantum Utility vs. Quantum Advantage: A New Metric

November 11, 2021

The Evolving Landscape of Quantum Computing Goals

The field of quantum computing is characterized by rapidly shifting objectives as progress is made.

Initially, the focus was on achieving quantum supremacy – proving a quantum computer could perform a calculation beyond the capabilities of any conventional computer, even without a practical application.

The Shift from Supremacy to Advantage

Google claimed to have reached this milestone in a high-profile 2019 publication; IBM, however, publicly disputed the claim.

This pursuit of supremacy was largely a theoretical exercise, lacking immediate real-world implications.

Following Google’s announcement, the industry redirected its efforts toward attaining quantum advantage.

Quantum advantage is defined as demonstrating a tangible benefit – either in a business context or scientific research – by surpassing the computational power of the most powerful supercomputers in a pertinent application.

This metric provided a more practical benchmark for comparison and evaluation than quantum supremacy.

Potential applications driving the pursuit of quantum advantage include substantial advancements in areas like pharmaceutical research, financial modeling, and the creation of improved battery technologies.

Introducing the Concept of Quantum Utility

However, the focus on quantum advantage overlooks a crucial consideration.

Must we wait for massive, million-qubit quantum systems that surpass supercomputers before recognizing the value of quantum computing?

Alternatively, should the emphasis be placed on measuring performance gains relative to the hardware currently utilized in classical computing – specifically, individual CPUs, GPUs, and FPGAs?

A potentially more beneficial aim for this emerging industry is the attainment of quantum utility.

Quantum utility signifies a quantum system outperforming classical processors of comparable size, weight, power consumption, and operating environment.
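One way to make this definition concrete is as a like-for-like benchmark: a quantum processor shows utility if it solves a reference problem faster than a classical processor while staying within a comparable size and power envelope. The sketch below is purely illustrative; the class, field names, tolerances, and all numbers are assumptions, not measurements of any real device.

```python
from dataclasses import dataclass

@dataclass
class ProcessorBenchmark:
    """Benchmark result for one processor on a fixed reference workload."""
    name: str
    volume_litres: float      # physical size of the system
    power_watts: float        # operating power draw
    runtime_seconds: float    # time to solve the reference problem

def has_quantum_utility(quantum: ProcessorBenchmark,
                        classical: ProcessorBenchmark,
                        size_tolerance: float = 1.0,
                        power_tolerance: float = 1.0) -> bool:
    """True if the quantum system beats the classical one on runtime
    while staying within the given size and power envelopes
    (tolerance of 1.0 means "within 2x of the classical figure")."""
    comparable = (quantum.volume_litres <= classical.volume_litres * (1 + size_tolerance)
                  and quantum.power_watts <= classical.power_watts * (1 + power_tolerance))
    faster = quantum.runtime_seconds < classical.runtime_seconds
    return comparable and faster

# Illustrative numbers only.
gpu = ProcessorBenchmark("GPU", volume_litres=2.0, power_watts=300, runtime_seconds=60)
qpu = ProcessorBenchmark("diamond QPU", volume_litres=3.5, power_watts=450, runtime_seconds=20)
print(has_quantum_utility(qpu, gpu))  # True: faster, and within the 2x size/power envelope
```

The point of the comparison is that it is per-processor, not per-data-center: the quantum system is held to the footprint of a single CPU, GPU, or FPGA rather than to a supercomputer.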

Ultimately, achieving quantum utility as quickly as possible may prove to be the most valuable path forward.

Speeding Up the Adoption of Quantum Computing

A thorough examination of quantum computing reveals its potential to fundamentally reshape IT infrastructure, business operations, the global economy, and society as a whole. The advent of quantum supercomputing, characterized by exponential processing speeds, error-corrected qubits, and a functional quantum internet, promises a future drastically different from our present.

However, mirroring the evolution of classical mainframes in the 1960s, quantum mainframes are anticipated to remain substantial and delicate systems for the foreseeable future. Their operation will necessitate extremely low temperatures and intricate control mechanisms.

The quantum computing industry could benefit from mirroring the trajectory of classical computing. The introduction of personal computers in the late 1970s and early 1980s saw companies like IBM consistently releasing updated models with incremental improvements.

A comparable market dynamic is crucial for the scaling and sustained growth of quantum computing. Sustained investment requires demonstrable progress, and simply waiting for quantum computers to surpass existing supercomputers isn't a viable strategy.

Regular releases of enhanced, increasingly “useful” quantum computers will generate the revenue stability needed to fuel the long-term investment essential for realizing the technology’s complete capabilities.

With a continuous stream of practical quantum systems tailored for diverse applications, the need to queue for processing time on limited cloud-based quantum mainframes diminishes. On-site quantum processors, integrated with existing classical infrastructure, become a viable alternative.

Certain applications demand low-latency computation that “quantum in the cloud” cannot reliably deliver, or must run on-premise because cloud connectivity is unavailable.

Consider these potential applications of expanded quantum utility:

  • Real-time signal and image processing within autonomous systems and intelligent technologies, such as robots, self-driving cars, and satellites.
  • Implementation of Industry 4.0 solutions, including digital twins within manufacturing environments.
  • Deployment in distributed network scenarios, like defense applications requiring rapid analysis in battlefield conditions.
  • Integration as performance-enhancing accessories for conventional devices like laptops.

Achieving these “quantum accelerator” applications within the next few years will require room-temperature quantum computing in compact designs. Several research avenues are being explored, with nitrogen vacancies in diamonds emerging as a particularly promising approach for creating stable qubits.

Enabling Technologies for Quantum Computing

Room-temperature diamond quantum computing utilizes an arrangement of processor nodes. Each node incorporates a nitrogen-vacancy (NV) center – a defect within the exceptionally pure diamond lattice – alongside a collection of nuclear spins.

These nuclear spins function as the computer’s qubits, while the NV centers serve as quantum buses. They facilitate operations between the qubits and manage input/output processes.

The ability of diamond quantum computers to operate at room temperature stems from the diamond’s unique properties. Its ultra-hard structure provides a protective environment, extending qubit coherence times to several hundred microseconds.
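A coherence time of several hundred microseconds translates directly into a budget for how deep a quantum circuit can be. A minimal sketch, assuming a simple exponential dephasing model exp(-t/T2) with T2 = 300 µs taken from the "several hundred microseconds" figure above, and an illustrative 1 µs gate time that is an assumption, not a property of any specific device:

```python
import math

T2_SECONDS = 300e-6       # coherence time: "several hundred microseconds" (here 300 us)
GATE_TIME_SECONDS = 1e-6  # illustrative per-gate duration, not a measured figure

def coherence_remaining(t: float, t2: float = T2_SECONDS) -> float:
    """Fraction of qubit coherence left after time t, assuming
    simple exponential dephasing: exp(-t / T2)."""
    return math.exp(-t / t2)

# How many sequential gates fit before coherence drops below 50%?
depth = 0
while coherence_remaining(depth * GATE_TIME_SECONDS) > 0.5:
    depth += 1
print(depth)  # circuit depth budget under these assumptions
```

Under these toy numbers, a few hundred sequential gates fit inside the coherence window, which is why longer coherence times matter as much as raw qubit counts.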

Researchers at the University of Stuttgart in Germany have been instrumental in advancing diamond quantum computing. Their contributions span algorithm development, simulations, error correction, and achieving high-fidelity operations.

However, scaling these systems beyond a limited number of qubits presented a significant obstacle. Challenges related to qubit fabrication yield and precision hindered further progress.

Australian quantum scientists have since developed solutions to address these scaling issues. Their work also focuses on the miniaturization and integration of the electrical, optical, and magnetic control systems essential for diamond quantum computers.

This approach allows for an increase in qubit numbers while simultaneously reducing the size, weight, and power consumption of diamond quantum systems.

The team demonstrated the feasibility of creating compact and resilient quantum accelerators. These accelerators are suitable for mobile applications, including robotics, autonomous systems, and satellites.

Furthermore, they enable massively parallelized applications crucial for simulating molecular dynamics. This has implications for fields like drug design, chemical synthesis, energy storage, and nanotechnology.

Due to the distinct advantages offered by diamond-based computing, a global research initiative is currently underway. Leading institutions, such as the University of Cambridge and Harvard University, are actively involved.

The Australian National University’s research in diamond-based quantum computing has transitioned into an early stage of commercialization.

Alternative room-temperature quantum computing technologies, like trapped-ion and cold-atom systems, are also progressing in relatively small form factors. However, these typically require either vacuum systems or precise laser setups.

One quantum computing company has successfully created a trapped-ion system contained within two server racks. Whether these systems can be further miniaturized remains an open question.

Re-evaluating Foundational Beliefs

For the quantum computing sector to realize its potential of delivering practical quantum advantages, its technological development must align with the established scalability of semiconductor production. This necessitates the creation of qubits through processes that allow for robust, reliable, and long-lasting integration with control mechanisms.

Drawing parallels from the evolution of classical computing, the most effective path forward involves reducing size and creating fully integrated quantum chips. This mirrors the development of integrated circuits in the 1960s, which paved the way for widespread classical computation.

The primary technical hurdle in achieving broad quantum applicability will be the manufacturing of these integrated quantum chips. However, once this fabrication challenge is overcome, the resulting devices should be relatively straightforward to implement and utilize, much like their classical counterparts.

Even with a smaller qubit count compared to large-scale quantum computers, the first commercially available integrated chips are poised to become central to the industry and attract significant market interest.

The potential consequences of this advancement are far-reaching; a quantum system capable of operating at room temperature could fundamentally alter problem-solving across virtually every field. This underscores the urgency for product designers, software engineers, market analysts, and societal observers to familiarize themselves with quantum computing principles.

In the immediate future, practical quantum computers are expected to significantly reshape supply chains and even entire value networks. Successfully navigating this disruption requires a comprehensive understanding of both the technology itself and its broader economic implications, alongside recognizing the substantial investment prospects within this rapidly evolving landscape.

The realization of quantum utility also suggests a future where quantum computing is not monolithic. Quantum accelerators can coexist with larger quantum mainframes, serving distinct purposes and applications. This will foster collaboration rather than solely competition, ultimately accelerating both innovation and the rate of adoption within the quantum industry.

#quantum computing  #quantum advantage  #quantum utility  #quantum supremacy  #quantum algorithms