
Last week I summarized takeaways from my visit
to SXSW earlier this month. As I mentioned in that article, the most hyped
technology at the event was quantum computing. I found this to be quite
curious, especially considering that even the most optimistic fans of quantum
computing believe that practical use is still 3-5 years away, and most
pragmatists think we have at least a decade to go.
I’ve spent some time thinking about this and
thought I’d offer an update on the quantum industry as a whole. Before we begin, it may be useful to give a
bit of background on what quantum computing is, and the hurdles still to be
overcome in maturing the technology.
How do quantum computers work?
Quantum computing is a type of computing
that uses the principles of quantum mechanics to process information. Unlike
classical computers, which use bits (0 or 1), quantum computers use qubits, which can be both 0 and 1 at
the same time due to a trait called superposition.
Quantum computers also use entanglement, a property where qubits
become linked and the state of one affects the other, even at a distance. These
properties allow quantum computers to perform certain calculations much faster
than classical computers by exploring many possible solutions simultaneously.
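To make this concrete, here is a minimal sketch using IBM's open-source
Qiskit library (more on IBM and Qiskit below). It puts one qubit into
superposition, then entangles it with a second, producing a "Bell state";
measuring the pair repeatedly would yield only 00 or 11, never 01 or 10.

# A two-qubit Bell state: superposition plus entanglement in a few lines.
# Assumes Qiskit is installed (pip install qiskit).

from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(2)
qc.h(0)      # Hadamard gate: qubit 0 is now 0 and 1 at once (superposition)
qc.cx(0, 1)  # CNOT gate: qubit 1 is now linked to qubit 0 (entanglement)

# The resulting state is (|00> + |11>)/sqrt(2): measuring either qubit
# instantly determines the other.
print(Statevector.from_instruction(qc))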
Classical computers solve problems deterministically, using the kind of
classical math we learned in, say, matrix algebra. If you want to speed up
the search for a complex answer, or to do many operations at once, as is
common with AI, you need to process in parallel, trying many solutions
simultaneously across more and more machines. At scale, this requires
massively expensive, energy-hungry data centers. Our increasing demand for
compute power drives companies like NVIDIA and AMD to keep increasing the
number of transistors they can fit onto processor chips, and cloud
providers to build planet-scale data centers.
Quantum computing, by comparison, solves problems probabilistically,
leveraging entanglement and superposition to work through many different
calculations at the same time with the same qubits. This is dramatically
lower cost and lower energy at scale. The largest quantum computer today
consumes about 10 kW of power to operate, and at scale a quantum computer
is expected to require a few megawatts. By comparison, Northern Virginia
data centers alone drew more than 2.5 gigawatts in 2023, roughly a
thousand times as much.
One of the fundamental challenges in conducting quantum computing at scale
is that qubits are built from extremely delicate quantum systems
(superconducting circuits, trapped ions and the like) that typically need
to be cooled to near absolute zero in order to hold their quantum states.
And in order to create the mathematical logic used in computing, you need
to maintain many qubits, all interconnected, without disturbance to their
physical state.
Decoherence is the term for when qubits lose their quantum state, which leads to
errors in the quantum calculations. Maintaining coherence of qubits is arguably
the number one challenge associated with scaling quantum computers to the size
needed to run calculations that compete with the big data centers of today.
The primary strategy for correcting decoherence-related errors is to add
redundant qubits, which is costly: each reliable "logical" qubit can
consume many physical qubits. But there are other approaches in research,
as I'll describe below.
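To illustrate the redundant-qubit strategy, here is a deliberately
simplified sketch of the textbook three-qubit bit-flip repetition code in
Qiskit (real schemes such as the surface code are far more involved). The
idea: spread one logical qubit across three physical qubits, then use
parity checks and a majority vote to undo a single error.

# Three-qubit bit-flip repetition code: a toy version of qubit redundancy.
# Assumes Qiskit is installed (pip install qiskit).

from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(3)

qc.h(0)          # put the logical qubit in a superposition worth protecting
qc.cx(0, 1)      # encode: spread the state across three physical qubits,
qc.cx(0, 2)      # so the logical state lives in the |000>/|111> subspace

qc.x(0)          # simulate decoherence flipping one physical qubit

qc.cx(0, 1)      # decode: write the parity checks onto qubits 1 and 2...
qc.cx(0, 2)
qc.ccx(1, 2, 0)  # ...and majority-vote: flip qubit 0 back only when both
                 # parity checks fire

# Qubit 0 ends up back in its original superposition despite the error.
print(Statevector.from_instruction(qc))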
How is industry solving quantum challenges?
While in Austin, I attended a talk by
Arvind Krishna, CEO of IBM. IBM is widely
recognized as the industry leader in quantum development today. IBM's
Qiskit is by far the most utilized tool for programming quantum computers
(available to anyone), and IBM is a leader in making their quantum computers
accessible for use today through a web interface.
Quantum computers today typically have hundreds of qubits, with the first
1,000-qubit machine only recently developed. Krishna believes that to
reach a level of computing power useful for solving complex applications
like pharmaceutical drug discovery or optimized logistics and supply chain
routing, we will need quantum computers with many thousands of qubits. To
get there, we need to improve qubit coherence by roughly 100x, which
should yield a better than 10x improvement in error correction.
The good news is that there have been a
number of big advances and announcements within the last year. This may be part
of why the hype was so high at SXSW.
Here are a few of the more notable:
● Most major tech players are announcing new quantum chips:
○ In February, Microsoft announced Majorana 1, a quantum chip built on
topological superconductors. These materials host exotic quasi-particles,
Majorana fermions, that are their own antiparticles. Topological qubits
are much more stable, so their quantum state is robust against local noise
and disturbances. If the materials behave as Microsoft claims, this may be
a big step forward for quantum coherence.
○ Also in February, Amazon announced Ocelot, a quantum chip designed with
fault tolerance as the core feature of its qubit architecture; Amazon
states a goal of reducing quantum errors by 90% as it scales the chip.
Amazon has also added support for Rigetti's 84-qubit Ankaa-2 quantum
processing unit to Amazon Braket, AWS's quantum computing service, giving
users access to more advanced quantum hardware for experimentation and
application development.
○ Not to be outdone, Google introduced Willow, a 105-qubit superconducting
quantum computing chip, back in December. Willow achieved below-threshold
quantum error correction and demonstrated significant progress in
computational speed.
● D-Wave, one of the more notable quantum startups, claimed earlier this
month to have achieved "quantum supremacy." The company says it used a
quantum computer to solve a complex magnetic materials problem in under 20
minutes, a problem it believes would have taken a classical computer
nearly a million years. While the claim was published in the peer-reviewed
journal Science, I should note that there is considerable debate in the
scientific community about its accuracy. The achievement may be simple
"quantum advantage" (solving a problem better than a traditional computer)
rather than "quantum supremacy," which is solving a problem a traditional
computer would never be capable of solving. Despite the debate, D-Wave's
stock surged on the announcement.
● Many new investments are being made:
○ NVIDIA held its first-ever Quantum Day at the GTC 2025 conference, and
CEO Jensen Huang announced plans to establish a quantum computing research
lab in Boston, collaborating with Harvard and MIT.
○ IBM announced investment in a new quantum research park in Chicago,
backed by $25M in matching funds from the state of Illinois. Illinois
Governor J.B. Pritzker joined Krishna at SXSW to support the announcement,
along with a further state commitment of $500M over the coming decade for
quantum research and industry development. Several universities and the
area's two national laboratories are also involved.
When will quantum be practical?
IBM is bullish on quantum. Krishna believes a major breakthrough may be as
soon as 3 years away, and certainly not more than 5. On the other end of
the spectrum, NVIDIA's CEO has estimated that practical quantum computing
is still 30 years in the future (an interesting stance, given the lab
announcement I just mentioned; I'll hit on that contradiction further
down).
So who is right? Quantum computing is one of those technologies that
always seems to be 20+ years away and never getting closer. But then
again, the same was once said about nanotechnology, and here we are making
semiconductors at single-digit nanometer scale. I think the answer to the
timing question depends on which "quantum" we are speaking about.
There are at least three major types of quantum technology in commercial
use: quantum sensing, quantum security and quantum computing. Quantum
sensing is by far the most mature, with devices already available for the
defense, medical, industrial and energy markets.
● Quantum inertial sensors leverage entanglement, superposition and a
trait called quantum tunneling to provide positioning far more precise
than GPS. These sensors are useful for defense and autonomous vehicle
applications.
● Optically pumped magnetometers are quantum devices for ultra-sensitive
measurement of biomagnetic signals like brain and heart activity.
● Nitrogen-vacancy (NV) center diamond sensors can map magnetic fields at
nanometer resolution for biological and advanced materials applications.
Quantum security and quantum encryption are getting a lot of attention
right now. The industry standard for data encryption worldwide is AES-256,
the Advanced Encryption Standard with a 256-bit key, a NIST standard used
across the banking, healthcare and defense markets. AES is a symmetric
cipher, though: both parties need the same secret key, and the standard
way to exchange that key (and to sign data) is RSA
(Rivest-Shamir-Adleman), a public-key algorithm whose security rests on
the difficulty of factoring enormous numbers, with keys of up to 4,096
bits. Brute-forcing encryption with a 2,048-bit key would take today's
computers billions of years (a nation-state could break a 1,024-bit key in
months to years).
The problem is that quantum computers will be quite well suited to
breaking RSA-style security: Shor's algorithm, run on a sufficiently large
quantum computer, can factor those enormous numbers efficiently, something
no known classical algorithm can do. The prediction that we will reach a
post-quantum world within a few years has many organizations worried about
the theft of protected data. There is evidence that adversaries today are
stealing encrypted information to store until later ("harvest now, decrypt
later"), with hopes of decrypting it once quantum advances sufficiently.
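To see why, here is a toy Python sketch of RSA with deliberately tiny
primes (real keys are 2,048+ bits), showing that whoever factors the
public modulus recovers the private key outright; the factoring step is
exactly what Shor's algorithm accelerates.

# Toy RSA with tiny primes, purely to illustrate why factoring breaks it.
# Real RSA moduli are 2,048+ bits; these numbers are for demonstration only.

p, q = 61, 53            # secret primes (real keys: hundreds of digits each)
n = p * q                # public modulus (3233)
phi = (p - 1) * (q - 1)  # computable only by someone who can factor n
e = 17                   # public exponent
d = pow(e, -1, phi)      # private exponent (modular inverse, Python 3.8+)

msg = 42
cipher = pow(msg, e, n)          # encrypt with the public key (n, e)
assert pow(cipher, d, n) == msg  # decrypt with the private key d

# An attacker who factors n rebuilds the private key from scratch:
p2 = next(c for c in range(2, n) if n % c == 0)
q2 = n // p2
d_stolen = pow(e, -1, (p2 - 1) * (q2 - 1))
assert pow(cipher, d_stolen, n) == msg  # the encryption is broken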
I attended a fascinating talk given by Andre Konig, the CEO of Global
Quantum Intelligence, an analyst organization covering the industry. He
explained that today's quantum computers can achieve roughly a kiloquop of
calculations (a quop is a quantum operation, analogous to a logic
operation on a classical computer). For practical applications of quantum
computing, we will need to get to gigaquop capability, a millionfold jump.
But if we hit a petaquop, quantum computers will be able to break
traditional encryption.
Andre predicts that gigaquop capability is 5-10 years away. Maybe IBM is
right and we hit it in 3. If Microsoft's topological materials or Amazon's
hybrid architecture prove out, we'll be on the early side of the timeline.
He noted that Google has advanced amazingly fast in just the last 2 years.
[He is also a skeptic of D-Wave's announcement, believing it has much more
to do with boosting the company's stock price to attract capital needed
for additional research and experimentation.]
Petaquop will take longer, but not as long as some predict. Ten years is a
good estimate for how long we have before
quantum computers can break the core security upon which huge parts of our
society operate. For national security, it is extremely important to be the
first to achieve this milestone. If one of our adversaries reaches quantum
supremacy before us, who knows what kind of chaos could be unleashed.
Next week I’ll dig into the quantum arms
race and who is in the best position to win – through both the corporate and
public sector lenses. As a teaser – one piece of good news is that perhaps the
most significant quantum startup is based right here in the Triangle. The bad
news is that there are a lot of places investing far more than North Carolina.