The quantum tipping point - where are we today?

Chris Middleton, January 26, 2023
Summary:
Some of the world’s leading experts debate whether quantum technology means a moment of transformative change, or a slow march into a brand-new world

(Image by Gerd Altmann from Pixabay)

The ‘quantum tipping point’ will supposedly come when quantum technologies – computers, communications, security, sensors, applications, and more – move out of research labs and become viable at scale. 

At that moment, quantum tech will be able to solve problems that classical computers cannot – working alongside them, not replacing them – and so tackle some of the world's biggest challenges. In turn, this may influence the power and competitiveness of nations.

Or is that the wrong way of looking at it? 

Jan Goetz is founder and CEO of IQM Finland Oy, which is already deploying quantum hardware in data centres. He believes the idea of a tipping point is misplaced, and draws comparisons with other epochal technologies: AI and fusion energy.

Speaking in Davos last week, he said:

If you look at fusion, the use case is very clear. And people say it will take x years until we get there [a US fusion breakthrough occurred in December], then we really have this tipping point. 

But quantum is not like this, because there is not this one use-case that people are looking for. There will be more and more over time as the technology becomes more and more powerful. It's an enabling technology for many use cases. 

With AI, we had the use cases decades ago, but there was a ‘winter’. The reason was the computing power on the hardware side was not sufficient to train the models and do all of this. And for quantum we are in a similar situation. We have many ideas of what you could do, but the technology is not yet powerful enough.

Then he added:

This doesn't mean that the systems we're building are useless. On the contrary, they're very useful and many people are using them around the globe. 

One reason is, if we believe in this technology and its use cases, there will be a huge shortage of talent, so we need [the systems] to train the people. And we can also do great science with the systems that we have. 

Just to give you one example, a lot of CO2 actually comes from global food production where we are using fertilizers. […] We don't fully understand this at the molecular level. And quantum computers could help us do that.

A long-horizon technology

Picturing today’s problems and tomorrow’s needs will be critical to accelerating and deepening what quantum technologies will achieve in the future. That is also the job of Professor Amy Webb of the Stern School of Business at New York University, who is CEO of risk/opportunity consulting firm The Future Today Institute. 

She said:

You have to think about the drivers of acceleration. 

The first is technology advancement and the components that are maturing. And there are challenges to overcome here. This is eventually going to be looked at as a general-purpose technology, like the steam engine and the internet before it. But there's no ‘switch’, there's no ‘day that quantum happens’. It’s a transition to get there. This is a long-horizon technology. 

The second is adoption. And the roadmap right now is uncertain, because this is a critical, emerging technology. Part of what will drive adoption is it being cost-effective. At the moment, supercomputers are still more cost-effective than investment in quantum, but that too will change over time. 

And the third, and most important, is commercialization. It's not enough to have a mature technology. You also need an ecosystem to support it. And that means executives being familiar with what quantum is. But there's not enough understanding yet, and therefore it doesn't trickle down to the rest of the organization.

At the moment, it’s AI that is driving people's decisions about where they're going to be. Also bioengineering. But really investing in the talent pool will drive quantum’s commercialization over time.

At present, much of that talent is coming through universities and research labs. Kohei Itoh is President of Keio University in Tokyo. He said:

The first IBM quantum computer we touched could only perform one step of calculation. We were excited that we were able to use a real quantum computer, but it was only one step. But within two months, they introduced a new quantum chip that allowed us to perform two steps. It doubled in two months. And that kept happening.

At present, it’s like a kindergarten kid. It can almost do everything, but not at the scale, not at the level that we can do. 

We work with eight companies, including three banks, and they came up with an idea. They wanted to find correlations between pairs of stock prices [he mentioned two airlines as an example]. But there are so many hidden correlations between pairs of companies you would think were unconnected. 

Today's computers, even today's supercomputers, cannot perform such a large calculation. But we came up with this algorithm that will allow us to find such a correlation, let's say within five or 10 years, with the current expected growth of the IBM quantum computer. That means we can probably outperform a supercomputer within five or 10 years. 

But such correlations exist not only in the stock market, but also in biodata. If you find such correlations within biodata, then you actually discover new science.
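To make the scale of that search concrete, here is a purely classical, illustrative Python sketch (this is not Keio's algorithm, and the data are synthetic) of the all-pairs correlation problem Itoh describes. Even this naive version scales quadratically with the number of assets, before adding the lagged and non-linear relationships that, in his account, push the real problem beyond today's supercomputers:

```python
# Classical toy version of the pairwise-correlation search described above.
# With n assets there are n*(n-1)/2 candidate pairs; real workloads add time
# lags and non-linear relationships on top of this brute-force scan.
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)
n_assets, n_days = 500, 2_000
returns = rng.normal(size=(n_assets, n_days))   # synthetic daily returns, one row per asset

strongest, best_pair = 0.0, None
for i, j in combinations(range(n_assets), 2):   # ~125,000 pairs at n=500
    r = np.corrcoef(returns[i], returns[j])[0, 1]
    if abs(r) > strongest:
        strongest, best_pair = abs(r), (i, j)

print(f"Strongest correlation {strongest:.3f} between assets {best_pair}")
```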

Upending the system 

Building understanding of, investment in, and commercialization of quantum demands other viable use cases too. Some are certainly exciting, including real-time simulation of cities, so that infrastructure and resources can adapt on the fly. 

Accelerating drug discovery could be another game-changer, assuming pharmaceutical giants don't see an existential threat in compressing development cycles that currently average 13 years and billions of dollars per drug. And if you can spot deep correlations between apparently unconnected data points, then you can also identify fraud.

As with fusion technology, the lurking problem for quantum may be at the user end: historic, vested interests in the status quo. Powerful forces may be ranged against companies that can produce cheap, abundant energy, for example, or fraud-free banking, or – frankly – any system that can no longer be gamed, subverted, and made to pay in the shadows. It would be naïve to pretend otherwise: deeply entrenched markets will act to protect themselves. 

Consider this: in financial services alone, the names of most major high-street and main-street banks have been attached to some form of crime, fraud, money laundering, mis-selling, or market rigging this century. And that means something: that leeway is built into the system by design. Quantum tech appears to promise a future of exposing hidden connections.

But also, the opposite. Another use case for quantum technology will be ultra-secure communications. Webb explained how this will be both angel and devil, depending on whose shoulders it sits: 

There are massive problems right now with our undersea cables, and geopolitically, different countries try to tap into those to observe communications. Quantum communication means potentially not being able to observe those in real time, which would be a huge step forward [in security]. 

So, there are positive opportunities, and terrifying challenges.

Post-quantum security – aka building a quantum-safe world – is already a strategic and operational concern for governments. The concern is not just the theoretical ability to establish truly secure channels, but also how to protect current channels, their encryption, and the content they carry from quantum-enabled adversaries who could break today's protocols apart. 

Challenges ahead

Back in the enterprise world, IBM has been in the quantum game for years, having been an early provider of quantum services in the cloud. Those services will become increasingly viable for organizations as the technology evolves. Ana Paula Assis is IBM’s EMEA Chair. She explained Big Blue’s vision: 

We believe in a hybrid model where we are going to have classical computing – the traditional bits – with AI neurons and the qubits of quantum, all working in an orchestrated, integrated way.

In its 2025 roadmap, IBM uses the phrase “the quantum-enabled supercomputer”. 

She continued:

So, the way we are approaching that in our roadmap is, how you address performance, how you generate value by doing more with less, because the exercise of running circuits on quantum is extremely onerous.

This year, we reached 433 physical qubits in a chip, and are getting to 1,000 by the end of this year. A tremendous advancement. Remember, the first quantum computer we developed in 2016 had just five qubits. 

But then you have to address how the qubits behave inside those chips, which is the issue of quality. […] The ability of those qubits to behave properly, according to quantum physics. And then you have to address the topic of speed: how many operations you can execute in the circuits that we have.

But even for IBM – or perhaps especially for it – adoption is an equal challenge. She added: 

We are looking at the ecosystem for multiple aspects, first creating a network of partners, companies, and universities that are thinking about what the use cases are. And what are the problems that quantum is going to solve?

There’s also the developer community, where IBM’s open-source Qiskit software development kit is designed not only to support native application development, but also to bridge classical programming languages into quantum machine language. The good news, she said, is that 450,000 people are already registered on the platform, with 232 million downloads of the code.
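By way of illustration – this sketch is not taken from IBM's materials, and assumes a recent open-source Qiskit installation – building and inspecting a tiny two-qubit circuit looks like this:

```python
# Minimal Qiskit example: build a two-qubit Bell-state circuit and inspect it.
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

circuit = QuantumCircuit(2)          # two qubits, no classical registers needed here
circuit.h(0)                         # put qubit 0 into superposition
circuit.cx(0, 1)                     # entangle qubit 0 with qubit 1 (Bell state)

# At this toy scale the state can be inspected exactly on a classical machine;
# real workloads would be dispatched to simulators or IBM Quantum hardware.
state = Statevector.from_instruction(circuit)
print(circuit.draw())
print(state.probabilities_dict())    # expect a ~50/50 split between '00' and '11'
```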

My take

So, the world is accelerating into the quantum realm, regardless of whether a tipping point is the right description of the future. 

But the underlying challenge would seem to be persuading decision-makers to adopt a longer-term strategic vision than some are used to, as we battle short-term problems with increasingly tactical solutions.
