By Cliff Saran, Managing Editor
Published: 28 Sep 2021
The realm of artificial intelligence (AI) has surfaced several dilemmas for society, where ethics and liberal values rub up against the advancement of computer technology.
Although attaining the full power of quantum computing may be years away, the prospect of quantum supremacy – the point at which it becomes impossible to simulate algorithms run on a quantum computer using a classical one – raises serious questions about how to prove that a quantum computer is producing correct results.
Given the problem space, in terms of the vast number of parameters a quantum computer can pull in to find an answer, how can a mere human fathom whether the processing makes any sense? “If a quantum computer can efficiently solve a problem, can it efficiently convince an observer it is correct?” says Marc Carrel-Billiard, global technology innovation lead at Accenture.
Researchers are developing a greater understanding of where quantum computing can be used. Heike Riel, IBM fellow, head of science and technology and lead of IBM Research Quantum Europe at IBM Research, says: “It is not about the beauty of tech. We want to generate value – it’s a journey, develop the technology, find the sweet spot of early applications, see and demonstrate value, then expand hardware and software.”
For instance, Eon Energy recently joined the IBM Quantum Network. In the past, there were few energy sources, but as the world transitions to more green sources such as solar and wind, there are now many more sources where energy is being created. Quantum computing could be applied to help distribution grids fulfil a much wider range of tasks, especially if, in the future, many smaller companies and households feed energy into the grid via their own photovoltaic (PV) systems or electric cars through initiatives such as Eon’s Vehicle to Grid (V2G) project.
In this project, batteries from electric vehicles are connected to the distribution grid as a flexible storage medium. In this way, fluctuations in the generation of renewable energies can be balanced out. Quantum computing could be used to control these processes more efficiently and effectively.
Riel says: “All these sources have different dependencies and so predictions get more complex. We have to optimise the system and would like to do this in real time. Complexity increases exponentially as the number of parameters increases and that becomes a hard problem to solve using classic computing.”
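The exponential blow-up Riel describes can be made concrete with a toy unit-commitment problem: choosing which energy sources to switch on to meet demand at minimum cost. The capacities, costs and demand below are invented for illustration; the point is that exhaustive classical search visits 2**n combinations, so every extra source doubles the work.

```python
from itertools import product

def dispatch(capacities, costs, demand):
    """Brute-force sketch: pick the cheapest subset of energy sources
    whose combined capacity meets demand. Real grid optimisation has
    far more constraints; this only illustrates the 2**n search space."""
    n = len(capacities)
    best, best_cost = None, float("inf")
    for combo in product((0, 1), repeat=n):        # 2**n on/off subsets
        cap = sum(c * on for c, on in zip(capacities, combo))
        cost = sum(c * on for c, on in zip(costs, combo))
        if cap >= demand and cost < best_cost:
            best, best_cost = combo, cost
    return best, best_cost

# Three hypothetical sources: solar (cheap), wind, gas peaker (expensive)
print(dispatch([30, 40, 100], [1, 2, 10], 60))   # -> ((1, 1, 0), 3)
```

With three sources there are only eight subsets to check; with 50 distributed PV systems and vehicle batteries there are over 10^15, which is why real-time classical optimisation breaks down.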
Quantum chemistry
Bob Coecke, theoretical physicist and chief scientist at Cambridge Quantum Computing, points out that atoms and molecules are governed by the laws of quantum mechanics, which means it should be possible to model their behaviour on a quantum computer. “Simulating physical stuff is exponentially expensive due to the structure of quantum mechanics,” he says. “These things are quantum native – they want to live on a quantum computer and it is artificial for them to live on a classical computer.”
In fact, just looking at the storage required, Coecke says it would be impossible to fit such problems into a classical computer.
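Coecke's storage point follows directly from the structure of quantum states: an n-qubit state is described by 2**n complex amplitudes. A short calculation, assuming 16 bytes per complex double, shows how quickly a full classical representation outgrows any real machine.

```python
def statevector_bytes(n_qubits, bytes_per_amplitude=16):
    """Memory needed to hold a full n-qubit quantum state classically.
    The state has 2**n complex amplitudes, so the requirement doubles
    with every qubit added."""
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (30, 50, 80):
    print(f"{n} qubits: {statevector_bytes(n):.3e} bytes")
```

At 30 qubits the state fits in about 17GB of RAM; at 50 qubits it needs roughly 18 petabytes; at 80 qubits it exceeds the storage of every datacentre on Earth combined.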
Modelling the behaviour of atoms and molecules has been the main driving force behind using quantum computers to simulate new materials. In August last year, Nicholas Rubin and Charles Neill, research scientists at Google AI Quantum, wrote a blog discussing an experiment to create a complex chemical simulation using a Hartree-Fock model from computational physics.
“Accurate computational prediction of chemical processes from the quantum mechanical laws that govern them is a tool that can unlock new frontiers in chemistry, improving a wide variety of industries,” the researchers wrote. But in the blog, they acknowledged that algorithms for simulating chemistry on near-term quantum devices must account for errors that occur on quantum computers.
The pair said that, in a similar way to how classical neural networks can tolerate imperfections in data, their experiment used the quantum processor equivalent of a neural network, called a variational quantum eigensolver (VQE), which attempts to optimise a quantum circuit’s parameters to compensate for noisy quantum logic.
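The VQE idea can be sketched with a one-qubit toy model: a classical optimiser tunes a single circuit parameter to minimise the measured energy of a Pauli-Z “Hamiltonian”, even though every individual measurement is noisy. This is an illustrative sketch, not Google’s actual Hartree-Fock experiment; the depolarising noise model, shot count and learning rate are all assumptions made for the example.

```python
import math
import random

def expectation(theta):
    """Exact <psi(theta)| Z |psi(theta)> for the one-qubit ansatz
    |psi(theta)> = Ry(theta)|0> = [cos(theta/2), sin(theta/2)];
    analytically this equals cos(theta)."""
    c, s = math.cos(theta / 2), math.sin(theta / 2)
    return c * c - s * s   # P(|0>) - P(|1>)

def noisy_expectation(theta, shots=2000, noise=0.02):
    """Estimate the same expectation from a finite number of slightly
    depolarised measurements -- a toy stand-in for hardware noise."""
    p0 = (1 + (1 - noise) * math.cos(theta)) / 2
    hits = sum(random.random() < p0 for _ in range(shots))
    return 2 * hits / shots - 1

def vqe(steps=60, lr=0.4):
    """Minimal VQE-style loop: gradient descent on theta via the
    parameter-shift rule, tolerating the sampling noise above."""
    theta = 0.5
    for _ in range(steps):
        grad = (noisy_expectation(theta + math.pi / 2)
                - noisy_expectation(theta - math.pi / 2)) / 2
        theta -= lr * grad
    return theta, expectation(theta)

theta, energy = vqe()
print(round(energy, 2))   # close to -1, the true minimum of cos(theta)
```

Despite the noise in every estimate, the optimisation loop still converges to the correct minimum, which is precisely the error tolerance the researchers describe.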
While the ability to simulate a complex chemical process on a quantum computer is astounding, scientists already understand how the interactions of subatomic particles achieve a particular outcome: the process can be written down and calculated using a chemical equation.
IBM’s Riel says that as long as the number of qubits is small enough, results can be simulated on a classical computer. She says IBM is working on understanding how noise influences the system by causing erroneous results. As IBM continues on the next stage of its quantum computing roadmap, with a 127-qubit quantum system, “we want to demonstrate error correction and we are working to verify the results”, Riel adds.
Discussing the challenges, Mark Mattingley-Scott, managing director of Europe for Quantum Brilliance, says: “One of the paradoxes of quantum computing is that once we reach the point of quantum utility, in other words where a quantum algorithm can perform calculations at a speed and precision that are not possible on a classical computer, then it becomes impossible to directly verify the correctness of the results. We can verify the correctness of the method with smaller versions of the same problem – something we do every day with classical algorithms – but there will be no way to actually check.”
Because quantum computing is inherently non-deterministic, Mattingley-Scott points out that the results it produces are based on probabilities. “A quantum algorithm works by using quantum mechanics to constructively reinforce the ‘right’ answer, and destructively suppress the ‘wrong’ answers,” he says. “So there will always be some uncertainty. Using a classical computer to validate a quantum computer will only be possible at the methodological level, not at the level of actual data.”
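What this looks like in practice can be sketched classically. The distribution below is hypothetical: it stands in for the output statistics of a quantum algorithm after interference has boosted the right answer’s amplitude. Repeating the circuit and taking the most frequent outcome drives the error rate down exponentially, but, as Mattingley-Scott notes, never eliminates the uncertainty.

```python
import random
from collections import Counter

def sample_outcome(dist):
    """Draw one measurement result from a hypothetical output
    distribution of a quantum algorithm (answer -> probability)."""
    r, acc = random.random(), 0.0
    for answer, p in dist.items():
        acc += p
        if r < acc:
            return answer
    return answer   # guard against floating-point rounding at the tail

def run_with_repetition(dist, shots=101):
    """Repeat the 'circuit' many times and return the most frequent
    outcome. Each single shot is probabilistic; repetition makes the
    majority answer overwhelmingly likely, but never certain."""
    counts = Counter(sample_outcome(dist) for _ in range(shots))
    return counts.most_common(1)[0][0]

# Toy distribution: the 'right' answer carries 70% of the probability
dist = {"right": 0.7, "wrong_a": 0.2, "wrong_b": 0.1}
print(run_with_repetition(dist))   # -> "right" (with very high probability)
```

With 101 shots and a 70% success probability per shot, the chance of the majority vote being wrong is vanishingly small, yet it is still a statistical statement rather than a proof.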
The bigger picture
However, Cambridge Quantum Computing’s Coecke believes that the principle of compositionality and category theory can help in understanding what is actually going on in a quantum computer. Compositionality uses the idea of thinking of a problem both from the top down and bottom up. “All mathematics works from the bottom up,” he says. “We use a new form of compositionality where the ‘whole’ defines the parts.”
Traditionally in computer science, a complex system is defined by building and testing smaller known parts which, when integrated, behave in a way defined by the sum of the parts. “Category theory is about how something relates to a bigger picture,” says Coecke. “It is an important part of computer science.”
He adds that category theory is used to structure how programs fit together, in a similar way to how flow charts portray what a program is doing, or an electrical diagram shows how a battery and switch connect to a light bulb.
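A flavour of this compositional discipline can be given in a few lines of code. The `Process` class below is a toy invention for this article, not Coecke’s formalism: each process is a “morphism” from an input type to an output type, and composition is only permitted when the types line up, so a whole pipeline is built from small, individually verifiable parts.

```python
class Process:
    """A toy 'morphism': a named map from an input type to an output
    type. Composition is only defined when the output type of the
    first matches the input type of the second -- the wiring
    discipline category theory imposes on program components."""
    def __init__(self, name, src, dst, fn):
        self.name, self.src, self.dst, self.fn = name, src, dst, fn

    def __rshift__(self, other):            # f >> g  means  "f then g"
        if self.dst != other.src:
            raise TypeError(f"cannot wire {self.name} -> {other.name}")
        return Process(f"{self.name};{other.name}", self.src, other.dst,
                       lambda x: other.fn(self.fn(x)))

    def __call__(self, x):
        return self.fn(x)

# Two small, individually testable parts composed into a whole
double = Process("double", int, int, lambda n: 2 * n)
text   = Process("text",   int, str, str)
pipeline = double >> text
print(pipeline(21))   # prints 42; the composed map is int -> str
```

Trying to compose the parts the wrong way round (`text >> double`) raises a type error at wiring time, which is the verification-by-construction property the compositional approach is after.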
The concept of a picture being worth a thousand words is something Coecke explores in a book he co-authored with Aleks Kissinger, Picturing Quantum Processes. While the target audience of this book is university students, Coecke is also publishing a book aimed at teenagers, illustrating the point that almost anyone can understand quantum computing.
Broadly speaking, the book looks at the idea of using a compositional tool to break down big problems into small parts. According to Coecke, these small building blocks can be composed in a way that is understandable and, significantly, where all the constituent pieces can be verified.
Verification and explainability
Going back to chemistry, Michael Biercuk, CEO and founder of Q-CTRL, says: “Right now, it’s very difficult to design a molecule with a specific array of target properties, in part because of limits on our computational modelling capabilities. However, it is straightforward to measure a candidate molecule’s properties against that list. In the case of molecular structure or chemical dynamics computed on a quantum computer, we may not be able to perform a validating classical simulation, but we can generally perform a real chemistry experiment to validate the outputs.”
Similarly, Quantum Brilliance’s Mattingley-Scott believes that one possible role for quantum computers is in providing drastically accelerated performance. “It might be able to perform sensitivity analysis on the solutions to problems, to determine ‘what if’, and based on that, provide answers,” he says. “The ability to do this is a kind of holy grail of quantum computing, and such a capability would revolutionise industry and society.”
As the world edges towards quantum supremacy, the experts that Computer Weekly spoke to agree that it will become more and more difficult to prove that the output a quantum computer produces is correct. Coecke’s top-down pictorial approach, using verifiable building blocks, may be one way to develop complex quantum algorithms. The idea of verification through real-world experiments, as Mattingley-Scott suggests, may become the quantum equivalent of integration testing in classical computing.
But as IBM’s Riel points out, quantum computing is only one of many tools software developers will have for tackling complex problems. “If you have a problem to solve in optimisation, you don’t care how it is done as long as it is achieved in the fastest and most efficient way,” she says. “You don’t want to worry about what computer is used.”
From IBM’s perspective, a complex computational problem may require different building blocks, where some parts are processed using classical computing, while others take advantage of quantum computing. Riel adds: “You need to have developers who understand quantum computing, to develop the quantum primitives that implement algorithms. You then need a model developer who doesn’t need to understand the depth of quantum computing, but is able to describe the problem and use the best solver application. The model developer should not be bothered by quantum knowledge.”
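The separation of roles Riel describes can be sketched as a solver interface. Everything here is a hypothetical illustration, not IBM’s architecture: the model developer programs against `Solver.minimise` without knowing whether the backend is classical or quantum, and the `QuantumSolver` simply falls back to exhaustive search so the sketch stays runnable.

```python
from abc import ABC, abstractmethod

class Solver(ABC):
    """Interface the model developer programs against; whether the
    backend is classical or quantum is hidden behind it."""
    @abstractmethod
    def minimise(self, cost, candidates): ...

class ClassicalSolver(Solver):
    def minimise(self, cost, candidates):
        return min(candidates, key=cost)

class QuantumSolver(Solver):
    """Placeholder: in practice this would hand the problem to a
    quantum primitive written by a quantum specialist; here it uses
    the same exhaustive search so the example actually runs."""
    def minimise(self, cost, candidates):
        return min(candidates, key=cost)   # stand-in for a quantum routine

def pick_solver(problem_size, threshold=20):
    """Route small problems to classical hardware and large ones to
    the quantum backend -- one possible decomposition of a workload."""
    return ClassicalSolver() if problem_size < threshold else QuantumSolver()

solver = pick_solver(problem_size=8)
print(solver.minimise(lambda x: (x - 3) ** 2, range(10)))   # -> 3
```

The model developer’s code is identical in both branches; only the routing function, maintained by someone who does understand quantum computing, changes.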
Although quantum supremacy may be some way off, companies are starting to use real-world quantum processing to solve difficult problems today. For instance, pharmaceutical and material science companies use a variety of computationally intensive methods to review molecule matches and predict the positive effects of potential therapeutic approaches while reducing negative side-effects.
Researchers from Accenture Labs recently collaborated with Biogen to identify the quantum-enabled optimisation processes most beneficial to the company. Accenture’s Carrel-Billiard says such optimisations can be tested. Accenture is also working with clients in the finance sector to assess how existing algorithms run on a quantum computer. These are algorithms that already run on classical computers, and so the results they produce have already been verified.
In a similar way to how Cambridge Quantum’s Coecke looks at breaking down a problem into verifiable parts, Carrel-Billiard’s team at Accenture is working on how to map certain problems to groups of mathematical primitives. These primitives are encoded in cross-platform quantum computer software development kits and libraries. By testing the resulting programs on different quantum computer hardware architectures, it is then theoretically possible to determine whether they produce consistent results.
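That cross-backend consistency check can be sketched classically. The code below is an assumption-laden stand-in: each “backend” is simulated by sampling the same ideal output distribution with its own noise seed, and the test asks whether all backends agree on every outcome’s frequency to within a tolerance.

```python
import random

def run_on_backend(program, backend_seed, shots=5000):
    """Hypothetical stand-in for running the same program on one
    hardware architecture: estimate its output distribution from a
    finite number of shots with backend-specific sampling noise."""
    rng = random.Random(backend_seed)
    counts = {outcome: 0 for outcome in program}
    for _ in range(shots):
        r, acc = rng.random(), 0.0
        for outcome, p in program.items():
            acc += p
            if r < acc:
                counts[outcome] += 1
                break
    return {outcome: n / shots for outcome, n in counts.items()}

def consistent(results, tolerance=0.05):
    """Crude consistency test: every pair of backends must agree on
    every outcome's observed frequency to within the tolerance."""
    outcomes = results[0].keys()
    return all(abs(a[o] - b[o]) <= tolerance
               for o in outcomes
               for a in results for b in results)

program = {"00": 0.5, "11": 0.5}           # ideal Bell-state statistics
results = [run_on_backend(program, seed) for seed in (1, 2, 3)]
print(consistent(results))
```

Agreement across architectures does not prove the answer is correct, since all backends could share a systematic error, but disagreement is a cheap and reliable way to detect that something is wrong.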
As quantum hardware evolves, proving that the results it produces are correct is only going to get harder. A solid foundation on which to build verification and explainability into these systems is as much a part of the evolutionary process as the quantum hardware itself.