One of the ways that computers ‘think’ is by analysing relationships within large sets of data. An international team has shown that quantum computers can do one such analysis faster than classical computers, for a wider array of data types than was previously expected.
The team’s proposed ‘quantum linear system algorithm’ is published in the 2 February issue of Physical Review Letters. In the future, it could help crunch numbers on problems as varied as commodities pricing, social networks and chemical structures.
“The previous quantum algorithm of this kind applied to a very specific type of problem. We need an upgrade if we want to achieve a quantum speed up for other data,” says Zhikuan Zhao, corresponding author on the work.
That’s exactly what he’s offering, in joint work with colleague Anupam Prakash at the Centre for Quantum Technologies, National University of Singapore, and collaborator Leonard Wossnig, then at ETH Zurich and the University of Oxford. Zhao is a PhD student with the Singapore University of Technology and Design.
The first quantum linear system algorithm was proposed in 2009 by a different group of researchers. That algorithm kick-started research into quantum forms of machine learning, or artificial intelligence.
A linear system algorithm works on a large matrix of data. For example, a trader might be trying to predict the future price of goods. The matrix may capture historical data about price movements over time and data about features that could be influencing these prices, such as currency exchange rates. The algorithm calculates how strongly each feature is correlated with another by ‘inverting’ the matrix. This information can then be used to extrapolate into the future.
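To make the classical version of this concrete, here is a minimal sketch in Python of the kind of linear-system solve described above. The matrix, the feature names and all the numbers are illustrative assumptions, not data from the paper; solving A x = b with NumPy plays the role of ‘inverting’ the matrix.

```python
# Classical analogue of the linear system step: solve A x = b,
# where A holds feature data and b holds observed prices.
import numpy as np

# Illustrative 3x3 system: rows are time points, columns are
# features (e.g. exchange rate, supply index, demand index).
A = np.array([[1.0, 0.5, 0.2],
              [0.4, 1.0, 0.3],
              [0.1, 0.6, 1.0]])
b = np.array([2.1, 1.7, 1.3])  # observed prices at those time points

# Solving the system (equivalent to applying A's inverse to b)
# yields weights saying how strongly each feature drives price.
weights = np.linalg.solve(A, b)

# Extrapolate: predict the price for a new feature vector.
new_features = np.array([0.9, 0.7, 0.4])
predicted_price = new_features @ weights
print(weights, predicted_price)
```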
“There is a lot of computation involved in analysing the matrix. When it gets beyond say 10,000 by 10,000 entries, it becomes hard for classical computers,” explains Zhao. This is because the number of computational steps grows rapidly with the size of the matrix: inverting an N-by-N matrix classically takes on the order of N³ steps, so every doubling of the matrix size increases the length of the calculation eight-fold.
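As a quick sanity check on that eight-fold figure, the snippet below prints the step counts for a few matrix sizes, assuming the standard N³ cost estimate for dense inversion (an illustration only; the paper does not spell out this arithmetic):

```python
# The cubic-scaling rule of thumb: dense matrix inversion costs
# on the order of N**3 steps, so doubling N multiplies the work
# by 2**3 = 8.
for n in (1_000, 2_000, 4_000, 8_000):
    print(f"N = {n:>5,}: ~{n**3:.1e} steps")

print(2_000**3 / 1_000**3)  # each doubling costs 8x as much -> 8.0
```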
The 2009 algorithm could cope better with bigger matrices, but only if the data in them is what’s known as ‘sparse’, meaning most of the matrix’s entries are zero and there are only limited relationships among the elements. That is often not true of real-world data.
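For illustration, here is one common way sparse data is stored classically, using SciPy’s compressed sparse row format; the matrix itself is a made-up example, not one from the paper:

```python
# A 'sparse' matrix: almost all entries are zero, so it can be
# stored and worked with via its few nonzero entries alone.
import numpy as np
from scipy import sparse

dense = np.zeros((6, 6))
dense[0, 3] = 2.5    # just three of the 36 entries are nonzero,
dense[2, 1] = -1.0   # i.e. only a handful of relationships
dense[5, 5] = 0.7    # exist among the elements

compressed = sparse.csr_matrix(dense)
print(compressed.nnz, "nonzero entries out of", dense.size)  # 3 out of 36
```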
Zhao, Prakash and Wossnig present a new algorithm that is faster than both the classical and the previous quantum versions, without restrictions on the kind of data it works for.
As a rough guide, for a 10,000-by-10,000 matrix, the classical algorithm would take on the order of a trillion computational steps, the first quantum algorithm some tens of thousands of steps, and the new quantum algorithm just hundreds. The new algorithm relies on a technique known as quantum singular value estimation.
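The snippet below reproduces those rough figures. The scalings it uses (N³ classically, roughly N for the 2009 algorithm, roughly √N for the new one) are inferred from the article’s numbers, with constants, logarithmic factors and condition-number dependence all suppressed; the paper’s precise bounds are more detailed than this back-of-envelope sketch.

```python
# Rough step counts for an N x N matrix, using the scalings the
# article's figures imply. Constants, polylog factors and the
# matrix's condition number are all suppressed: this is an
# illustration, not the paper's stated complexity bounds.
import math

N = 10_000
print(f"classical    ~ N^3     = {N**3:.0e} steps")          # ~1e12, a trillion
print(f"2009 quantum ~ N       = {N:.0e} steps")             # ~1e4
print(f"new quantum  ~ sqrt(N) = {math.sqrt(N):.0e} steps")  # ~1e2
```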
There have been a few proof-of-principle demonstrations of the earlier quantum linear system algorithm on small-scale quantum computers. Zhao and his colleagues hope to work with an experimental group to run a proof-of-principle demonstration of their algorithm, too. They also want to do a full analysis of the effort required to implement the algorithm, checking what overhead costs there may be.
Showing a real quantum advantage over the classical algorithms will need bigger quantum computers. “We’re maybe looking at three to five years in the future when we can actually use the hardware built by the experimentalists to do meaningful quantum computation with application in artificial intelligence,” Zhao estimates.
Source: https://www.sciencedaily.com/releases/2018/02/180202112637.htm