New quantum algorithm may help AI predict the future: study

Source: Xinhua | 2018-02-03 02:22:21 | Editor: yan

WASHINGTON, Feb. 2 (Xinhua) -- An international team has shown that quantum computers can analyze relationships within large sets of data faster than classical computers, for a wider array of data types than previously expected.

The study, published on Friday in the American journal Physical Review Letters, proposed a "quantum linear system algorithm" that could carry out quantum computation with applications in artificial intelligence (AI) and crunch numbers on problems such as commodities pricing, social networks and chemical structures.

The linear system algorithm, first proposed in 2009, works on a large matrix of data and has kick-started research into quantum forms of machine learning, or artificial intelligence.

"There is a lot of computation involved in analyzing the matrix. When it gets beyond say 10,000 by 10,000 entries, it becomes hard for classical computers," explained Zhao Zhikuan, the paper's corresponding author from Singapore University of Technology and Design.

Zhao also said the previous quantum algorithm of this kind could only be applied to a very specific type of problem. This is because the number of computational steps rises rapidly with the number of elements in the matrix: every doubling of the matrix size makes the calculation eight times longer.
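
For scale, here is a minimal classical sketch of the kind of computation involved (in Python with NumPy; the dimensions and data are illustrative assumptions, not details from the study). Solving a dense linear system on an ordinary computer takes on the order of the cube of the matrix dimension, which is why doubling the size multiplies the work by roughly 2 x 2 x 2 = 8.

    import numpy as np

    n = 2000                     # matrix dimension; at 10,000 and beyond this becomes slow classically
    A = np.random.rand(n, n)     # dense matrix of relationships within the data
    b = np.random.rand(n)        # target vector

    # The core "linear system" step: solve A x = b. For a dense matrix this
    # costs on the order of n**3 operations, so doubling n gives ~8x the work.
    x = np.linalg.solve(A, b)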

Zhao and his colleagues in Singapore, Switzerland and the United Kingdom presented a new algorithm that is faster than both the classical and the previous quantum versions.

It relies on a technique known as quantum singular value estimation and is not limited to the "sparse" data required by the previous versions.

In sparse data, only a few of the elements are related to one another, which is often not true of real-world data, Zhao said.
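
To illustrate the distinction, here is a hypothetical sketch (using SciPy and NumPy; the sizes and density are assumptions made for illustration): a sparse matrix has almost all zero entries, meaning only a few elements are related, while typical real-world data matrices are dense.

    import numpy as np
    from scipy.sparse import random as sparse_random

    n = 1000

    # "Sparse" data: roughly 99 percent of entries are zero, so only a few
    # elements are related to one another. Earlier algorithms needed this.
    sparse_matrix = sparse_random(n, n, density=0.01, format="csr")

    # Real-world data (prices, social networks, chemical structures) tends
    # to be dense: most entries are non-zero.
    dense_matrix = np.random.rand(n, n)

    print(sparse_matrix.nnz, "non-zero entries vs", dense_matrix.size, "entries in total")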

For example, a trader might be trying to predict the future price of goods. The matrix may capture historical data about price movements over time and data about features that could be influencing these prices, such as exchange rates.

The algorithm then calculates how strongly each feature is correlated with the others by "inverting" the matrix. This information can then be used to extrapolate into the future.
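
A minimal classical sketch of that idea (in Python with NumPy; the feature values and prices below are made up for illustration, and the study's quantum algorithm performs this step very differently): fit a linear model by solving the normal equations, which amounts to "inverting" the feature matrix, then use the fitted weights to extrapolate.

    import numpy as np

    # Hypothetical history: each row is a point in time, each column a feature
    # thought to influence the price (e.g. an exchange rate, a demand index).
    X = np.array([[1.10, 0.8],
                  [1.12, 0.9],
                  [1.15, 1.1],
                  [1.20, 1.3]])
    y = np.array([100.0, 103.0, 108.0, 115.0])   # observed prices

    # Solve the normal equations (X^T X) w = X^T y -- the matrix "inversion" step.
    w = np.linalg.solve(X.T @ X, X.T @ y)

    # Extrapolate: predict the price for new feature values.
    new_features = np.array([1.25, 1.4])
    print(new_features @ w)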

But showing a real quantum advantage over classical algorithms will require bigger quantum computers.

Zhao estimates it will take another three to five years before the hardware can actually be used for meaningful quantum computation with applications in artificial intelligence.
