# Quantum Machine Learning: An Overview

Quantum Machine Learning (Quantum ML) is the interdisciplinary area combining quantum physics and machine learning (ML). It is a symbiotic association: quantum computing is leveraged to produce quantum versions of ML algorithms, and classical ML algorithms are applied to analyze quantum systems. Read this article for an introduction to Quantum ML.

At a conference in 2017, Microsoft CEO Satya Nadella used the analogy of a corn maze to explain how a quantum computer differs from a classical one. In trying to find a path through the maze, a classical computer would start down a path, hit an obstruction, and backtrack; start again, hit another obstruction, and backtrack again, until it ran out of options. Although it can find an answer this way, the approach can be very time-consuming.

In contrast, quantum computers “unlock amazing parallelism. They take every path in the corn maze simultaneously,” leading to an exponential reduction in the number of steps required to solve the problem.

This parallelism comes from the concepts of the 'qubit', 'superposition', and 'entanglement', derived from quantum physics.

I. Quantum Computing:

a) ‘Quantum’:

A quantum is the smallest possible unit of any physical entity, such as energy or mass. In 1900, Max Planck proposed that, at the atomic and subatomic level, the energy of a body is contained in discrete packets called 'quanta'.

Wave-particle duality is the tendency of quantum particles to behave as a wave in some circumstances and as a particle in others, depending on the environment. Quantum theory is characterized by finding the probability of a particle being at a given point x in space, rather than its exact location.

Fig 1: The dual nature of light, which acts like both particles and waves. (Source)

b) Qubit:

A classical computer performs operations using classical 'bits', which are either 0 OR 1. A quantum computer instead uses quantum bits, also called 'qubits', to perform operations.

Qubits can be represented by:

• An electron orbiting a nucleus: where |1⟩ and |0⟩ are the excited state and ground state, respectively.
• A photon: where |1⟩ and |0⟩ are two polarizations of the photon.

c) Superposition:

A qubit can exist as both 0 AND 1 at the same time. This phenomenon is called 'superposition'.

Although a particle can exist in multiple quantum states, once we measure it for its energy or position, its superposition is lost and it then exists in only one state.

Fig 2: The qubit is defined as a pair of complex vectors pointing to a spot on a unit sphere. Traditionally, a qubit pointing directly up (positive on the axis) is denoted as the column vector |0⟩ and the vector pointing down as |1⟩. (In this case, the qubit shown is |0⟩.) (Source)
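The amplitude-and-measurement picture above can be sketched numerically. The following is a minimal illustration in NumPy (not tied to any quantum SDK) of a qubit in an equal superposition collapsing to a single basis state on measurement:

```python
import numpy as np

# A single qubit is a pair of complex amplitudes (alpha, beta)
# with |alpha|^2 + |beta|^2 = 1. Here: equal superposition of |0> and |1>.
state = np.array([1, 1], dtype=complex) / np.sqrt(2)

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(state) ** 2          # [0.5, 0.5]

# Measuring samples one outcome according to those probabilities...
rng = np.random.default_rng(0)
outcome = rng.choice([0, 1], p=probs)

# ...and the superposition is lost: the post-measurement state is
# the basis vector corresponding to the observed outcome.
state = np.eye(2, dtype=complex)[outcome]
print(outcome, state)
```

The 50/50 probabilities here are specific to the equal superposition; any normalized amplitude pair is a valid qubit state.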

d) Entanglement:

‘Quantum entanglement’ is the phenomenon in which quantum particles interact with each other and must be described with reference to each other, not independently, even if the particles are separated by a large distance.

At the time of measurement, if one entangled particle in a pair is found to be in the spin state of ‘down’ (that is, the lowest-energy state, in which the electron is aligned with the magnetic field), then the other, correlated particle immediately assumes the opposite spin state of ‘up’. Quantum entanglement thus allows qubits, even those far apart, to exhibit perfectly correlated behaviour.
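As an illustration only, the "opposite spins" correlation can be mimicked with a two-qubit singlet state in NumPy. This is a classical simulation of the statistics, not real entanglement:

```python
import numpy as np

# Singlet state (|01> - |10>) / sqrt(2): amplitudes for the
# two-qubit basis states 00, 01, 10, 11.
singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

# Joint measurement probabilities: only outcomes 01 and 10 can occur.
probs = np.abs(singlet) ** 2        # [0, 0.5, 0.5, 0]

rng = np.random.default_rng(1)
samples = rng.choice(4, size=1000, p=probs)

# Decode each joint outcome into the two individual spins: they always
# disagree, so measuring one particle fixes the other's result.
first, second = samples // 2, samples % 2
assert np.all(first != second)
```

Each particle's outcome on its own looks like a fair coin flip; the entanglement shows up only in the joint statistics.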

How does Quantum computing unlock immense parallelism?

Two interacting classical bits can take one of four forms: 00, 01, 10, or 11. The two components of information, the first bit and the second bit, combine to represent only one binary configuration at a given time. Adding more bits to a regular computer still yields only a single binary configuration at a time.

Fig 3: One qubit in superposition before measurement, with its probabilities of ‘spin-up’ AND ‘spin-down’. (Source)

One qubit can exist in both states (0 AND 1) at once. Thus, two interacting qubits can store all four binary configurations simultaneously. In general, ‘n’ qubits can simultaneously represent 2^n classical binary configurations, so a 300-qubit quantum computer could explore 2^300 possible solutions at the same time, rather than one solution at a time as on a classical computer. This is the source of the immense parallelism: each qubit added to a quantum computer exponentially increases its power.
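The 2^n growth can be seen directly in a state-vector simulation: the joint state of an n-qubit register is the tensor (Kronecker) product of the individual qubit states, so its length doubles with every added qubit. A NumPy sketch:

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)   # the single-qubit state |0>

# Build a 10-qubit register |00...0> by taking the Kronecker product
# of ten single-qubit states; each product doubles the vector length.
state = ket0
for _ in range(9):
    state = np.kron(state, ket0)

print(len(state))                        # 1024 == 2**10 amplitudes
```

This doubling is also why classical simulation of quantum computers becomes infeasible beyond a few dozen qubits.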

A full-scale quantum computer has not yet been realized: adding more qubits is daunting, because the subatomic particles involved require a temperature as low as -452 F to remain stable, and building a computer around them (a ‘quantum computer’) is more daunting still. Thus, efforts are under way to ‘simulate’ 40-qubit operations using Microsoft’s quantum simulator, LIQUi|>, extended by Microsoft Azure’s cloud computing resources.

Quantum computing can solve specialized scientific problems such as molecular modelling, the creation of high-temperature superconductors, drug modelling and testing, and the selection of molecules for organic batteries. It is not optimal for general-purpose tasks such as watching videos or writing a Word document.

Now, how does Quantum Computing fit in with Machine Learning?

II. Quantum ML:

2a) Quantum versions of ML algorithms

• Finding eigenvalues and eigenvectors of large matrices:

One way to perform the classical PCA algorithm is to take the eigenvalue decomposition of the data covariance matrix. However, this is not efficient for high-dimensional data.

Quantum PCA of an unknown low-rank density matrix can reveal the quantum eigenvectors associated with the largest eigenvalues exponentially faster than a linearly scaled classical algorithm.
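For reference, here is the classical step that quantum PCA accelerates, the eigendecomposition of a data covariance matrix, sketched in NumPy on toy data:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))       # 200 samples, 5 features (toy data)

# Classical PCA: eigendecomposition of the data covariance matrix.
Xc = X - X.mean(axis=0)             # center the data
cov = Xc.T @ Xc / (len(Xc) - 1)
eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order

# Keep the eigenvectors with the largest eigenvalues (here: top 2)
# and project the data onto them.
top = eigvecs[:, np.argsort(eigvals)[::-1][:2]]
projected = Xc @ top                # data reduced to 2 dimensions

print(projected.shape)              # (200, 2)
```

Extracting those dominant eigenvectors is exactly the step the quantum algorithm targets, with an exponential speed-up claimed for low-rank inputs.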

• Finding nearest neighbours on a quantum computer:

The quantum algorithms presented here for computing nearest neighbours, as used in supervised and unsupervised learning, place an upper bound on the number of queries to the input data required to compute distance metrics such as the Euclidean distance and the inner product. The best cases show exponential and super-exponential reductions in query complexity, and even the worst case shows a polynomial reduction in query complexity over the classical analogue.
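For contrast, a minimal classical baseline in NumPy, assuming the Euclidean metric; this is the kind of distance computation whose query complexity the quantum algorithms reduce:

```python
import numpy as np

def nearest_neighbour(query, data):
    """Return the index of the point in `data` closest to `query`
    under the Euclidean distance (the classical baseline)."""
    dists = np.linalg.norm(data - query, axis=1)
    return int(np.argmin(dists))

data = np.array([[0.0, 0.0], [1.0, 1.0], [4.0, 5.0]])
print(nearest_neighbour(np.array([0.9, 1.2]), data))   # index 1
```

Classically, every distance requires touching the full input vectors; the quantum algorithms bound how many such data queries are needed.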