Quantum Machine Learning — The Next Big Thing

Deep Dutta
Published in The Startup · 8 min read · Jan 17, 2021


Learn Quantum Machine Learning from scratch.

In today’s era, where quantum information science is gaining popularity, Quantum Machine Learning is set to be the next big thing in information science and technology. Put simply, Quantum Machine Learning is the combination of Quantum Computing and Machine Learning, and you are going to learn it from scratch.

What is Quantum Computing?

Quantum Computing is computation performed on a quantum computer that cannot practically be done on classical computers, because of limits on computational speed, computational space and many other factors. The machines that perform quantum computation are called quantum computers, and this computation is carried out using quantum properties such as superposition, entanglement and interference.

What is the difference between a Quantum Computer and a Classical Computer?

Differences between CML and QML

The basic difference is that classical computers work with bits, while quantum computers work with qubits. If we want to store data on a classical computer, it is first converted into a specific combination of 0’s and 1’s, and that binary data is stored in bits on the hard drive. A hard drive contains magnetic domains with a magnetic polarization, and we can change the magnetization of each domain to point up or down.

A qubit, on the other hand, can take any combination of the two binary outcomes using superposition. We can think of it as a spin: it can be spin up or spin down, but if it is isolated well enough it can also be in a superposition of up and down.

What are the basic Quantum Properties?

The three basic properties used in quantum information science are superposition, entanglement and interference. Let’s discuss each of them briefly.

Superposition: As discussed above, a superposition is not just 0 or 1; it is a state that is a combination of 0 and 1. An example makes this easy to understand. Suppose I have a penny and its two outcomes, head and tail, are assigned 0 and 1 respectively. If we place the penny flat on the table and ask anyone whether it shows head or tail, they can easily answer. That is like a bit in a classical computer. But if we now spin the penny and ask the same question, we cannot answer it, because the penny can be in any combination of head and tail. That is much more like a qubit.

Entanglement: In simple words, if we entangle two qubits they become connected, more or less permanently, and from then on they behave as a single system. That is entanglement. They are connected in such a way that the quantum state of each particle of the pair or group cannot be described independently of the state of the others. An example makes it simpler: suppose we have two pennies (think of them as qubits) and they are entangled. Then even if we spin the two pennies individually, when they stop they will show the same face (head or tail).

Interference: Think about noise-cancelling headphones. How do they work? They read the ambient sound waves and then produce opposite ones to cancel them out; in other words, they create interference. Interference can be of two types, constructive or destructive. In constructive interference the wave amplitudes add, so the signal gets larger; in destructive interference the waves form a resultant wave of lower amplitude. This property is used to control quantum states: it amplifies the signals that lead towards the right answer and cancels those that lead towards the wrong answer.
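To make superposition and entanglement a little more concrete, here is a minimal Qiskit sketch (not part of the article’s original code) that puts one qubit into superposition with a Hadamard gate, entangles two qubits into a Bell state, and measures both circuits on a simulator.

```python
# Minimal illustration of superposition and entanglement with Qiskit.
from qiskit import QuantumCircuit, BasicAer, execute

# Superposition: a Hadamard gate puts one qubit into an equal mix of 0 and 1,
# like the spinning penny.
superposition = QuantumCircuit(1, 1)
superposition.h(0)
superposition.measure(0, 0)

# Entanglement: a Hadamard followed by a CNOT creates a Bell state, like the
# two entangled pennies that always land showing the same face.
bell = QuantumCircuit(2, 2)
bell.h(0)
bell.cx(0, 1)
bell.measure([0, 1], [0, 1])

backend = BasicAer.get_backend('qasm_simulator')
print(execute(superposition, backend, shots=1000).result().get_counts())  # roughly half '0', half '1'
print(execute(bell, backend, shots=1000).result().get_counts())           # only '00' and '11'
```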

What is Machine Learning?

Machine Learning is nothing but training a machine (a computer) with the help of lots of data, making the computer find patterns in that data, and applying what it finds to new data. You can think of it like how a baby learns to talk. He or she hears many words from the surroundings, in many different situations, over and over, and gradually learns when to say what. It is a continuous process that lasts a lifetime. We also encounter many words for the first time, notice a situation (a pattern, for a computer) in which to use each word, and afterwards use that word whenever we encounter the same situation; in that case we are training our brain with that data. A computer is also like a baby: it knows nothing except how to take some combination of 0’s and 1’s as input. So we train the computer with lots of data and use the learned pattern to perform a task on another set of data. We can solve many different types of problem using ML: in regression we predict values for a test set of data, and in classification we assign data points to different classes.

What is Quantum Machine Learning?

Now that we know about both Quantum Computing and Machine Learning, we can easily understand the concept behind Quantum Machine Learning: it is simply ML computation carried out on quantum computers, or rather on quantum instances, instead of classical computers. Here I am going to give a basic example of a classification problem where Quantum Machine Learning outperforms Classical Machine Learning, using the Support Vector Machine (SVM) algorithm. The basic difference between the classical SVM and the QSVM (Quantum SVM) is that the classical SVM runs on classical instances while the Quantum SVM runs on quantum instances. First, a brief look at SVM.

Support Vector Machine (SVM): Suppose we are dealing with a binary classification problem. The objective of the support vector machine algorithm is to find a hyperplane in the feature space that distinctly separates the data points. There are many possible hyperplanes that could separate the two classes; our objective is to find the one with the maximum margin, i.e. the maximum distance to the nearest data points of both classes. Maximizing the margin provides some reinforcement so that future data points can be classified with more confidence.
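As a quick illustration (not from the original article), here is a minimal classical SVM in scikit-learn: it fits a maximum-margin linear classifier on a toy two-class dataset and exposes the support vectors that define the margin.

```python
# A toy maximum-margin SVM with scikit-learn.
from sklearn.datasets import make_blobs
from sklearn.svm import SVC

# Two well-separated blobs of points, one per class.
X, y = make_blobs(n_samples=40, centers=2, random_state=6)

clf = SVC(kernel='linear', C=1000)  # large C -> (nearly) hard margin
clf.fit(X, y)

print(clf.support_vectors_)          # the data points that define the margin
print(clf.predict(X[:5]), y[:5])     # predictions on a few of the training points
```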

QML Frameworks: QML already has some good frameworks. TensorFlow has TensorFlow Quantum, IBM has Qiskit, and apart from those PennyLane has a good implementation of QML. All of these libraries come with good documentation, which you can go through if you want to know more. Here I am going to use Qiskit for the implementation.

Does Quantum Machine Learning Outperform Classical Machine Learning?

Now we are going to see an implementation where Quantum Machine Learning clearly outperforms Classical Machine Learning. For simplicity we are going to use a basic dataset, namely the ad hoc dataset. Apart from all the dependencies you would need on a classical computer, we have to import a quantum simulator (BasicAer), a feature map, or rather a quantum feature map (ZZFeatureMap), a quantum instance (QuantumInstance), and finally the QSVM itself.
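A sketch of those imports, written against the Qiskit Aqua API that was current when this article was published (in newer Qiskit releases these classes have moved to, or been replaced by, the qiskit-machine-learning package, so treat the exact module paths as version-dependent):

```python
import numpy as np
import matplotlib.pyplot as plt

from qiskit import BasicAer                       # local quantum simulators
from qiskit.circuit.library import ZZFeatureMap   # the quantum feature map
from qiskit.aqua import QuantumInstance, aqua_globals
from qiskit.aqua.algorithms import QSVM
from qiskit.ml.datasets import ad_hoc_data        # the ad hoc dataset used below
```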

We will take the feature dimension as 2 and the train and test sizes as 20 and 10 respectively. This is enough for our basic comparison, and another reason is that we don’t have enough stable quantum computers as of now. Apart from that, we fix a random seed and set the number of shots to 10000. We take the gap as 0.3, which is simply the separation in the higher dimensional space between the two classes of data. We are also going to plot the data and label the classes.

Loading and Plotting the Dataset
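Continuing the sketch above, the loading and plotting step might look like this (the seed value 10598 is just an example, since the article only says that a random seed is fixed):

```python
feature_dim = 2       # number of features, and therefore qubits
training_size = 20
test_size = 10
seed = 10598          # example seed; any fixed value works
shots = 10000
aqua_globals.random_seed = seed

# ad_hoc_data returns the full sample grid, the per-class training and test
# dictionaries, and the class labels ('A' and 'B').
sample_total, training_input, test_input, class_labels = ad_hoc_data(
    training_size=training_size,
    test_size=test_size,
    n=feature_dim,
    gap=0.3,           # separation between the two classes in the higher dimensional space
    plot_data=True,    # draws the scatter plot of classes A and B
)
```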

So the dataset looks like this, with two classes A and B labelled 0 and 1 respectively. Clearly, we need a hyperplane in a higher dimensional space to separate the two classes.

Have a look at the dataset

Now, to run the QSVM on our classical computer, we need a quantum simulator as the backend and a quantum instance that will run on that backend. So we take the qasm_simulator from BasicAer as the backend, and a feature map with reps=2, i.e. the quantum circuit is repeated twice. Then we run our QSVM on the quantum instance.

Running QSVM
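Sticking with the Aqua-era API, that run might be sketched like this; the kernel matrix built during training is pulled out of the result dictionary at the end:

```python
backend = BasicAer.get_backend('qasm_simulator')
feature_map = ZZFeatureMap(feature_dimension=feature_dim, reps=2, entanglement='linear')

quantum_instance = QuantumInstance(backend, shots=shots,
                                   seed_simulator=seed, seed_transpiler=seed)

qsvm = QSVM(feature_map, training_input, test_input)
result = qsvm.run(quantum_instance)

# Visualise the kernel matrix built during training.
plt.imshow(np.asmatrix(result['kernel_matrix_training']),
           interpolation='nearest', origin='upper', cmap='bone_r')
plt.title('QSVM kernel matrix (training)')
plt.show()
```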

After running the QSVM, we can check the kernel matrix built during training. The kernel matrix is the following:

QSVM Kernel Matrix

After the QSVM is trained, it’s time to predict the classes of the test set. As you can see clearly, the QSVM classifies both classes perfectly.
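A sketch of that prediction step, assuming the Aqua-era result dictionary exposes the test accuracy under 'testing_accuracy' (the exact key name can vary between releases):

```python
# Accuracy on the test dictionary that was passed to QSVM above.
print('QSVM test accuracy:', result['testing_accuracy'])

# Labels can also be predicted for raw feature vectors; here, purely as an
# illustration, the class-A test points.
predicted = qsvm.predict(test_input['A'], quantum_instance)
print('predicted labels for the class-A test points:', predicted)
```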

QSVM Prediction

Now that we have the accuracy of the QSVM, we are going to implement the classical SVM and compare the accuracy of the QSVM and the SVM. Qiskit ships the same scikit-learn SVM implementation, and we use that implementation here (a sketch follows). Its training kernel matrix is shown below the sketch:
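A sketch of that classical baseline, assuming the Aqua-era wrapper named SklearnSVM (earlier releases called it SVM_Classical) and assuming its run() result exposes the same keys as the QSVM result above:

```python
from qiskit.aqua.algorithms import SklearnSVM

# Same training and test dictionaries, but a purely classical SVM underneath.
svm = SklearnSVM(training_input, test_input)
svm_result = svm.run()

plt.imshow(np.asmatrix(svm_result['kernel_matrix_training']),
           interpolation='nearest', origin='upper', cmap='bone_r')
plt.title('Classical SVM kernel matrix (training)')
plt.show()

print('classical SVM test accuracy:', svm_result['testing_accuracy'])
```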

SVM Kernel Matrix

And we can see that the classical SVM classifies the two classes with only 65% accuracy. So here QML clearly outperforms classical ML. But why?

SVM Prediction

Why Does Quantum Machine Learning Outperform Classical Machine Learning?

Since we are dealing with SVMs, we will explain it in terms of SVMs. In classification, finding a separating hyperplane is often only possible in a higher dimensional space, which involves computing the distances between the data points in that space. If the dimension is very large, computing those distances is very computationally expensive. So we do something simpler, called the “kernel trick”: the kernel is an easily computable function that takes two data points and gives back a distance, and the kernel can be optimized to maximize the separation between the classes of our data. Unfortunately, some kernel matrices are difficult to compute classically, and this is where the quantum computer comes in. If the kernel cannot be optimized classically, Quantum Machine Learning shows a lot of promise in using the multidimensional computation space of the quantum computer to find the hyperplane. When data is mapped from its input dimension into the Hilbert space of the quantum computer, it is naturally cast into a higher dimensional space. That is why the QSVM performs better than the SVM here.
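To see what the quantum kernel actually computes, here is a small illustrative sketch (not the article’s code): each kernel entry is the squared overlap |⟨φ(y)|φ(x)⟩|² between the feature-map states of two data points, evaluated here with an exact statevector for clarity rather than on a shot-based simulator.

```python
import numpy as np
from qiskit.circuit.library import ZZFeatureMap
from qiskit.quantum_info import Statevector

def quantum_kernel_entry(x, y, feature_map):
    # Encode each data point with the feature map and build its statevector.
    state_x = Statevector.from_instruction(feature_map.bind_parameters(list(x)))
    state_y = Statevector.from_instruction(feature_map.bind_parameters(list(y)))
    # Squared inner product = similarity of the two points in Hilbert space.
    return np.abs(np.vdot(state_y.data, state_x.data)) ** 2

fm = ZZFeatureMap(feature_dimension=2, reps=2)
print(quantum_kernel_entry([0.1, 0.4], [0.1, 0.4], fm))  # identical points -> 1.0
print(quantum_kernel_entry([0.1, 0.4], [2.3, 5.1], fm))  # different points -> a smaller value
```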

You can find the full code implementation here,
