Implementing the Variational Hybrid Quantum-Classical Algorithm
Learn how we can implement the variational hybrid quantum-classical algorithm.
The scores of the random quantum classifier
```python
# Tell Qiskit how to simulate our circuit
backend = Aer.get_backend('statevector_simulator')

classifier_report(
    "Variational", run,
    lambda passenger: post_process(pqc(backend, pre_process(passenger))),
    train_input, train_labels)
```
In line 2, we first create the statevector_simulator backend, which we can reuse for all our predictions. We use the classifier_report wrapping function we developed in the lesson Unmask the Hypocrite Classifier. Besides an arbitrary name for the output in line 5, its primary input is the classifier we provide in line 6. We provide an anonymous lambda function, which is a function without a name, as our classifier. It takes a single parameter, passenger, and evaluates, from inner to outer: first the pre_process function with the passenger as its argument, then the pqc function with that result alongside the backend, and finally the post_process function with the result of pqc.
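Written out as a regular function instead of a lambda, the classifier we pass in is equivalent to something like the following sketch (the name classify is chosen here only for illustration; pre_process, pqc, and post_process are the helpers from the earlier lessons):

```python
def classify(passenger):
    # From inner to outer: encode the passenger, run the quantum circuit, interpret the result
    passenger_state = pre_process(passenger)   # seven features -> quantum state vector
    counts = pqc(backend, passenger_state)     # measure the state on the simulator backend
    return post_process(counts)                # counts -> 0 (died) or 1 (survived)
```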
When we run the pqc classifier with the initial state, we can see that it yields the same scores as the random classifier.
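Why do the scores match? If the initial state handed to pqc is the balanced superposition [1/√2, 1/√2] (an assumption here, carried over from the previous lesson), every measurement yields 0 or 1 with equal probability, which is exactly what the random classifier does. A minimal, self-contained sketch of that behavior:

```python
from math import sqrt
from qiskit import Aer, QuantumCircuit, execute

# Assumed initial state: equal amplitudes for |0> and |1>
initial_state = [1 / sqrt(2), 1 / sqrt(2)]

qc = QuantumCircuit(1, 1)
qc.initialize(initial_state, 0)  # prepare the state
qc.measure(0, 0)                 # measure it into the classical bit

# A shot-based simulator makes the 50/50 behavior visible
counts = execute(qc, Aer.get_backend('qasm_simulator'), shots=1000).result().get_counts()
print(counts)  # roughly {'0': 500, '1': 500} -- indistinguishable from random guessing
```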
Now, it’s finally time to build a real classifier: one that uses the actual passenger data to predict whether a passenger survived the Titanic shipwreck.
Let’s start at the end. The current post-processing already returns either 0 or 1. This fits our required output, since 0 represents a passenger who died and 1 represents a passenger who survived.
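A post-processing step with exactly this behavior could look like the sketch below, assuming pqc returns a Qiskit counts dictionary such as {'0': 1} or {'1': 1}:

```python
def post_process(counts):
    # Pick the most frequently measured basis state and return it as an integer:
    # 0 means the passenger died, 1 means the passenger survived
    return int(max(counts, key=counts.get))
```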
The current pqc measures the provided quantum state vector and returns the counts. We could leave it unchanged if we provided, as input, a state vector whose measurement probability corresponds to the passenger’s actual likelihood of surviving.
The passenger data consists of an array of seven features. We have already transformed all features into numbers between 0 and 1 in the lesson Data preparation and cleaning. Thus, the pre-processing task is to translate these seven numbers into a quantum state vector whose measurement probability corresponds to the passenger’s actual likelihood of survival.
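The target format of that state vector is easy to pin down even before we know how to compute the probability: for a single qubit, a survival probability p corresponds to the amplitudes [√(1−p), √p]. The sketch below uses the plain mean of the seven features as a placeholder for p; this weighting is an assumption for illustration only, not the lesson's solution:

```python
from math import sqrt

def pre_process(passenger):
    # Placeholder: treat the mean of the seven normalized features as the survival probability p.
    # Choosing a good mapping from features to p is the actual machine-learning task.
    p = sum(passenger) / len(passenger)
    # Measuring this state yields |1> ("survived") with probability p and |0> ("died") with 1 - p
    return [sqrt(1 - p), sqrt(p)]
```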
Finding such a probability is the innate objective of any machine learning algorithm.
Our data consists of seven features. The central assumption is that these features determine or at least affect whether a passenger survived or not. If that weren’t the case, we wouldn’t be able to predict anything reliably. So let’s assume the features determine survival.
The question is, how do these seven features determine survival? Is one feature more important than another? Is there a direct relationship between a feature and survival? Are ...