Variational Quantum Algorithms
FULL TRANSCRIPT
Let's dig into the subject of variational quantum algorithms. Variational circuits are the practical embodiment of the idea that we want to train quantum computers the same way we train neural networks.

The basic way a variational quantum circuit works is this: there is some quantum circuit that forms the basic subroutine of a larger algorithm. The quantum subroutine takes in a state preparation, or input data x, and it also has some circuit parameters theta, and it outputs some measurement statistics. These measurement statistics go through some classical processing, and then you use some optimizer or update rule to update the parameters in an outer classical optimization loop. Variational circuits are also called parametrized quantum circuits, or even quantum neural networks.
A variational circuit consists of the following ingredients. First, we prepare some initial state psi; as usual, this is often a ground state, a zero state, or some fixed reference state. Then we execute some parametrized unitary transformation, which breaks down to a sequence of gates. The word "parametrized" is important here, because that's where the variational parameters come in: the things we're going to vary are the parameters of the gates. The architecture of the circuit is fixed, but the parameters fed to the gates are not. Finally, to convert quantum information back to classical information, we measure some particular observable, which we'll call B.
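The loop described above can be sketched end to end for a single qubit. This is a minimal NumPy simulation, not any particular quantum library's API; RY(theta) stands in for the parametrized unitary, Pauli-Z plays the role of the observable B, and a finite-difference gradient stands in for a generic update rule.

```python
import numpy as np

def ry(theta):
    """Single-qubit Y-rotation gate (the parametrized gate)."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def circuit_expectation(theta):
    """Prepare |0>, apply the parametrized unitary, measure <Z> (the observable B)."""
    psi = np.array([1.0, 0.0])          # fixed reference state |0>
    psi = ry(theta) @ psi               # parametrized unitary U(theta)
    Z = np.diag([1.0, -1.0])            # observable B = Pauli-Z
    return float(psi.conj() @ Z @ psi)  # measurement statistic <psi|B|psi>

# A few steps of the outer classical loop: gradient descent on <Z>,
# using a simple finite-difference gradient for illustration.
theta, lr, eps = 0.5, 0.4, 1e-6
for _ in range(100):
    grad = (circuit_expectation(theta + eps) - circuit_expectation(theta - eps)) / (2 * eps)
    theta -= lr * grad

print(circuit_expectation(theta))  # approaches the minimum <Z> = -1 at theta = pi
```

The quantum part here is just the function `circuit_expectation`; everything around it, the optimizer and the update rule, is ordinary classical code, which is exactly the hybrid structure described above.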
A particular variational algorithm will contain a few fundamental ingredients. First, you need to decide on a circuit ansatz. An ansatz is the structure, the architecture, of the circuit; this is sometimes fixed in place for a particular algorithm, and sometimes it's up to the user to decide what it should be. We also need a problem-specific cost function: something that codifies a particular objective of interest that we want to minimize or maximize, and relates it back to the outputs of the quantum circuit. And then we need a training algorithm. I would like to really focus on gradient descent, but you could use other training procedures if you'd like. What the training algorithm should do is take some function computed from the output measurements of the quantum circuit and update the circuit's parameters based on that information.
A famous example of a variational circuit is the variational quantum eigensolver (VQE), one of the very first variational algorithms. It's focused in particular on quantum chemistry problems: simulating chemicals using quantum computers. The three ingredients I said you need are the ansatz, the cost function, and the training. The ansatz could be something heavily tied to the intuition or the chemical or physical nature of the problem; in this case, something called unitary coupled cluster singles and doubles, a particular ansatz related to quantum chemistry. We need a problem-specific cost function, and in the case of VQE this is actually an energy measurement: you have some Hamiltonian observable whose measured output gives the energy of your circuit. And then there's the training procedure; in this particular example I've chosen gradient descent, and I'm going to use the parameter-shift rule to compute the gradients. The goal is to minimize the cost function, which is the energy, so it's going to find the minimum-energy state of a particular Hamiltonian, a particular physical system.
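The parameter-shift rule mentioned here can be illustrated directly. For a gate of the form exp(-i theta P / 2) with a Pauli generator P, the exact gradient of an expectation value is [E(theta + pi/2) - E(theta - pi/2)] / 2, i.e. two extra circuit evaluations rather than a finite difference. A minimal single-qubit NumPy sketch (illustrative, not VQE itself):

```python
import numpy as np

def rx(theta):
    """RX gate, exp(-i theta X / 2)."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -1j * s], [-1j * s, c]])

def energy(theta):
    """<Z> on RX(theta)|0>; plays the role of the VQE energy here."""
    psi = rx(theta) @ np.array([1.0, 0.0])
    Z = np.diag([1.0, -1.0])
    return float(np.real(psi.conj() @ Z @ psi))

def parameter_shift_grad(f, theta):
    """Exact gradient from two shifted evaluations."""
    return 0.5 * (f(theta + np.pi / 2) - f(theta - np.pi / 2))

theta = 0.7
print(parameter_shift_grad(energy, theta))  # matches d/dtheta cos(theta) = -sin(0.7)
```

Unlike a finite difference, the shifts are large (pi/2), so the rule stays well-conditioned even when each evaluation is estimated from noisy measurement shots.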
Another example is QAOA, which I've also heard pronounced "kwawa," though I don't really like that name. It stands for the quantum approximate optimization algorithm. In this particular case the ansatz has a very particular structure that comes from the initial statement of the problem itself, and it consists of a repeated layering of different sub-circuits: there's a cost circuit, which implements something related to a cost function, and there's a mixer circuit, which implements something that coherently moves between different configurations. In the case of QAOA, the cost function encodes an optimization problem. You might have a discrete optimization problem with many clauses that have to be satisfied, and you can encode this into an Ising-type model, a spin-chain-type model, which can then be converted to observables that you would measure on a quantum circuit.
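As a sketch of how clauses become an Ising-type observable, here is a toy MaxCut instance (a hypothetical graph, chosen purely for illustration). Each edge contributes a clause that is satisfied when its endpoints take different values, and the resulting cost Hamiltonian C = sum over edges (i, j) of (1 - Z_i Z_j)/2 is diagonal in the computational basis:

```python
import numpy as np

edges = [(0, 1), (1, 2), (0, 2), (2, 3)]  # toy graph on 4 nodes (assumed example)
n = 4

def cut_value(bits):
    """Classical cost: number of satisfied (cut) edges for a bitstring."""
    return sum(bits[i] != bits[j] for i, j in edges)

# Diagonal of the cost Hamiltonian C over all 2^n basis states: this is
# exactly the observable a QAOA circuit would measure after its layers.
diag = np.array([cut_value([(k >> q) & 1 for q in range(n)]) for k in range(2 ** n)])

print(diag.max())  # -> 3, the best cut; QAOA tries to concentrate amplitude there
```

The triangle 0-1-2 can have at most two of its three edges cut, plus the edge (2, 3), so the optimum here is 3. QAOA's cost layer applies phases proportional to this diagonal, and the mixer layer moves amplitude between bitstrings.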
In this particular example I'm going to use gradient descent, but instead of a standard optimizer I'm going to use a shots-frugal optimizer, for instance something called Rosalin. So there are lots of different ingredients at play here that you can pick and choose, or that might be specifically chosen by the variational algorithm itself.
There are a number of different variational algorithms out there in the literature, actually quite a few, and you can break them down into different subject areas. There are ones related to chemistry or physics: these prepare quantum states that emulate physical systems, or tell you interesting properties of physical systems. There are variational algorithms related to mathematical problems, such as factoring or solving linear equations; these can be seen as near-term candidates to replace things like Shor's algorithm, or the HHL algorithm for solving linear systems of equations. There are also a number of variational algorithms tied to machine learning, and this is not too surprising, because variational algorithms inherit a lot of structure from machine learning, so machine learning is one of the natural application areas. There are quantum generative adversarial networks, quantum classifiers, and different kinds of quantum neural networks, for instance recurrent, graph, or convolutional quantum versions of neural networks, as well as optical implementations. It's quite an interesting research area right now with lots of ideas out there, and I encourage you to check out the website at the bottom of the slide if you want to see some more examples.
A little bit more about the ansatz. "Ansatz" is a German word; in physics it means something like an educated guess, or an additional assumption that's chosen at the start but which is verified as being correct over the course of the problem. In the case of variational circuits, the ansatz is the particular structure of the quantum circuit. As I said, sometimes the structure is completely fixed by the problem, and other times the structure is more flexible, even completely selectable by the user. In VQE in particular, there's no requirement to use any particular ansatz; the only thing that's fixed is the cost function. In QAOA, by contrast, the cost function influences the actual circuit that you're using. So the ansatz is an important ingredient, there are lots of ways to choose it, and it's still very much an open question what the best ansätze are for variational circuits for different problems.
There can be many reasons for choosing a particular ansatz. There could be some intuition, some logical, physical, or mathematical basis for the choice. I did mention that VQE doesn't force you to select an ansatz, but you might want to choose one that we know is likely to be similar to how actual chemical systems work. The ansatz can be dictated by the structure of the problem itself; it can come from intuition borrowed from other fields, like machine learning; it can be something chosen to make the circuit more trainable; or it can be essentially arbitrary: you can use your imagination, and there may be no reason to favor one ansatz over another. But the choice of ansatz will affect the quality of the model you're able to learn, or the quality of the answer you're able to get from your variational circuit. One general piece of advice: the deeper you can make your ansatz, typically the more expressive it can be and the better the results you'll get.
Another important thing to take into account is the input data. Circuits don't just have free parameters; sometimes you also need to input data into them. In some problems it's not necessary, but in others it is. So how do we actually input classical data into a quantum variational circuit? There are a number of different strategies you could take here, and it's still very much an open research question how to embed classical data into a quantum circuit.

One of the simplest choices is to say: the easiest way for a parameter to enter a circuit is through a rotation of a single qubit, so I'll just rotate a single qubit in proportion to the value of a single data point, a single scalar value. That's very common, but I really want to warn people that this is not sufficient. If you do that, then the only thing your circuit will ever produce as a function of this input data is a simple sine function. The story is much more complicated, and there still needs to be a lot of exploration to find optimal or reasonable ways to embed data.
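The warning above can be checked numerically: with a single-rotation embedding, the circuit's output as a function of the data x is exactly one sinusoid, no matter what x you feed in. A NumPy sketch (illustrative, not a library API):

```python
import numpy as np

def rx(x):
    """RX gate used to encode the scalar data value x."""
    c, s = np.cos(x / 2), np.sin(x / 2)
    return np.array([[c, -1j * s], [-1j * s, c]])

def model(x):
    """<Z> after encoding x with a single RX rotation on |0>."""
    psi = rx(x) @ np.array([1.0, 0.0])
    return float(np.real(psi.conj() @ np.diag([1.0, -1.0]) @ psi))

xs = np.linspace(-np.pi, np.pi, 7)
outputs = [model(x) for x in xs]
# Every output lies exactly on cos(x): a single-frequency function of the data.
print(np.allclose(outputs, np.cos(xs)))  # -> True
```

No choice of downstream processing on that one qubit changes this: the model is a fixed sinusoid in x, which is far too simple for most datasets.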
To mention some strategies that are already available and go beyond this very simple initial one: there's something called data re-uploading, where the idea is to embed the data not with a single rotation but with a sequence of repeated rotations, possibly with free parameters in between. This makes a more complex function available to your circuit than a single rotation would. The other idea is to have a trainable embedding layer: don't worry so much about training the unitary of the circuit, worry about training the embedding, and then use standard quantum information metrics and tricks to, for instance, classify the data. Learnable embeddings are also a very viable strategy.
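A small NumPy sketch of data re-uploading (a toy model, with an assumed trainable RY rotation between two encodings of the same x): Fourier-analyzing the output shows that the repeated encoding populates a frequency that the single-rotation embedding cannot produce.

```python
import numpy as np

def rx(t):
    c, s = np.cos(t / 2), np.sin(t / 2)
    return np.array([[c, -1j * s], [-1j * s, c]])

def ry(t):
    c, s = np.cos(t / 2), np.sin(t / 2)
    return np.array([[c, -s], [s, c]])

def model(x, w):
    """<Z> after RX(x) . RY(w) . RX(x) |0>: the data x is uploaded twice."""
    psi = rx(x) @ ry(w) @ rx(x) @ np.array([1.0, 0.0])
    return float(np.real(psi.conj() @ np.diag([1.0, -1.0]) @ psi))

# Fourier-analyze the model as a function of x: a single-rotation embedding
# only ever produces frequency 1, but re-uploading reaches frequency 2.
N, w = 16, 1.0
samples = np.array([model(2 * np.pi * k / N, w) for k in range(N)])
coeffs = np.fft.rfft(samples) / N
print(abs(coeffs[2]) > 1e-6)  # frequency-2 component is present
```

Each additional upload of x can add another frequency, so stacking encodings (with trainable gates in between) builds a richer function class out of the same scalar input.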
When we're using variational circuits, if you can compute the gradient using the parameter-shift rule, then that opens up every possible flavor of gradient descent you could want. There's standard gradient descent, but in deep learning there are all sorts of other gradient descent optimizers available to you; the ones you see most commonly are probably momentum and Adam. But there are also a number of quantum-aware optimizers you could use, things that inherently take into account that you're optimizing a quantum circuit and not just a black box. I've put three different examples of quantum-aware optimizers here.

One is actually a pair of optimizers called Rotosolve and Rotoselect; they're in the same family. These don't use gradients at all. Instead, they recognize that there's a sinusoidal structure to quantum circuits: if you only ever look along one direction in parameter space, you can find a minimum quite easily just by going to the minimum of that particular sinusoid. You can then iterate through that process many times and hope that, via a sequence of individual jumps to local minima, you eventually end up in a global minimum.
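That coordinate-wise jump can be written down in a few lines. Along one parameter the cost is a sinusoid A cos(theta - phi) + C, so three evaluations determine its minimizer in closed form, no gradients needed. This is a sketch of a single Rotosolve-style update on a toy one-parameter circuit, not the full loop over all parameters:

```python
import numpy as np

def rx(t):
    c, s = np.cos(t / 2), np.sin(t / 2)
    return np.array([[c, -1j * s], [-1j * s, c]])

def cost(theta):
    """<Z> on RX(theta)|0> = cos(theta); global minimum at theta = +/- pi."""
    psi = rx(theta) @ np.array([1.0, 0.0])
    return float(np.real(psi.conj() @ np.diag([1.0, -1.0]) @ psi))

def rotosolve_step(f, theta):
    """Jump straight to the minimizer of the sinusoid through three samples."""
    f0, fp, fm = f(theta), f(theta + np.pi / 2), f(theta - np.pi / 2)
    return theta - np.pi / 2 - np.arctan2(2 * f0 - fp - fm, fp - fm)

theta = rotosolve_step(cost, 0.3)    # one step from an arbitrary starting point
print(np.isclose(cost(theta), -1.0))  # -> True: already at the global minimum
```

In the multi-parameter setting the same closed-form jump is applied to each parameter in turn, sweeping through the coordinates repeatedly.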
Another cool quantum-aware optimizer is called quantum natural gradient. The basic idea here is that the inherent geometry of quantum computing circuits and quantum physical systems is not Euclidean: it doesn't have the rectilinear structure of the world around us; it's more of a sinusoidal structure. We should take that into account and adjust for the inherent geometry of the space we're optimizing in.
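A one-parameter sketch of the idea (a toy NumPy example with an assumed setup): the Fubini-Study metric measures how far the quantum state actually moves when a parameter changes, and the natural-gradient step rescales the ordinary gradient by the inverse of that metric.

```python
import numpy as np

def rx(t):
    c, s = np.cos(t / 2), np.sin(t / 2)
    return np.array([[c, -1j * s], [-1j * s, c]])

def state(theta):
    return rx(theta) @ np.array([1.0, 0.0])

def cost(theta):
    """<Z> on RX(theta)|0> = cos(theta)."""
    psi = state(theta)
    return float(np.real(psi.conj() @ np.diag([1.0, -1.0]) @ psi))

def fs_metric(theta, eps=1e-5):
    """Fubini-Study metric g = Re<dpsi|dpsi> - |<psi|dpsi>|^2 (finite differences)."""
    psi = state(theta)
    dpsi = (state(theta + eps) - state(theta - eps)) / (2 * eps)
    return float(np.real(dpsi.conj() @ dpsi) - abs(psi.conj() @ dpsi) ** 2)

theta, lr = 0.5, 0.1
for _ in range(100):
    grad = 0.5 * (cost(theta + np.pi / 2) - cost(theta - np.pi / 2))
    theta -= lr * grad / fs_metric(theta)  # rescale by the inverse metric
print(abs(cost(theta) + 1.0) < 1e-3)
```

For this circuit the metric is constant (1/4), so the natural gradient just rescales the learning rate; in multi-parameter circuits the metric becomes a matrix that reshapes the whole descent direction.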
There's also a family of optimizers called shots-frugal. These recognize that on current-day quantum computers the number of circuit executions is a very precious commodity, and if you have to wait in an online queue, this can really slow down your optimization. So these optimizers are much more frugal in how they spend shots: they rely on a much smaller number of shots, especially earlier in training, to get the estimates you need to train the circuit.
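Here is a sketch of the shots-frugal idea (a hypothetical fixed schedule, not the Rosalin algorithm itself): estimate expectation values from a small number of simulated measurement shots early in training, when noisy gradients are tolerable, and increase the budget later.

```python
import numpy as np

rng = np.random.default_rng(0)

def estimate_expval(theta, shots):
    """Shot-based estimate of <Z> = cos(theta) on a single qubit."""
    p0 = np.cos(theta / 2) ** 2        # probability of the +1 outcome
    ones = rng.binomial(shots, p0)
    return (2 * ones - shots) / shots  # sample mean of +/-1 outcomes

theta, lr = 0.5, 0.4
for step in range(60):
    shots = 10 if step < 30 else 1000  # frugal early, accurate late
    grad = 0.5 * (estimate_expval(theta + np.pi / 2, shots)
                  - estimate_expval(theta - np.pi / 2, shots))  # parameter shift
    theta -= lr * grad

print(np.cos(theta))  # should end up close to the minimum value -1
```

The parameter-shift estimator stays unbiased at any shot count, so cheap, noisy gradients still make progress on average; spending most of the shot budget near convergence is what saves circuit executions.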
The final thing I want to mention, and this is a topic people have probably heard of before, is the notion of barren plateaus for variational circuits. The basic idea, and it's similar to something that happens in deep learning, is that there are parts of the optimization landscape where the gradient is zero, and anywhere you go around it the gradient is also zero. These regions are very flat, so it's hard to use a gradient descent strategy, because everything looks completely flat in every direction. Barren plateaus actually arise from a number of different effects; there's not just one cause. They can come from the choice of your circuit ansatz, from the choice of parametrization or parameter values, or from your cost function. There are a number of different proposals for overcoming them, but I would say it's still an open question how to avoid them in general. You could use a specialized initialization strategy; you could expand your circuit layer-wise, bit by bit, trying to avoid the problem by keeping the circuit short in the earlier stages of training; or you could go with adiabatic-type approaches, where you have a very slow evolution towards a target goal.
These are nice pictures that illustrate the barren plateau phenomenon: if you're sitting anywhere in this landscape except right at the center, you really can't look in any direction and see anything but flatness. So barren plateaus are an interesting barrier that we'll have to overcome in order to train variational quantum circuits.