The Variational Quantum Eigensolver
An unsung hero of approximate quantum computing
Over the last year, the quantum computing community has made incredible progress on many of the things that are needed for quantum computing to start having an impact on real-world problems. This spans hardware (including the free and public IBM Q experience 5- and 16-qubit devices, and our recent 20Q and 50Q commercial announcement), software (with the release of Quantum Information Software Kit (Qiskit) – and other quantum programming packages), theory, characterization techniques, and methods for mitigating quantum errors. In particular, new theory now lets us examine what can be done with near-term hardware, giving us all the more evidence that we’re on the path to achieving a quantum advantage for real-world applications.
The IBM Q team, and much of the industry, agrees that the three nearest-term applications for quantum computing will be chemistry [1,2,3,4], optimization [5,6,7], and machine learning [8,9,10]. Over the last year, we have demonstrated progress towards all three on actual quantum hardware [4,9]. At least in the case of chemistry and optimization, significant progress with near-term quantum hardware has been driven by an algorithm called the Variational Quantum Eigensolver (VQE), a hybrid of classical and quantum computing: a classical computer varies some experimental parameters that control the preparation of a quantum state, and a quantum computer prepares that state and measures its properties.
Below is a rough description of how it works, but first, definitions of three key terms:
- wavefunction: a mathematical description of a quantum state
- Hamiltonian: a quantum energy operator that describes the total energy of a quantum system
- quantum gates: operations performed on qubits that manipulate their quantum states
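To make the "Hamiltonian" term concrete: once mapped onto qubits, a Hamiltonian is just a weighted sum of tensor products of Pauli operators, and its smallest eigenvalue is the ground-state energy VQE hunts for. Here is a minimal sketch with a two-qubit toy Hamiltonian; the coefficients are illustrative placeholders, not molecular data (for a real molecule they would come from the electronic orbital integrals):

```python
import numpy as np

# Single-qubit Pauli operators
I = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.diag([1.0, -1.0])

# A 2-qubit Hamiltonian as a weighted sum of Pauli tensor products.
# The weights (0.4, 0.4, 0.2) are made-up stand-ins.
H = (0.4 * np.kron(Z, I)
     + 0.4 * np.kron(I, Z)
     + 0.2 * np.kron(X, X))

# For a system this small we can diagonalize directly: the smallest
# eigenvalue is the exact ground-state energy.
ground = np.linalg.eigvalsh(H)[0]
print(f"ground-state energy: {ground:.4f}")
```

For two qubits this matrix is only 4x4, so exact diagonalization is trivial; the point of VQE is that the matrix doubles in size with every added qubit, which is exactly where preparing and measuring states on quantum hardware becomes attractive.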
Here’s what it looks like to use VQE to calculate bond length in a molecule (e.g. BeH2) using a quantum computer:
- Transform the Hamiltonian of the molecule to a qubit Hamiltonian. This means taking your representation of the electronic orbital interactions in the molecule and figuring out how to mimic key parts of it in the qubit system. Qualitatively, you can think of the interactions among electronic orbitals in the molecules as being captured by our ability to create entanglement in the qubit system. The larger the molecule you are trying to simulate, the more electronic orbitals you have, so the more qubits you need.
- Pick a “trial wavefunction,” or trial state, and encode it onto the quantum computer. Imagine that this trial state is a guess to the electronic configuration (since you don’t know the answer for the real molecule yet) of BeH2 at a given inter-atomic distance. Create a quantum state on the processor that represents that particular version of the BeH2 wavefunction by using a combination of entangling gates, single-qubit gates, and your choice of circuit depth (the number of sequential operations you can do, constrained by the available hardware).
- Estimate the energy of the trial state. This is done by measuring aspects of the quantum state you created in the previous step. Given what you know about the molecule’s Hamiltonian, you can relate this back to an energy in the molecule for that given electronic configuration.
- Feed this energy to an optimizer that runs on a classical computer. The optimizer then generates a new set of control parameters that create a new trial wavefunction on the quantum computer with lower energy. Rinse and repeat until the energy converges to its lowest value; this final energy is the estimate of the ground-state energy for the inter-atomic spacing you tried. (In the case of BeH2, the molecule is small enough to compare against results on a conventional computer.)
- Repeat steps 2-4 for Hamiltonians corresponding to different inter-atomic spacings. The spacing whose Hamiltonian has the lowest ground-state energy corresponds to the equilibrium configuration, and voila! You know the bond length. The ability to use a quantum computer to perform tasks that are typically hard for classical computers (like creating a trial state and measuring its energy) is an important part of why this approach is so promising. Of course, it's early days and there are many hurdles to overcome, but it's an exciting start and we expect this approach will scale.
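The variational loop in steps 2-4 can be sketched numerically. Everything below is an illustrative stand-in: a toy one-qubit Hamiltonian with made-up coefficients, a trial state built from a single rotation angle, and a coarse grid scan playing the role of the classical optimizer. On a real run, the quantum processor would prepare the trial state and estimate its energy from measurements; here NumPy does both:

```python
import numpy as np

# Pauli matrices: the building blocks of a qubit Hamiltonian
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.diag([1.0, -1.0])

# Toy 1-qubit Hamiltonian H = 0.5*Z + 0.3*X (coefficients are
# arbitrary stand-ins for what a molecular mapping would produce).
H = 0.5 * Z + 0.3 * X

def trial_state(theta):
    """Step 2: trial wavefunction from one Ry rotation applied to |0>."""
    return np.array([np.cos(theta / 2), np.sin(theta / 2)])

def energy(theta):
    """Step 3: the expectation value <psi|H|psi> for the trial state."""
    psi = trial_state(theta)
    return float(psi @ H @ psi)

# Step 4: classical "optimizer" -- here just a grid scan over the
# single variational parameter.
thetas = np.linspace(0, 2 * np.pi, 1000)
vqe_energy = min(energy(t) for t in thetas)

# The system is tiny, so we can check against exact diagonalization,
# just as the text suggests doing for BeH2 on a conventional computer.
exact = np.linalg.eigvalsh(H)[0]
print(f"VQE estimate: {vqe_energy:.6f}, exact: {exact:.6f}")
```

Step 5 would simply wrap this whole loop in an outer loop over Hamiltonians for different inter-atomic spacings and keep the spacing with the lowest converged energy.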
The cool thing about VQE is how versatile it is! Optimization problems can also be run on a quantum computer using VQE. Instead of the energy of a molecule, you'd represent the "cost function" (i.e. the thing you are trying to minimize or maximize) as a qubit Hamiltonian, which you would then address using the quantum computer. The rest of the steps would be the same: you would have some parameters to vary, each of which influences the cost, and for each trial state you would measure the cost.
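To illustrate recasting a cost function as a qubit Hamiltonian, here is a hypothetical Max Cut instance on a 3-node triangle graph. A standard encoding gives each edge (i, j) the term 0.5*(1 - Z_i Z_j), which contributes 1 exactly when the two qubits sit on opposite sides of the cut; the sketch below builds that Hamiltonian and checks its diagonal against the classical cut values:

```python
import numpy as np
from itertools import product

# Toy Max Cut instance: a triangle (every pair of 3 nodes is an edge).
edges = [(0, 1), (1, 2), (0, 2)]
n = 3

def cut_value(bits):
    """Classical cost: number of edges crossing the partition."""
    return sum(1 for i, j in edges if bits[i] != bits[j])

# Each edge (i, j) contributes 0.5 * (1 - Z_i Z_j) to the Hamiltonian.
Z = np.diag([1.0, -1.0])
I = np.eye(2)

def zz_term(i, j):
    """Tensor product with Z on qubits i and j, identity elsewhere."""
    ops = [Z if k in (i, j) else I for k in range(n)]
    out = ops[0]
    for op in ops[1:]:
        out = np.kron(out, op)
    return out

H = sum(0.5 * (np.eye(2 ** n) - zz_term(i, j)) for i, j in edges)

# Sanity check: the diagonal of H reproduces the classical cut values,
# so a VQE that maximizes <psi|H|psi> is maximizing the cut.
for idx, bits in enumerate(product([0, 1], repeat=n)):
    assert H[idx, idx] == cut_value(bits)
print("best cut:", int(max(np.diag(H))))
```

Because this Hamiltonian is diagonal, a classical computer reads the answer straight off for 3 nodes; the quantum approach becomes interesting when the graph, and hence the 2^n-dimensional diagonal, grows far too large to enumerate.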
Of course, it sounds straightforward, but the devil is in the details. If you take away one thing from this blog, I hope it's this: "science lets you do some pretty cool stuff!" To see it in action, check out the Jupyter notebooks we published that show you how to implement chemical bond length simulations (e.g. for H2 or LiH) or Max Cut (and travelling salesman) optimization problems using VQE in Qiskit.
[1] Nature Chemistry 2, 106–111 (2010)
[2] Nature Communications 5, 4213 (2014)
[3] Phys. Rev. X 6, 031007 (2016)
[4] Nature 549, 242–246 (2017)
[5] arXiv:1411.4028
[6] arXiv:1710.01022
[7] arXiv:1703.06199
[8] Phys. Rev. A 92, 012327 (2015)
[9] npj Quantum Information 3, 16 (2017), doi:10.1038/s41534-017-0017-3
[10] Nature 549, 195–202 (2017)