
"Avoiding local minima in variational quantum eigensolvers with the natural gradient optimizer" is the title of a recent publication in Focus Area 3.2.

In the paper, David Wierichs (PhD student at the Institute for Theoretical Physics in Cologne), Christian Gogolin (co-affiliated with Covestro Deutschland AG) and Michael Kastoryano (now at the Amazon Quantum Solutions Lab and the AWS Center for Quantum Computing) compare the BFGS optimizer, ADAM and Natural Gradient Descent (NatGrad) in the context of Variational Quantum Eigensolvers (VQEs).

The authors report that NatGrad shows the most reliable convergence of the Variational Quantum Eigensolver across multiple spin chain systems. It does so at competitive cost and addresses application-relevant challenges of the optimization task.
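For readers curious what such an optimization looks like in practice, here is a minimal, illustrative sketch (not the authors' code) of a VQE on a small transverse-field Ising chain, optimized with the quantum natural gradient optimizer available in PennyLane. The Hamiltonian, ansatz and hyperparameters below are assumptions chosen for illustration only.

```python
# Illustrative sketch, not the paper's setup: a 4-qubit transverse-field Ising
# chain minimized with PennyLane's quantum natural gradient optimizer.
import pennylane as qml
from pennylane import numpy as np

n_qubits = 4
dev = qml.device("default.qubit", wires=n_qubits)

# Assumed Hamiltonian: H = -sum_i Z_i Z_{i+1} - 0.5 * sum_i X_i
coeffs = [-1.0] * (n_qubits - 1) + [-0.5] * n_qubits
obs = [qml.PauliZ(i) @ qml.PauliZ(i + 1) for i in range(n_qubits - 1)]
obs += [qml.PauliX(i) for i in range(n_qubits)]
hamiltonian = qml.Hamiltonian(coeffs, obs)


@qml.qnode(dev)
def cost(params):
    # Simple hardware-efficient ansatz: RY layer, CNOT ladder, RY layer
    for i in range(n_qubits):
        qml.RY(params[i], wires=i)
    for i in range(n_qubits - 1):
        qml.CNOT(wires=[i, i + 1])
    for i in range(n_qubits):
        qml.RY(params[n_qubits + i], wires=i)
    return qml.expval(hamiltonian)


# Random initial parameters (assumed initialization strategy)
params = np.array(np.random.uniform(0, 2 * np.pi, 2 * n_qubits), requires_grad=True)

# Quantum natural gradient descent: each step rescales the gradient
# by the (approximate) inverse quantum metric tensor of the ansatz.
opt = qml.QNGOptimizer(stepsize=0.05)

for step in range(100):
    params, energy = opt.step_and_cost(cost, params)
    if step % 20 == 0:
        print(f"step {step:3d}  energy = {float(energy):.6f}")

print("final energy:", float(cost(params)))
```

The sketch only illustrates the general workflow; the paper's actual ansätze, systems and cost comparisons against BFGS and ADAM are described in the publication itself.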
