Fig. 1: An illustration of two iterations in SGLBO for minimizing a 2D cost function. | npj Quantum Information

From: Stochastic gradient line Bayesian optimization for efficient noise-robust optimization of parameterized quantum circuits

a The figure illustrates the update procedure of SGLBO on the landscape of the cost function. In the first iteration, starting from an initial point (point 1), we estimate the direction of the gradient of the cost function via stochastic gradient descent (SGD) and perform Bayesian optimization (BO) on the 1D subspace along this direction to estimate the optimal step size. b We then move from point 1 to point 2 along the estimated direction by the estimated optimal step size. c At point 2, we repeat the same procedure, estimating the gradient via SGD and the optimal step size via BO on the line of the 1D subspace, to move from point 2 to point 3. These iterations continue until SGLBO converges or exhausts a preset budget of measurement shots. SGLBO then returns as output a suffix average over the points visited during the iterations.
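The iteration described in the caption can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the noisy gradient estimator stands in for an SGD-style gradient measured from a parameterized quantum circuit, the Gaussian-process kernel hyperparameters and lower-confidence-bound acquisition rule are illustrative assumptions, and the suffix-average fraction is arbitrary.

```python
import numpy as np

def rbf_kernel(a, b, length=0.3, var=1.0):
    """Squared-exponential kernel on 1D inputs (assumed hyperparameters)."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

def bo_step_size(f_line, budget=8, alpha_max=1.0, noise=1e-2, rng=None):
    """1D Bayesian optimization of the step size alpha on [0, alpha_max].

    A GP posterior is fit to noisy evaluations of the cost along the line,
    and new step sizes are picked by a lower-confidence-bound rule
    (an assumed acquisition function, for illustration only).
    """
    if rng is None:
        rng = np.random.default_rng(0)
    xs = list(rng.uniform(0.0, alpha_max, 2))
    ys = [f_line(x) for x in xs]
    grid = np.linspace(0.0, alpha_max, 101)
    for _ in range(budget - 2):
        X, Y = np.array(xs), np.array(ys)
        K = rbf_kernel(X, X) + noise * np.eye(len(X))
        Kinv = np.linalg.inv(K)
        ks = rbf_kernel(grid, X)
        mu = ks @ Kinv @ Y                      # GP posterior mean on the grid
        var = np.clip(1.0 - np.einsum('ij,jk,ik->i', ks, Kinv, ks), 1e-12, None)
        lcb = mu - 2.0 * np.sqrt(var)           # lower confidence bound (minimizing)
        x_next = grid[np.argmin(lcb)]
        xs.append(float(x_next))
        ys.append(f_line(float(x_next)))
    return xs[int(np.argmin(ys))]               # best observed step size

def sglbo(cost, grad_est, theta0, iters=15, suffix_frac=0.5):
    """SGLBO-style loop: SGD direction + BO line search + suffix averaging."""
    theta = np.array(theta0, dtype=float)
    history = [theta.copy()]
    for _ in range(iters):
        g = grad_est(theta)                     # noisy gradient estimate
        d = -g / (np.linalg.norm(g) + 1e-12)    # normalized descent direction
        alpha = bo_step_size(lambda a: cost(theta + a * d))
        theta = theta + alpha * d               # move by the estimated step size
        history.append(theta.copy())
    k = int(len(history) * (1.0 - suffix_frac))
    return np.mean(history[k:], axis=0)         # suffix average as the output
```

As a usage example, minimizing the 2D quadratic cost `||theta - (1, 1)||^2` with Gaussian noise added to the gradient converges close to `(1, 1)` within a handful of iterations; in the actual algorithm both the cost and its gradient would be estimated from a finite number of measurement shots on the quantum device.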