Speaker
Simon Chebykin
(MIPT)
Description
In this work we study upper bounds on the optimal convergence rates in stochastic optimization with a smooth, strongly convex objective function and a zero-order unbiased oracle with bounded Markovian noise. We adapt randomized accelerated gradient descent to the zero-order one-point oracle and obtain convergence rates, both in the argument and in the function value, that match the best known non-Markovian ones.
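To make the setting concrete, below is a minimal sketch (not the authors' exact method) of accelerated gradient descent driven by a zero-order one-point oracle: the gradient is estimated from a single noisy function evaluation per step. The objective, noise model, step sizes, and function names are illustrative assumptions, not taken from the paper.

    # Minimal sketch: Nesterov-style accelerated GD with a one-point zero-order oracle.
    # All parameters and the test objective are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(0)

    def noisy_f(x, noise_scale=0.1):
        """Strongly convex quadratic observed through bounded additive noise
        (a simple stand-in for the Markovian-noise oracle of the talk)."""
        return 0.5 * np.dot(x, x) + noise_scale * rng.uniform(-1.0, 1.0)

    def one_point_grad(x, delta=0.5):
        """One-point zero-order gradient estimate: a single function query at a
        randomly perturbed point, rescaled along the random direction."""
        d = x.shape[0]
        e = rng.standard_normal(d)
        e /= np.linalg.norm(e)
        return (d / delta) * noisy_f(x + delta * e) * e

    def accelerated_zo_gd(x0, steps=5000, lr=1e-4, momentum=0.9):
        """Accelerated (momentum/look-ahead) GD using the one-point estimator above."""
        x, v = x0.copy(), np.zeros_like(x0)
        for _ in range(steps):
            y = x + momentum * v      # look-ahead (extrapolation) point
            g = one_point_grad(y)     # single-query gradient estimate
            v = momentum * v - lr * g # momentum update
            x = x + v                 # parameter update
        return x

    if __name__ == "__main__":
        x0 = np.ones(10)
        x_final = accelerated_zo_gd(x0)
        print("initial ||x||:", np.linalg.norm(x0))
        print("final   ||x||:", np.linalg.norm(x_final))

Because only one function value is observed per iteration, the estimator has high variance, which is why conservative step sizes are used in this sketch; the work described above addresses how acceleration and randomization can still recover fast rates under such an oracle.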
Primary authors
Aleksandr Beznosikov
(Moscow Institute of Physics and Technology)
Simon Chebykin
(MIPT)