The Randomized Block Coordinate Descent Method in the Hölder Smooth Setting
math.OC
/ Authors
/ Abstract
This work provides the first convergence analysis of the Randomized Block Coordinate Descent method for minimizing a function that is both Hölder smooth and block Hölder smooth. Our analysis covers non-convex, convex, and strongly convex objective functions. For non-convex functions, we show that the expected gradient norm decreases at a rate of $O\left(k^{-\frac{\gamma}{1+\gamma}}\right)$, where $k$ is the iteration count and $\gamma$ is the Hölder exponent. For convex functions, we show that the expected suboptimality gap decreases at the rate $O\left(k^{-\gamma}\right)$. In the strongly convex setting, we show that this rate for the expected suboptimality gap improves to $O\left(k^{-\frac{2\gamma}{1-\gamma}}\right)$ when $\gamma < 1$ and to a linear rate when $\gamma = 1$. Notably, for $\gamma = 1$ these convergence rates coincide with those established in the existing literature for the Lipschitz smooth setting.
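As an illustration (not taken from the paper), the following is a minimal sketch of one possible Randomized Block Coordinate Descent iteration under block Hölder smoothness, assuming the standard step obtained by minimizing the block descent upper bound $f(x) + \langle \nabla_i f(x), h \rangle + \frac{L_i}{1+\gamma}\|h\|^{1+\gamma}$, which gives $h = -\left(\|\nabla_i f(x)\| / L_i\right)^{1/\gamma} \nabla_i f(x) / \|\nabla_i f(x)\|$. The function rbcd, the block partition, and the quadratic test problem are hypothetical choices for illustration; the paper's exact update and constants may differ.

import numpy as np

def rbcd(grad, x0, blocks, L, gamma, iters=2000, seed=0):
    """Sketch of randomized block coordinate descent under block Hölder
    smoothness: ||grad_i f(x + U_i h) - grad_i f(x)|| <= L_i ||h||^gamma.
    Minimizing the induced upper bound yields the step below; gamma = 1
    recovers the familiar 1/L_i gradient step of the Lipschitz smooth case."""
    rng = np.random.default_rng(seed)
    x = x0.astype(float).copy()
    for _ in range(iters):
        i = rng.integers(len(blocks))       # sample a block uniformly at random
        idx = blocks[i]
        g = grad(x)[idx]                    # gradient restricted to block i
        norm = np.linalg.norm(g)
        if norm == 0.0:
            continue                        # block is already stationary
        t = (norm / L[i]) ** (1.0 / gamma)  # Hölder step length (||g_i||/L_i)^{1/gamma}
        x[idx] -= t * g / norm
    return x

# Illustrative convex quadratic f(x) = 0.5 x^T A x with minimizer at the origin.
A = np.diag([1.0, 2.0, 3.0, 4.0])
blocks = [np.array([0, 1]), np.array([2, 3])]
L = [2.0, 4.0]                              # assumed per-block smoothness constants
x = rbcd(lambda x: A @ x, np.ones(4), blocks, L, gamma=1.0)
print(np.linalg.norm(x))                    # ~0: iterates approach the minimizer

For $\gamma = 1$ the step reduces to the usual $-\nabla_i f(x)/L_i$ block update, consistent with the abstract's remark that the rates collapse to the known Lipschitz smooth ones in that case.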