Nonconvex Proximal Incremental Aggregated Gradient Method with Linear Convergence
/ Authors
/ Abstract
In this paper, we study the proximal incremental aggregated gradient (PIAG) algorithm for minimizing the sum of L-smooth nonconvex component functions and a proper closed convex function. By exploiting the L-smoothness of the components together with an error bound condition, we show that the method retains the desired linear convergence properties even in this nonconvex setting. Specifically, we show that the generated sequence of iterates converges globally to the set of stationary points. Moreover, we give an explicit, computable stepsize threshold under which both the objective values and the iterates converge R-linearly.
Journal: Journal of Optimization Theory and Applications
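To make the setting concrete, the following is a minimal sketch of a PIAG-style update: each iteration refreshes the gradient of one component (so the aggregated gradient may use stale information) and then applies a proximal step. The cyclic selection rule, the soft-thresholding prox for an ℓ1 regularizer, and all function names here are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (an example choice of convex term h).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def piag(grads, prox, x0, alpha, iters):
    """Sketch of the proximal incremental aggregated gradient method.

    grads : list of callables, grads[i](x) = gradient of component f_i at x
    prox  : callable prox(v, alpha) = prox of alpha * h at v
    alpha : stepsize (must be below a threshold depending on L and the delay)
    """
    x = x0.copy()
    # Table of the most recently evaluated gradient of each component;
    # entries other than the refreshed one may be stale (delayed).
    table = [g(x) for g in grads]
    agg = sum(table)                 # aggregated (possibly delayed) gradient
    n = len(grads)
    for k in range(iters):
        i = k % n                    # cyclic component selection (assumed)
        new_g = grads[i](x)
        agg += new_g - table[i]      # update the aggregate incrementally
        table[i] = new_g
        x = prox(x - alpha * agg, alpha)
    return x
```

As a usage example, one might take quadratic components f_i(x) = 0.5 (a_i^T x - b_i)^2 with h = λ||·||_1, in which case `prox(v, alpha)` is `soft_threshold(v, alpha * λ)`; the stepsize must be chosen small enough relative to L and the gradient delay, consistent with the explicit threshold derived in the paper.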