On the resolution of $\ell_1$-norm minimization via a two-metric adaptive projection method
math.OC
/ Abstract
In this work, we propose an efficient two-metric adaptive projection method for solving the $\ell_1$-norm minimization problem. Our approach is inspired by the two-metric projection method, a simple yet elegant algorithm proposed by Bertsekas for bound-constrained (box-constrained) optimization problems. The low per-iteration cost of this method, combined with its ability to incorporate Hessian information, makes it particularly attractive for large-scale problems, and our proposed method inherits these advantages. In sharp contrast to our approach, previous attempts to extend the two-metric projection method to $\ell_1$-norm minimization rely on an intermediate reformulation as a bound-constrained problem, which can lead to numerical instabilities in practice. Our algorithm features a refined partition of the index set, an adaptive projection, and a novel linesearch rule. For practical implementation, it accommodates singular Hessians as well as inexact solutions to the Newton linear system. We show that the method is theoretically sound: it enjoys global convergence. Moreover, it is an active-set method capable of manifold identification: the underlying low-dimensional structure is identified in a finite number of iterations, after which the algorithm reduces to an unconstrained Newton method on the identified subspace. Under an error bound condition, the method attains a locally superlinear convergence rate. Hence, when the solution is sparse, it converges in very few iterations while maintaining scalability, making it well-suited for large-scale problems. We conduct extensive numerical experiments to demonstrate the practical advantages of the proposed algorithm over several competitive methods from the literature, particularly in large-scale settings.
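For context, the problem class addressed here is $\min_x f(x) + \lambda \|x\|_1$ with a smooth loss $f$. The sketch below solves an $\ell_1$-regularized least-squares instance with the standard proximal-gradient method (ISTA); it is not the paper's two-metric adaptive projection method, and all names and parameter values are illustrative assumptions, shown only to make the problem class concrete.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t*||.||_1 (componentwise soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def ista(A, b, lam, n_iter=2000):
    """Solve min_x 0.5*||A x - b||^2 + lam*||x||_1 by proximal gradient.

    A step size of 1/L is used, where L = ||A||_2^2 is the Lipschitz
    constant of the gradient of the smooth least-squares term.
    """
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)              # gradient of the smooth part
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Hypothetical sparse-recovery instance: the minimizer should be close to
# the sparse ground truth, with most components exactly zero.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
x_true = np.zeros(10)
x_true[0], x_true[3] = 1.0, -2.0
b = A @ x_true
x_hat = ista(A, b, lam=0.05)
```

The soft-thresholding step sets small components exactly to zero, which is the source of the sparsity (and of the "active set" structure) that the abstract's manifold-identification discussion refers to.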