Given an unknown signal x₀ ∈ ℝⁿ and linear noisy measurements y = Ax₀ + σv ∈ ℝᵐ, the generalized ℓ₂²-LASSO solves x̂ := arg minₓ ½‖y − Ax‖₂² + λσƒ(x). Here, ƒ is a convex regularization function (e.g. the ℓ₁-norm or the nuclear norm) aiming to promote the structure of x₀ (e.g. sparse, low-rank), and λ ≥ 0 is the regularizer parameter. A related optimization problem, though not as popular or well known, is often referred to as the generalized ℓ₂-LASSO and takes the form x̂ := arg minₓ ‖y − Ax‖₂ + λƒ(x), and has been analyzed by Oymak, Thrampoulidis and Hassibi, who further made conjectures about the performance of the generalized ℓ₂²-LASSO. This paper establishes these conjectures rigorously. We measure performance with the normalized squared error NSE(σ) := ‖x̂ − x₀‖₂²/σ². Assuming the entries of A are i.i.d. Gaussian N(0, 1/m) and those of v are i.i.d. N(0, 1), we precisely characterize the “asymptotic NSE” aNSE := lim_{σ→0} NSE(σ) when the problem dimensions tend to infinity in a proportional manner. The role of λ, ƒ and x₀ is explicitly captured in the derived expression by means of a single geometric quantity, the Gaussian distance to the subdifferential. We conjecture that aNSE = sup_{σ>0} NSE(σ). We include detailed discussions on the interpretation of our result, make connections to relevant literature, and perform computational experiments that validate our theoretical findings.
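As an illustration of the setup only (not the paper's analysis), the sketch below instantiates the generalized ℓ₂²-LASSO for the special case ƒ = ℓ₁-norm and estimates NSE(σ) empirically. It uses a plain proximal-gradient (ISTA) loop as the solver; the dimensions m, n, the sparsity k, and the values of σ and λ are arbitrary choices for the demonstration, not quantities from the paper.

```python
import numpy as np

def l2sq_lasso_ista(A, y, reg, n_iter=500):
    """Solve min_x 0.5*||y - A x||_2^2 + reg*||x||_1 by proximal gradient (ISTA)."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2        # 1/L, L = Lipschitz const. of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)                  # gradient of the smooth least-squares term
        z = x - step * grad
        # soft-thresholding: proximal operator of reg*||.||_1
        x = np.sign(z) * np.maximum(np.abs(z) - step * reg, 0.0)
    return x

rng = np.random.default_rng(0)
m, n, k, sigma, lam = 300, 600, 20, 0.05, 2.0     # illustrative parameters
x0 = np.zeros(n)
x0[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)  # k-sparse signal
A = rng.standard_normal((m, n)) / np.sqrt(m)      # i.i.d. N(0, 1/m) entries
v = rng.standard_normal(m)                        # i.i.d. N(0, 1) noise
y = A @ x0 + sigma * v

x_hat = l2sq_lasso_ista(A, y, lam * sigma)        # penalty lambda*sigma*f(x)
nse = np.sum((x_hat - x0) ** 2) / sigma ** 2      # NSE(sigma) = ||x_hat - x0||_2^2 / sigma^2
print(f"empirical NSE(sigma) = {nse:.3f}")
```

Repeating this over a grid of decreasing σ values gives an empirical handle on aNSE; the paper's result characterizes that limit exactly in the proportional-growth regime.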