Projected Nesterov's Proximal-Gradient Algorithms for Sparse Signal Reconstruction with a Convex Constraint

Renliang Gu and Aleksandar Dogandžić

\( \newcommand{\b}[1]{{\boldsymbol{#1}}} \newcommand{\bm}[1]{{\boldsymbol{#1}}} \newcommand{\bx}{{\bm{x}}} \newcommand{\cI}{\mathcal{I}} \newcommand{\cL}{\mathcal{L}} \newcommand{\dif}{\rm d} \newcommand{\balpha}{\b{\alpha}} \newcommand{\bcI}{\b{\mathcal{I}}} \newcommand\norm[1]{\lVert #1 \rVert} \newcommand{\bpsi}{\bm{\psi}} \newcommand\SBR[1]{\left[#1\right]} \)

Summary

We propose a method to solve the following optimization problem:

\[ \begin{equation} \label{eq:f} \text{minimize}\qquad f(\bx)=\cL(\bx) +u r(\bx) \end{equation} \]

with respect to the signal \(\bx\), where \(\cL(\bx)\) is a convex differentiable data-fidelity (NLL) term and \(u>0\) is a scalar tuning constant that weights the convex regularization term \(r(\bx)\), which imposes both signal sparsity and the convex-set constraint:

\[ \begin{equation} \label{eq:r} r(\bx)=\norm{\Psi^H\bx}_1 + \mathbb{I}_C(\bx) \end{equation} \]

where \(C\) is a closed convex set and \(\Psi^H \bx\) is a signal transform-coefficient vector with most elements having negligible magnitudes. A real-valued \(\Psi \in \mathbb{R}^{p\times p'}\) can accommodate discrete wavelet transform (DWT) or gradient-map sparsity with the anisotropic total-variation (TV) sparsifying transform (with \(\Psi=[\Psi_\text{v}\,\,\Psi_\text{h}]\)); a complex-valued \(\Psi = \Psi_\text{v} + j \Psi_\text{h} \in \mathbb{C}^{p\times p'}\) can accommodate gradient-map sparsity and the 2D isotropic TV sparsifying transform; here \(\Psi_\text{v}, \Psi_\text{h} \in \mathbb{R}^{p\times p'}\) are the vertical and horizontal difference matrices.
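To make the roles of the two terms in \(r(\bx)\) concrete, here is a minimal Python sketch of one proximal-gradient step for \(f(\bx)=\cL(\bx)+u\,r(\bx)\), under simplifying assumptions not made in the paper: \(\Psi\) orthonormal and \(C\) the nonnegative orthant, so the proximal mapping of \(u\,r(\cdot)\) reduces to soft-thresholding of the transform coefficients followed by projection onto \(C\). For general \(\Psi\) (e.g., TV transforms) the proximal mapping has no closed form and must be computed iteratively, which is exactly the inexact inner solve the paper addresses. Function names are illustrative.

```python
import numpy as np

def soft_threshold(z, tau):
    """Elementwise soft-thresholding: proximal mapping of tau * ||.||_1."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def prox_grad_step(x, grad_L, Psi, u, beta):
    """One proximal-gradient step for f(x) = L(x) + u * r(x).

    Assumes Psi is orthonormal (Psi @ Psi.T == I) and C = {x : x >= 0},
    so the prox splits into soft-thresholding of the transform
    coefficients followed by projection onto C. This separability is an
    illustrative simplification; it does not hold for TV transforms.
    """
    z = x - beta * grad_L(x)                 # gradient step on the NLL term
    a = soft_threshold(Psi.T @ z, beta * u)  # shrink transform coefficients
    return np.maximum(Psi @ a, 0.0)          # project onto the convex set C
```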

This work is supported by the National Science Foundation under Grant CCF-1421480.

We propose an adaptive step-size strategy, and our convergence analysis accounts for the inexactness of the iterative proximal mapping and for the convex-set constraint.
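The sketch below illustrates the general shape of an accelerated proximal-gradient iteration with an adaptive (backtracking) step size: try the previous step size first and shrink it until the quadratic majorization of \(\cL\) holds. This uses textbook FISTA-style momentum with an exact `prox` oracle; it is not the PNPG reference implementation, which additionally handles inexact inner prox solves, projection, and restart.

```python
import numpy as np

def apg_sketch(grad_L, L_fun, prox, x0, beta0=1.0, xi=0.8, n_iter=100):
    """Accelerated proximal gradient with backtracking step size.

    `prox(v, beta)` evaluates the proximal mapping of beta * u * r(.);
    in PNPG it would itself be computed by an inner iterative solver.
    Illustrative sketch only, not the reference implementation.
    """
    x_prev = x0.copy()
    y = x0.copy()           # extrapolated (momentum) point
    t = 1.0                 # momentum parameter
    beta = beta0            # step size, adapted by backtracking
    for _ in range(n_iter):
        g = grad_L(y)
        while True:         # shrink beta until the majorization holds:
            # L(x) <= L(y) + <g, x - y> + ||x - y||^2 / (2 * beta)
            x = prox(y - beta * g, beta)
            d = x - y
            if L_fun(x) <= L_fun(y) + g @ d + d @ d / (2.0 * beta) + 1e-12:
                break
            beta *= xi
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x + ((t - 1.0) / t_new) * (x - x_prev)  # Nesterov extrapolation
        x_prev, t = x, t_new
    return x_prev
```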

[Figure: step size versus iteration index, compared with the local Lipschitz constant]

Download

Clone the repository, or press the “Download ZIP” button in the right panel of PNPG's GitHub page.

References:

@STRING{IEEE_J_SP         = "{IEEE} Trans. Signal Process."}
@article{gd16convexsub,
  Author = {Renliang Gu and Aleksandar Dogand\v{z}i\'c},
  Title = {Projected {N}esterov's Proximal-Gradient Algorithm for Sparse Signal 
    Recovery},
  Journal = IEEE_J_SP,
  Volume = 65,
  Number=13,
  pages={3510--3525},
  doi={10.1109/TSP.2017.2691661},
  Year = 2017
}
@string{asilomar = {{Proc. Asilomar Conf. Signals, Syst. Comput.}}}
@inproceedings{gdasil15,
    Address = {Pacific Grove, CA},
    Author = {Renliang Gu and Aleksandar Dogand\v{z}i\'c},
    Booktitle = asilomar,
    Month = nov,
    Title = {Projected {N}esterov's Proximal-Gradient Signal Recovery from 
      Compressive {P}oisson Measurements}, 
    Year = {2015},
    pages={1490--1495}
}

Our code has been used by