
A projected gradient method for nonlinear inverse problems with 𝛼ℓ1 − 𝛽ℓ2 sparsity regularization

Zhuguang Zhao and Liang Ding
Published/Copyright: July 25, 2023

Abstract

The non-convex αℓ1 − βℓ2 (α ≥ β ≥ 0) regularization is a new approach for sparse recovery. A minimizer of the αℓ1 − βℓ2 regularized functional can be computed with the ST-(αℓ1 − βℓ2) algorithm, which is similar to the classical iterative soft thresholding algorithm (ISTA). Unfortunately, ISTA is known to converge quite slowly, and a faster alternative is the projected gradient (PG) method. Nevertheless, the applicability of the PG method has so far been limited to linear inverse problems. In this paper, we extend the PG method, based on a surrogate function approach, to nonlinear inverse problems with αℓ1 − βℓ2 (α ≥ β ≥ 0) regularization in the finite-dimensional space ℝ^n. It is shown that the presented algorithm converges subsequentially to a stationary point of a constrained Tikhonov-type functional for sparsity regularization. Numerical experiments in the context of a nonlinear compressive sensing problem illustrate the efficiency of the proposed approach.
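To make the kind of iteration referenced above concrete, the following Python fragment is a minimal, hypothetical sketch of an ISTA-type iteration for a smooth nonlinear forward map F: a gradient step on the data-fit term ½‖F(x) − y‖² followed by a componentwise thresholding step. The plain ℓ1 soft-thresholding used here is a simplified stand-in for the ST-(αℓ1 − βℓ2) thresholding and for the projection step of the PG method analyzed in the paper; the names ista_like, F, gradF and the toy forward map are illustrative assumptions, not the authors' implementation.

import numpy as np

def ista_like(F, gradF, y, x0, alpha, step=0.005, iters=500):
    """Illustrative ISTA-type iteration for min_x 0.5*||F(x)-y||^2 + penalty(x).

    F     : nonlinear forward map, F(x) in R^m
    gradF : action of the adjoint Jacobian, gradF(x, r) = F'(x)^T r
    The soft-thresholding below corresponds to a plain ell_1 penalty and is a
    simplified stand-in for the alpha*ell_1 - beta*ell_2 thresholding step.
    """
    x = x0.copy()
    for _ in range(iters):
        r = F(x) - y                       # residual of the data-fit term
        z = x - step * gradF(x, r)         # gradient step on 0.5*||F(x)-y||^2
        x = np.sign(z) * np.maximum(np.abs(z) - step * alpha, 0.0)  # soft threshold
    return x

# Toy usage with a hypothetical, mildly nonlinear forward map F(x) = Ax + 0.1*(Ax)^2:
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))
F = lambda x: A @ x + 0.1 * (A @ x) ** 2
gradF = lambda x, r: A.T @ ((1.0 + 0.2 * (A @ x)) * r)   # F'(x)^T r
x_true = np.zeros(50); x_true[[3, 17, 41]] = [1.0, -0.5, 0.8]
y = F(x_true)
x_rec = ista_like(F, gradF, y, np.zeros(50), alpha=0.01)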

MSC 2010: 49M37; 65K05

Award Identifier / Grant number: 2572021DJ03

Award Identifier / Grant number: LBH-Q16008

Award Identifier / Grant number: 41304093

Funding statement: The work of the second author was supported by the Fundamental Research Funds for the Central Universities (no. 2572021DJ03), Heilongjiang Postdoctoral Research Developmental Fund (no. LBH-Q16008) and the National Natural Science Foundation of China (no. 41304093).

Received: 2023-01-27
Revised: 2023-05-14
Accepted: 2023-06-14
Published Online: 2023-07-25
Published in Print: 2024-06-01

© 2023 Walter de Gruyter GmbH, Berlin/Boston
