Convergence analysis of gradient descent methods generated by two different functionals in a backward heat conduction problem
Abstract
A backward heat conduction problem is reduced to an operator equation Aq = f and is solved in two versions by minimizing the functionals R(q) = 〈Aq – 2f, q〉 and J(q) = 〈Aq – f, Aq – f〉. The performance of the two versions is compared both theoretically and numerically. An a priori convergence rate estimate is obtained in the case of exact data, and an a posteriori two-sided estimate of the solution error norm is derived in the case of noisy data. The estimates are illustrated on a model image deblurring problem. Numerical experiments demonstrate that minimization of J(q) is more effective for noisy data, while R(q) works better for exact data.
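As a rough finite-dimensional illustration of the two versions (a sketch only, not the paper's discretization of the heat conduction problem): for a symmetric positive definite matrix A standing in for the operator, gradient descent on R(q) uses the residual direction Aq – f, while gradient descent on J(q) uses the normal-equation direction Aᵀ(Aq – f). The matrix, step sizes, and iteration counts below are illustrative assumptions.

```python
import numpy as np

# Toy setup (hypothetical): a small symmetric positive definite A and
# exact data f = A q_true, so both functionals share the minimizer q_true.
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M @ M.T + 5 * np.eye(5)        # symmetric positive definite
q_true = rng.standard_normal(5)
f = A @ q_true

def descend(grad, step, iters=500):
    """Plain gradient descent with a fixed step size."""
    q = np.zeros(5)
    for _ in range(iters):
        q = q - step * grad(q)
    return q

# Version 1: minimize R(q) = <Aq, q> - 2<f, q>; for symmetric A,
# grad R = 2(Aq - f).  Stable step: below 1/lambda_max(A).
q_R = descend(lambda q: 2 * (A @ q - f),
              step=0.5 / np.linalg.eigvalsh(A)[-1])

# Version 2: minimize J(q) = <Aq - f, Aq - f>;
# grad J = 2 A^T (Aq - f).  Stable step: below 1/lambda_max(A^T A).
q_J = descend(lambda q: 2 * A.T @ (A @ q - f),
              step=0.5 / np.linalg.eigvalsh(A.T @ A)[-1])

print(np.linalg.norm(q_R - q_true), np.linalg.norm(q_J - q_true))
```

With exact data both iterations converge to q_true; the paper's comparison concerns their behaviour under noise and their convergence rate estimates, which this sketch does not reproduce.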
© de Gruyter 2009
Articles in the same Issue
- Inverse spectral problems for Sturm–Liouville differential operators on a finite interval
- Analytic representations of solutions to inverse problems for nonlinear equations
- An ill-posed boundary value problem for the Helmholtz equation on Lipschitz domains
- Convergence analysis of gradient descent methods generated by two different functionals in a backward heat conduction problem