Penalized maximum likelihood image restoration with positivity constraints: multiplicative algorithms
Author: | Henri Lanteri, Claude Aime, Muriel Roche |
---|---|
Contributors: | Laboratoire Universitaire d'Astrophysique de Nice (LUAN), Université Nice Sophia Antipolis (... - 2019) (UNS), COMUE Université Côte d'Azur (2015-2019) (COMUE UCA)-Institut national des sciences de l'Univers (INSU - CNRS)-Centre National de la Recherche Scientifique (CNRS) |
Language: | English |
Year of publication: | 2002 |
Subject: | Applied Mathematics; Gaussian; regularization perspectives on support vector machines; numerical & computational mathematics; Backus–Gilbert method; Computer Science Applications; Theoretical Computer Science; Tikhonov regularization; bounded function; [INFO.INFO-IR]Computer Science [cs]/Information Retrieval [cs.IR]; Signal Processing; deconvolution; astronomy & astrophysics; algorithm; Laplace operator; Mathematical Physics; image restoration; Mathematics |
Source: | Inverse Problems, IOP Publishing, 2002, 18, pp. 1397-1419 |
ISSN: | 0266-5611; 1361-6420 |
Description: | In this paper, we propose a general method to devise maximum likelihood penalized (regularized) algorithms with positivity constraints. Moreover, we explain how to obtain 'product forms' of these algorithms. The algorithmic method is based on the Kuhn–Tucker first-order optimality conditions. Its application domain is not restricted to the cases considered in this paper; it can be applied to any convex objective function with linear constraints. It is especially well suited to objective functions with a bounded domain that completely encloses the domain of the (linear) constraints. The Poisson noise case, typical of this last situation, and the additive Gaussian noise case are considered; each is associated with various forms of regularization functions, mainly quadratic and entropy terms. The algorithms are applied to the deconvolution of synthetic images blurred by a realistic point spread function, similar to that of the Hubble Space Telescope operating in the far-ultraviolet, and corrupted by noise. The effect of relaxation on the convergence speed of the algorithms is analysed. The particular behaviour of the algorithms corresponding to different forms of regularization functions is described. We show that the 'prior' image is a key point in the regularization and that the best results are obtained with Tikhonov regularization using a Laplacian operator. The analyses of the Poisson process and of additive Gaussian noise lead to similar conclusions. We bring to the fore the close relationship between Tikhonov regularization using derivative operators and regularization by a distance to a 'default image' introduced by Horne (Horne K 1985 Mon. Not. R. Astron. Soc. 213 129–41). |
Database: | OpenAIRE |
External link: |
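The abstract describes multiplicative, positivity-preserving updates derived from the Kuhn–Tucker first-order conditions. In the unregularized Poisson-noise case, this family of updates reduces to the well-known Richardson–Lucy iteration, which the following minimal 1-D sketch illustrates (the function `richardson_lucy`, the hand-made PSF, and all parameters are illustrative assumptions, not the paper's code):

```python
import numpy as np

def richardson_lucy(y, h, n_iter=50, eps=1e-12):
    """Unregularized multiplicative ML iteration for Poisson noise
    (Richardson-Lucy); an illustrative sketch, not the paper's algorithm.

    y : observed blurred data (1-D non-negative array)
    h : point spread function (non-negative taps, summing to 1)
    """
    x = np.full_like(y, y.mean(), dtype=float)         # flat, strictly positive start
    for _ in range(n_iter):
        hx = np.convolve(x, h, mode="same")            # forward model H x
        ratio = y / np.maximum(hx, eps)                # data / model, guarded against 0
        # H^T applied via correlation (exact up to boundary effects)
        x = x * np.convolve(ratio, h[::-1], mode="same")
    return x
```

Because each step multiplies the current estimate by a non-negative factor, the iterate can never become negative: the positivity constraint is enforced by the algebraic form of the update itself rather than by projection, which is the behaviour the paper generalizes to penalized objectives.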