
Record


Released

Talk

New Projected Quasi-Newton Methods with Applications

MPG Authors

Sra, S
Department Empirical Inference, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

External Resources
There are no external resources on file
Full texts (restricted access)
No full texts are currently released for your IP range.
Full texts (publicly accessible)
There are no publicly accessible full texts available in PuRe
Supplementary material (publicly accessible)
There is no publicly accessible supplementary material available
Citation

Sra, S. (2008). New Projected Quasi-Newton Methods with Applications. Talk presented at Microsoft Research Tech-talk. Redmond, WA, USA. 2008-12.


Citation link: https://hdl.handle.net/11858/00-001M-0000-0013-C645-C
Abstract
Box-constrained convex optimization problems are central to several
applications in a variety of fields such as statistics, psychometrics,
signal processing, medical imaging, and machine learning. Two fundamental
examples are the non-negative least squares (NNLS) problem and the
non-negative Kullback-Leibler (NNKL) divergence minimization problem. The
non-negativity constraints usually stem from an underlying physical
restriction: e.g., in applications in astronomy, tomography, statistical
estimation, or image restoration, the underlying parameters represent
physical quantities such as concentration, weight, intensity, or frequency
counts, and are therefore interpretable only as non-negative values. Several
modern optimization methods can be inefficient for simple problems such as
NNLS and NNKL because they are designed to handle far more general and
complex problems.
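
For concreteness, the two problems named above can be written in their
standard forms (a sketch using common notation, with data matrix A and
observation vector b chosen so that the expressions are well defined; the
exact objectives used in the talk are not reproduced here):

  NNLS:  \min_{x \geq 0} \; \tfrac{1}{2}\,\|Ax - b\|_2^2
  NNKL:  \min_{x \geq 0} \; \sum_i \big[ b_i \log\tfrac{b_i}{(Ax)_i} - b_i + (Ax)_i \big]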
In this work we develop two simple quasi-Newton methods for solving
box-constrained
(differentiable) convex optimization problems that utilize the well-known
BFGS and limited memory BFGS updates. We position our method between
projected gradient (Rosen, 1960) and projected Newton (Bertsekas, 1982)
methods, and prove its convergence under a simple Armijo step-size rule. We
illustrate our method with applications to image deblurring, Positron
Emission Tomography (PET) image reconstruction, and Non-negative Matrix
Approximation (NMA). On medium-sized data we observe performance competitive
with established procedures, while for larger data the results are even
better.
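
To make the projection and the Armijo step-size rule referred to above
concrete, the following is a minimal Python sketch of a projected-gradient
variant for NNLS. It is an illustrative assumption, not the talk's projected
quasi-Newton method (which additionally uses BFGS/L-BFGS curvature
information); all function and parameter names are hypothetical.

# Sketch: projected gradient with Armijo backtracking for NNLS,
# i.e. minimize 0.5*||Ax - b||^2 subject to x >= 0.
import numpy as np

def project_box(x, lo=0.0, hi=np.inf):
    """Project x componentwise onto the box [lo, hi]."""
    return np.clip(x, lo, hi)

def projected_gradient_nnls(A, b, x0, max_iter=200, tol=1e-8,
                            sigma=1e-4, beta=0.5):
    """Projected-gradient iterations with an Armijo step-size rule."""
    x = project_box(np.asarray(x0, dtype=float))
    f = lambda z: 0.5 * np.dot(A @ z - b, A @ z - b)
    for _ in range(max_iter):
        grad = A.T @ (A @ x - b)
        fx, t = f(x), 1.0
        while True:
            x_new = project_box(x - t * grad)
            # Armijo sufficient-decrease test along the projection arc
            if f(x_new) <= fx + sigma * np.dot(grad, x_new - x):
                break
            t *= beta
            if t < 1e-12:
                break
        if np.linalg.norm(x_new - x) <= tol * max(1.0, np.linalg.norm(x)):
            return x_new
        x = x_new
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = np.abs(rng.standard_normal((50, 20)))
    b = A @ np.abs(rng.standard_normal(20))
    x = projected_gradient_nnls(A, b, np.zeros(20))
    print("residual norm:", np.linalg.norm(A @ x - b))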