Number of records: 1
Approximate Bayesian recursive estimation
- 1.0425539 - ÚTIA 2015 RIV US eng J - Article in a professional journal
Kárný, Miroslav
Approximate Bayesian recursive estimation.
Information Sciences. Vol. 285, No. 1 (2014), pp. 100-111. ISSN 0020-0255. E-ISSN 1872-6291
Grant (CEP): GA ČR GA13-13502S
Institutional support: RVO:67985556
Keywords: Approximate parameter estimation * Bayesian recursive estimation * Kullback–Leibler divergence * Forgetting
RIV field code: BB - Applied statistics, operations research
Impact factor: 4.038, year: 2014; AIS: 0.873, year: 2014
Result website:
http://library.utia.cas.cz/separaty/2014/AS/karny-0425539.pdf
DOI: https://doi.org/10.1016/j.ins.2014.01.048
Bayesian learning provides a firm theoretical basis for the design and exploitation of algorithms in data-stream processing (preprocessing, change detection, hypothesis testing, clustering, etc.). Primarily, it relies on recursive parameter estimation of firmly bounded complexity. As a rule, it has to approximate the exact posterior probability density (pd), which comprises unreduced information about the estimated parameter. In the recursive treatment of the data stream, the latest approximate pd is usually updated using the treated parametric model and the newest data and then approximated again. The fact that approximation errors may accumulate over time is mostly neglected in the estimator design and, at most, checked ex post. The paper inspects the estimator design with respect to this error accumulation and concludes that a sort of forgetting (pd flattening) is an indispensable part of a reliable approximate recursive estimation.
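To make the role of forgetting concrete, the following minimal sketch (an illustration, not the paper's algorithm) runs an exact conjugate Beta-Bernoulli recursive estimation in which the posterior is flattened toward a flat prior by a forgetting factor before each data update; the factor `lam` and the toy data stream are assumptions chosen for illustration:

```python
# Illustrative sketch (not the paper's method): recursive Bayesian
# estimation of a Bernoulli parameter with a Beta posterior, combined
# with exponential forgetting ("flattening" of the pd). Flattening the
# posterior toward the prior prevents it from becoming overconfident,
# so accumulated approximation errors can be washed out over time.

def forget(a, b, lam, a0=1.0, b0=1.0):
    """Flatten Beta(a, b) toward the flat prior Beta(a0, b0).
    Geometrically mixing posterior^lam * prior^(1-lam) yields
    another Beta density with interpolated parameters."""
    return (lam * (a - a0) + a0, lam * (b - b0) + b0)

def update(a, b, x):
    """Exact conjugate Bayes update for one Bernoulli observation x."""
    return (a + x, b + 1 - x)

a, b = 1.0, 1.0   # flat Beta(1, 1) prior
lam = 0.95        # forgetting factor (assumed value, < 1)
data = [1, 1, 0, 1, 0, 1, 1, 1, 0, 1]  # toy binary data stream

for x in data:
    a, b = forget(a, b, lam)  # flatten the pd before the new datum
    a, b = update(a, b, x)    # exact Bayes step

mean = a / (a + b)            # posterior-mean point estimate
print(round(mean, 3))
```

With `lam = 1` the loop reduces to exact recursive Bayes; smaller `lam` bounds the effective sample size the posterior can concentrate on, which is the flattening effect the abstract refers to.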
Permanent link: http://hdl.handle.net/11104/0231504