Abstract
We introduce a framework for inference in general state-space hidden Markov models (HMMs) under likelihood misspecification. In particular, we leverage the loss-theoretic perspective of generalized Bayesian inference (GBI) to define generalized filtering recursions in HMMs that can tackle the problem of inference under model misspecification. In doing so, we arrive at principled procedures for robust inference against observation contamination through the β-divergence.
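To give a flavour of the idea, here is a minimal, self-contained sketch (not the paper's exact algorithm) of a bootstrap particle filter on a toy AR(1) model with Gaussian observations, where the usual likelihood weight is replaced by exp(-ℓ_β) for a β-divergence (density-power) loss. The model, parameter values (a, q, sigma, beta), and helper names are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from scipy.stats import norm

def beta_loss(y, mean, sigma, beta):
    """Density-power (beta-divergence) loss for a univariate Gaussian observation
    density; up to an additive constant it recovers -log N(y; mean, sigma^2) as beta -> 0."""
    lik = norm.pdf(y, loc=mean, scale=sigma)
    # Closed form of the integral term for a Gaussian density raised to the power 1 + beta.
    integral = (2 * np.pi * sigma**2) ** (-beta / 2) / np.sqrt(1 + beta)
    return -(1.0 / beta) * lik**beta + integral / (1 + beta)

def generalised_bootstrap_filter(ys, n_particles=1000, a=0.9, q=1.0,
                                 sigma=1.0, beta=0.2, rng=None):
    """Bootstrap particle filter whose weight update uses exp(-loss) with a
    beta-divergence loss in place of the observation likelihood."""
    rng = np.random.default_rng() if rng is None else rng
    x = rng.normal(size=n_particles)                   # initial particles
    means = []
    for y in ys:
        x = a * x + np.sqrt(q) * rng.normal(size=n_particles)    # propagate
        logw = -beta_loss(y, mean=x, sigma=sigma, beta=beta)      # generalised log-weights
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means.append(np.sum(w * x))                               # filtering mean estimate
        x = x[rng.choice(n_particles, size=n_particles, p=w)]     # multinomial resampling
    return np.array(means)

# Toy run: AR(1) latent state observed in Gaussian noise, with occasional outliers
# contaminating the observations.
rng = np.random.default_rng(0)
T, a, sigma = 100, 0.9, 1.0
x_true = np.zeros(T)
for t in range(1, T):
    x_true[t] = a * x_true[t - 1] + rng.normal()
ys = x_true + sigma * rng.normal(size=T)
ys[::20] += 10.0
print(generalised_bootstrap_filter(ys, rng=rng)[:5])
```

With beta small the behaviour is close to a standard bootstrap filter; larger beta downweights the contaminated observations more aggressively.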
Citation information
A. Boustati, Ö. D. Akyildiz, T. Damoulas, and A. M. Johansen (2020). Generalized Bayesian filtering via sequential Monte Carlo. In Advances in Neural Information Processing Systems 33 (NeurIPS). To appear.