Poster

Recurrent Neural Networks and Universal Approximation of Bayesian Filters

Adrian N. Bishop · Edwin V. Bonilla

Auditorium 1 Foyer 96

Abstract:

We consider the Bayesian optimal filtering problem: i.e. estimating some conditional statistics of a latent time-series signal from an observation sequence. Classical approaches often rely on the use of assumed or estimated transition and observation models. Instead, we formulate a generic recurrent neural network framework and seek to learn directly a recursive mapping from observational inputs to the desired estimator statistics. The main focus of this article is the approximation capabilities of this framework. We provide approximation error bounds for filtering in general non-compact domains. We also consider strong time-uniform approximation error bounds that guarantee good long-time performance. We discuss and illustrate a number of practical concerns and implications of these results.
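The core idea of the framework can be illustrated with a minimal sketch (hypothetical, not the authors' code): an RNN whose hidden state is updated recursively from each new observation, with the desired estimator statistic read out from the hidden state at every step. The weights here are randomly initialised stand-ins for parameters that would be learned from data.

```python
import numpy as np

rng = np.random.default_rng(0)

d_obs, d_hidden, d_stat = 1, 16, 1  # observation, hidden, and statistic dims

# Randomly initialised weights stand in for learned parameters.
W = rng.normal(scale=0.1, size=(d_hidden, d_hidden))  # recurrence
U = rng.normal(scale=0.1, size=(d_hidden, d_obs))     # observation input
V = rng.normal(scale=0.1, size=(d_stat, d_hidden))    # readout

def rnn_filter(observations):
    """Recursively map an observation sequence to per-step estimator statistics."""
    h = np.zeros(d_hidden)
    estimates = []
    for y in observations:
        h = np.tanh(W @ h + U @ y)   # recursive update: depends only on (h, y)
        estimates.append(V @ h)      # statistic at time t, e.g. an estimate of E[X_t | y_{1:t}]
    return np.array(estimates)

# Usage: filter a toy observation sequence of length 50.
ys = rng.normal(size=(50, d_obs))
est = rnn_filter(ys)
print(est.shape)  # one statistic per time step: (50, 1)
```

The key structural point, matching the abstract, is that no transition or observation model appears in the recursion: the mapping from inputs to estimates is learned end to end, and the paper's results bound how well such a recursion can approximate the Bayes-optimal filter.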