How local particle filters can be used to solve filtering and smoothing problems in Hidden Markov Models

Lead Research Organisation: University of Bristol
Department Name: Mathematics

Abstract

Hidden Markov Models (HMMs) are widely applied in real life, from traffic models to a variety of biological and neural models. This breadth of application has increased the demand for precise and efficient algorithms that solve the filtering and smoothing problems. The literature offers several methods for computing the filtering and smoothing distributions of the underlying model, but two problems emerge as the dimension increases. Firstly, as shown by Rebeschini and Van Handel (Can local particle filters beat the curse of dimensionality?, 2015), the particle filter (a well-known algorithm that estimates the filtering distribution) is affected by the curse of dimensionality. Secondly, even when the quantities of interest can be calculated in closed form, the growth in dimension can make the computational cost infeasible. As explained in detail by Rebeschini and Van Handel (2015), it is possible to address this problem by developing local particle filters with a dimension-free error, which also have the advantage of being computationally cheaper.
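
To make the filtering problem concrete, the following is a minimal sketch of a standard bootstrap particle filter, written in Python with NumPy. It is not the local particle filter discussed in this project; the function names and interfaces are illustrative placeholders, and the model-specific samplers and densities are assumed to be supplied by the user.

    # A minimal bootstrap particle filter sketch, included only to make the
    # filtering problem concrete. It is NOT the local particle filter of
    # Rebeschini and Van Handel; all names below are illustrative placeholders.
    import numpy as np

    def bootstrap_particle_filter(observations, sample_initial, sample_transition,
                                  obs_density, n_particles=1000, rng=None):
        """Approximate the filtering distributions of a generic HMM with particles.

        observations                 : iterable of observations y_1, ..., y_T
        sample_initial(n)            : draws n particles from the initial distribution
        sample_transition(particles) : propagates the particles through the kernel
        obs_density(y, particles)    : observation density g(y | x) per particle
        """
        rng = np.random.default_rng() if rng is None else rng
        particles = sample_initial(n_particles)
        filtering = []
        for y in observations:
            # Propagate each particle through the (assumed known) transition kernel.
            particles = sample_transition(particles)
            # Weight by the observation density and normalise the weights.
            weights = np.asarray(obs_density(y, particles), dtype=float)
            weights = weights / weights.sum()
            filtering.append((particles.copy(), weights.copy()))
            # Multinomial resampling to counteract weight degeneracy.
            idx = rng.choice(n_particles, size=n_particles, p=weights)
            particles = particles[idx]
        return filtering

In high dimensions the weights of such a filter collapse onto very few particles; this is the curse of dimensionality referred to above and the motivation for local particle filters with a dimension-free error.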

The starting point of the research is an HMM whose hidden Markov chain is an n-dimensional sequence of 0's and 1's. Since the state space is the n-fold product {0,1}^n, it has 2^n elements, so the curse of dimensionality can be an issue. Because the state space is finite, the filtering and smoothing distributions can be computed exactly via a forward and a backward step. However, these operations involve matrices with 2^n rows and 2^n columns, which quickly become too expensive to handle computationally. A possible workaround is to accept a small approximation error in exchange for speed.

This approximation is obtained by assuming a local structure inside the model and a particular factorization of the observation distribution. The first assumption is that the Markov process X, at any fixed time, admits a local structure, meaning that each component depends only on a set of neighbouring components. The second assumption is that the process Y, at any fixed time, is drawn from a distribution G admitting a non-degenerate representation, i.e. there exists a positive observation density g which factorizes over the components. The last assumption is that the initial measure also factorizes. Under these assumptions the forward step can be redefined to operate only on matrices of reduced dimension, at the cost of a small approximation error (the exact recursion it approximates is sketched below).

Once the corresponding implementation is shown to work properly, the next aim will be to modify the EM algorithm to cover the case in which all the parameters of the model (initial distribution, transition kernel, etc.) are unknown. This case has an application to traffic models: in the temporal evolution of a road network, each edge can be modelled by a component whose value is either congested or not congested. This state of a road is not directly observable; only the number of cars on each road is observed. Here the local structure is a reasonable assumption, because the traffic on a road is influenced only by the closest roads. The next step would be to apply the algorithm to this freeway traffic model, using real-world data.
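
As a reference point for the reduced-dimension forward step described above, here is a hedged sketch, again in Python with NumPy, of the exact forward filtering recursion on the finite state space {0,1}^n. It is not the local algorithm itself; it only makes visible the 2^n-by-2^n transition matrix that the local approximation is meant to avoid, and the state ordering, the function g and the inputs are assumptions made for illustration.

    # Exact forward (filtering) recursion on the state space {0,1}^n.
    # This is only a baseline sketch illustrating the 2^n blow-up; the local,
    # reduced-dimension forward step described above is not implemented here.
    import itertools
    import numpy as np

    def exact_forward_filter(mu0, P, g, observations):
        """mu0          : length-2^n vector, initial distribution over all states
           P            : (2^n, 2^n) transition matrix of the hidden chain
           g(y, x)      : positive observation density, x a tuple in {0,1}^n
           observations : list of observations y_1, ..., y_T
        States are assumed ordered as itertools.product((0, 1), repeat=n).
        Returns the filtering distributions pi_t(x) = P(X_t = x | y_1, ..., y_t)."""
        n = int(round(np.log2(len(mu0))))
        states = list(itertools.product((0, 1), repeat=n))  # all 2^n configurations
        pi = np.asarray(mu0, dtype=float)
        filtering = []
        for y in observations:
            pred = P.T @ pi                              # prediction step: O(4^n) cost
            lik = np.array([g(y, x) for x in states])    # observation densities
            pi = pred * lik                              # Bayes update ...
            pi = pi / pi.sum()                           # ... and normalisation
            filtering.append(pi.copy())
        return filtering

Under the assumptions listed above, the factorization of g, of the initial measure, and of the transition structure over neighbourhoods is what allows this recursion to be replaced by operations on much smaller matrices, at the price of a controlled approximation error.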

As a natural continuation of the research, it will be interesting to study generalizations of this algorithm, applying the approximation not only to the "ideal" finite state space considered here but also to more general state spaces. If that is possible, the method can be adapted to a much wider range of applications.

Publications


Studentship Projects

Project Reference   Relationship   Related To     Start        End          Student Name
EP/N509619/1                                      01/10/2016   30/09/2021
1961576             Studentship    EP/N509619/1   18/09/2017   31/03/2021   Lorenzo Rimella