Stein's method for functions of multivariate normal random vectors: asymptotic expansions, rates of convergence and applications

Lead Research Organisation: University of Manchester
Department Name: Mathematics

Abstract

Stein's method is a powerful technique for bounding the distance between two probability distributions with respect to a probability metric. It was originally developed for normal approximation by Charles Stein in 1972. Since then, Stein's method has been extended to many other distributional limits, including the Poisson, binomial and exponential distributions, and has found numerous applications throughout the mathematical sciences.

Recently, a framework has been developed for using Stein's method to prove quantitative limit theorems in which the limiting distribution can be expressed as a smooth function of a multivariate normal random vector. That is, the prelimit random vector is of the form g(W) and the limiting random vector is of the form g(Z), where g is a smooth function and W is a random vector that is well approximated by a multivariate normal random vector Z. Many of the most important probabilistic limit theorems fall into this class, including the central limit theorem and the chi-square approximation of Pearson's chi-square statistic. This project will explore the fruitful research directions opened by the development of Stein's method for functions of multivariate normal random vectors.
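To fix notation (the symbols W_n, h and C below are illustrative and do not appear in the project description), the prototypical setting takes W to be a standardised sum of independent random vectors and seeks bounds of the form

\[
W_n = \frac{1}{\sqrt{n}} \sum_{i=1}^{n} X_i, \qquad
\left| \mathbb{E}\, h\bigl(g(W_n)\bigr) - \mathbb{E}\, h\bigl(g(Z)\bigr) \right| \le \frac{C}{\sqrt{n}},
\]

where h ranges over a class of smooth test functions, Z is a multivariate normal random vector with the same mean and covariance matrix as W_n, and C is an explicit constant that does not depend on n.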

The first stage of the project will involve extending the classes of functions g to which the existing theory applies: the student will extend the theory to vector-valued functions g whose derivatives have exponential growth. The student will present several general bounds and asymptotic expansions (with control on the error term) for the distance between the distributions of g(W) and g(Z), with respect to a smooth test function metric, in the case that W is a standardised sum of random vectors. It is expected that faster convergence rates will occur when g is an even function or under certain matching moment assumptions. This work will generalise several significant results from the Stein's method literature.
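As a rough indication of the shape of such an expansion (the coefficients and remainder below are placeholders rather than results of the project), the aim is for statements of the form

\[
\mathbb{E}\, h\bigl(g(W)\bigr) = \mathbb{E}\, h\bigl(g(Z)\bigr) + \frac{a_1(h)}{\sqrt{n}} + \frac{a_2(h)}{n} + R_n,
\qquad |R_n| \le \frac{C}{n^{3/2}},
\]

with explicit coefficients a_1(h), a_2(h) and an explicit constant C. When g is an even function, or when suitable moments of the summands match those of the normal distribution, one expects the leading coefficient a_1(h) to vanish, so that the rate improves from order n^{-1/2} to order n^{-1}.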

The general bounds and asymptotic expansions will be used to obtain explicit error bounds and asymptotic expansions in the multivariate delta method, which is widely used in mathematical statistics. This will be the first detailed investigation into rates of convergence in the delta method. In particular, the student will obtain sufficient conditions under which 'faster than expected' order n^{-1} (or faster) convergence rates occur.
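For reference, the multivariate delta method in its standard textbook form states that if \(\sqrt{n}\,(T_n - \theta) \xrightarrow{d} N_d(0, \Sigma)\) and g is differentiable at \(\theta\) with \(\nabla g(\theta) \neq 0\), then

\[
\sqrt{n}\,\bigl(g(T_n) - g(\theta)\bigr) \xrightarrow{d} N\bigl(0, \nabla g(\theta)^\top \Sigma\, \nabla g(\theta)\bigr).
\]

The bounds developed in the project would quantify the rate of this convergence, with the sufficient conditions mentioned above identifying when it is faster than the generic order n^{-1/2}.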

Later in the project, the student will derive general bounds for the distributional approximation of functions of multivariate normal random vectors under at least one complicated dependence structure. As an important application of this theory, the student will derive explicit error bounds on the distributional approximation of an important statistic by its limiting distribution. Such results would be useful for statisticians and applied researchers, who would gain a theoretical justification of various rules of thumb used in the implementation of statistical tests, and potentially new insights into the factors governing convergence rates. Examples of statistics that the student may consider are the D_2 and D_2^* statistics, which are used in alignment-free sequence comparison.
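A minimal Python sketch of the D_2 statistic, assuming its standard definition as the number of k-word (k-mer) matches between two sequences, is given below; D_2^* is a centred and scaled variant and is not implemented here. The word length k, the function names and the toy sequences are illustrative choices, not part of the project description.

from collections import Counter

def kmer_counts(seq, k):
    # Count occurrences of every length-k word (k-mer) in seq.
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def d2(seq_a, seq_b, k):
    # D_2 statistic: sum over all k-words w of X_w * Y_w, where X_w and Y_w
    # are the numbers of occurrences of w in seq_a and seq_b respectively.
    counts_a = kmer_counts(seq_a, k)
    counts_b = kmer_counts(seq_b, k)
    return sum(counts_a[w] * counts_b[w] for w in counts_a if w in counts_b)

if __name__ == "__main__":
    # Toy DNA sequences; applications use long genomic sequences.
    print(d2("ACGTACGTGACG", "ACGTTGCAACGT", k=3))

Error bounds for the limiting approximation of such statistics would indicate how long the sequences need to be before the asymptotic null distribution can be trusted in practice.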

Studentship Projects

Project Reference | Relationship | Related To   | Start      | End        | Student Name
EP/T517823/1      |              |              | 01/10/2020 | 30/09/2025 |
2481303           | Studentship  | EP/T517823/1 | 01/10/2020 | 31/03/2024 | Heather Sutcliffe