Unconventional Monetary Policies in the UK, estimating their impact using shrinkage and persistent volatility

Lead Research Organisation: King's College London
Department Name: School of Management and Business


Two main strands in the econometric literature for macroeconomics and finance are (i) structural change and volatility modelling and (ii) high-dimensional regression. For these two strands we propose a framework with the potential to make a number of contributions. Below we briefly describe the setup and goals of the research idea and illustrate some empirical applications where our methodology can be applied.

First, empirical work has concluded that variation in volatility is very persistent, a feature reflected in estimated parameters that lie close to the boundary of stationarity. This implies that volatility can be characterised by persistent and possibly non-stationary processes. In this context we propose a kernel volatility estimator (KVE) that has the potential to fit the observed behaviour of volatility adequately. Our estimator requires only a small set of assumptions and can disentangle persistent forms of volatility from lower-persistence ones such as ARCH and stochastic volatility (SV). Monte Carlo (MC) simulations are required to illustrate this ability. The potential gains of this framework are, first, that we do not impose linear forms of volatility and, second, that parameter estimates can be more precise if our estimator correctly captures the persistent volatility that may be the main driver of the data.
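To make the idea concrete, a minimal sketch of a kernel-smoothed volatility path is given below. This is not the project's KVE itself: the function name, the Gaussian kernel, and the bandwidth choice are all illustrative assumptions. It simply smooths squared residuals over rescaled time, which is one standard way to let volatility drift persistently without imposing a parametric (e.g. linear ARCH-type) law of motion.

```python
import numpy as np

def kernel_volatility(residuals, bandwidth=0.1):
    """Nadaraya-Watson smoothing of squared residuals on rescaled time
    t/T with a Gaussian kernel, producing a slowly drifting, possibly
    non-stationary variance path.  Illustrative sketch only; the
    project's KVE and its assumptions are not reproduced here."""
    T = len(residuals)
    u = np.arange(T) / T               # rescaled time in [0, 1]
    sq = residuals ** 2                # squared residuals proxy variance
    # Gaussian kernel weights between every pair of time points
    w = np.exp(-0.5 * ((u[:, None] - u[None, :]) / bandwidth) ** 2)
    w /= w.sum(axis=1, keepdims=True)  # normalise rows (handles edges)
    return w @ sq                      # smoothed variance path

# Simulate a series whose variance drifts smoothly upward, i.e. a
# persistent, non-stationary volatility process
rng = np.random.default_rng(0)
T = 500
true_var = 0.5 + (np.arange(T) / T) ** 2   # rises from 0.5 towards 1.5
y = rng.normal(0.0, np.sqrt(true_var))
est = kernel_volatility(y)
```

On this simulated series the smoothed path `est` tracks the upward drift in variance that a stationary parametric volatility model would struggle to accommodate.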

Second, the growing availability of large datasets over the last ten years can assist econometricians, under the assumption that these datasets carry rich and relevant information. Their size, however, creates an "ill-posed" problem in which even basic methods such as ordinary least squares break down. The literature offers many ways to tackle this problem, which can be separated into the sparse-regression and dimensionality-reduction frameworks. Their aims are, first, to reduce the dimension of the regressors by producing factors that carry maximal informative content, in terms of correlations, and, second, to exclude regressors that carry minimal information for inference. The main approach is to augment the classical least-squares minimisation problem with penalties that induce shrinkage and dimensionality reduction, commonly on the first and second norms. The best-known estimators are the Lasso, Ridge regression, Sparse Partial Least Squares and the elastic net. In this context we propose a different estimator that generalises the above: rather than imposing a specific number of norm penalties, we envision an estimator that includes a large number of norm penalties, whose performance is assessed by cross-validation and out-of-sample forecasting. The benefit of this procedure is that, while agnostic a priori about the number of norm penalties to include, it produces a penalisation scheme that can adapt differently to each dataset and potentially yield better results. Assessing the performance of this estimator requires MC simulations with synthetic datasets.
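A toy version of such a multi-norm penalised estimator can be sketched as follows. This is an illustrative assumption, not the project's estimator: the objective simply adds a user-supplied dictionary of norm penalties to the least-squares fit, so that {1: λ} recovers a Lasso-like problem, {2: λ} a Ridge-like one, and {1: λ₁, 2: λ₂} an elastic-net-like one, with further entries giving the more general scheme described above. The cross-validation over penalty weights mentioned in the text is omitted.

```python
import numpy as np
from scipy.optimize import minimize

def multi_norm_penalised_ls(X, y, penalties):
    """Minimise  ||y - Xb||^2 + sum_q lam_q * sum_i |b_i|^q  over b,
    where `penalties` maps norm orders q to weights lam_q.  A sketch of
    a generalised shrinkage estimator; derivative-free Powell search is
    used because the q=1 penalty is non-differentiable at zero."""
    def objective(b):
        fit = np.sum((y - X @ b) ** 2)
        pen = sum(lam * np.sum(np.abs(b) ** q)
                  for q, lam in penalties.items())
        return fit + pen
    b0 = np.zeros(X.shape[1])
    return minimize(objective, b0, method="Powell").x

# Synthetic dataset with a sparse true coefficient vector, as in the
# MC exercises the text describes
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 5))
beta_true = np.array([2.0, 0.0, -1.5, 0.0, 0.0])
y = X @ beta_true + 0.1 * rng.normal(size=100)
b_hat = multi_norm_penalised_ls(X, y, {1: 1.0, 2: 0.5})
```

In a fuller implementation the number of penalties and their weights would themselves be selected by cross-validation and out-of-sample forecast accuracy, which is the agnostic selection step the proposal emphasises.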

In terms of empirical applications, our estimators are natural candidates for examining, first, the volatility present in stock indices and whether parameter estimates obtained from the KVE procedure forecast better. Further, both the KVE and the shrinkage estimator can help us examine the effects that unconventional monetary policies, as employed by the Bank of England, had on the real economy, and whether these effects deteriorated after their initial deployment.


Publication: Chronopoulos I. (2021) "Kernel-based Volatility Generalised Least Squares", Econometrics and Statistics

Studentship Projects

Project Reference: ES/P000703/1, 30/09/2017 to 29/09/2027
Studentship: 1916649 (related to ES/P000703/1), 30/09/2017 to 31/12/2020, Student: Ilias Christos Chronopoulos
Description: Through this award we have created new econometric/statistical estimation frameworks that can provide better inference when the underlying processes are persistently time-varying and potentially non-stationary. This is in line with the key empirical finding that macroeconomic and financial datasets are extremely persistent. We have shown that inference with our estimators is more robust and efficient, and we have provided both new robust estimators and a testing scheme. Our new econometric methodologies are general and not constrained to examining the effects of unconventional monetary policies.
Exploitation Route: Our hope is that our estimation frameworks will be integrated into the statistical/econometric toolboxes of third parties, including those of central banks.
Sectors: Financial Services; Management Consultancy