Optimal Forecasting of Time Series

Lead Research Organisation: Imperial College London
Department Name: Electrical and Electronic Engineering


Efficient time series forecasting is crucial in a wide variety of applications. There is a plethora of examples where sequential data with temporal structure are represented as a function of time. From the progression of a disease or weather forecasting to the price of a stock and the dynamics of a social network, time series are virtually ubiquitous in society and in nature. Being able to predict the evolution of such quantities is essential for developing appropriate techniques to deal with future outcomes.

Recurrent Neural Networks (RNNs) are an important class of neural networks for processing sequential data. They are connectionist models that capture the dynamics of sequences via cycles in the network of nodes. Unlike standard feedforward neural networks, recurrent networks retain a state that can represent information from an arbitrarily long context window. In this way, they can learn temporal dependencies between sequential data and thus constitute an effective tool for many analyses involving such data.
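The recurrent state update described above can be illustrated with a minimal vanilla RNN cell in NumPy; the dimensions and weights here are arbitrary illustrations, not the project's actual model:

```python
import numpy as np

rng = np.random.default_rng(0)

input_dim, hidden_dim = 4, 8
W_xh = rng.standard_normal((hidden_dim, input_dim)) * 0.1   # input-to-hidden weights
W_hh = rng.standard_normal((hidden_dim, hidden_dim)) * 0.1  # hidden-to-hidden (recurrent) weights
b_h = np.zeros(hidden_dim)

def rnn_forward(xs):
    """Run the cell over a sequence xs of shape (T, input_dim)."""
    h = np.zeros(hidden_dim)  # initial hidden state
    states = []
    for x in xs:
        # The new state depends on the current input AND the previous state,
        # which is how the network carries context along the sequence.
        h = np.tanh(W_xh @ x + W_hh @ h + b_h)
        states.append(h)
    return np.stack(states)

sequence = rng.standard_normal((5, input_dim))
states = rnn_forward(sequence)
print(states.shape)  # (5, 8): one hidden state per timestep
```

Because `h` is fed back into the update, information from early timesteps can, in principle, influence the state arbitrarily far into the future.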

Although RNNs have proved successful in many different areas, they suffer from certain limitations. When applied to very high-dimensional inputs, they are often difficult and computationally expensive to train. This is mainly because a huge weight matrix connecting the input and hidden layers has to be stored, and potentially millions of parameters have to be learnt. As a result, the applicability of RNNs is largely confined to high-performance systems with powerful Central Processing Units (CPUs) and, notably, Graphics Processing Units (GPUs), making them inefficient to deploy on everyday computers.
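A back-of-the-envelope calculation makes the scale of the problem concrete; the input size below (a small flattened RGB video frame) is an illustrative assumption:

```python
# Parameter count for the input-to-hidden weight matrix alone of a vanilla RNN.
input_dim = 160 * 120 * 3   # a 160x120 RGB frame flattened: 57,600 inputs
hidden_dim = 256

dense_params = input_dim * hidden_dim
print(f"{dense_params:,}")  # 14,745,600
```

Even this modest configuration already requires storing and training nearly fifteen million weights in a single matrix, before counting the recurrent or output layers.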

Recently, the use of tensors was proposed to overcome such limitations in fully connected neural network architectures. Tensors may be represented as multidimensional arrays, to which various tensor decompositions can be applied, offering significant advantages. In the case of RNNs, representing the input-to-hidden weight matrix as a high-dimensional tensor and then factorizing this tensor with the tensor-train decomposition can be of great practical value. Such an approach compresses the network tremendously by reducing the number of weights it uses. As a result, the resulting tensor-train RNN (TT-RNN) not only vastly enhances the applicability of RNNs to commodity hardware, but is also expected to require less training data, since far fewer parameters have to be estimated.
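The saving comes from storing a chain of small TT-cores instead of the full matrix. A rough sketch of the parameter count, with illustrative mode sizes and TT-ranks chosen here for the example (the actual factorisation would be a design choice of the project):

```python
import math

# Factor the 57,600 x 256 input-to-hidden matrix into a tensor whose
# modes multiply back to the original dimensions.
in_modes  = [8, 20, 20, 18]   # 8*20*20*18 = 57,600 inputs
out_modes = [4, 4, 4, 4]      # 4*4*4*4   = 256 hidden units
ranks = [1, 4, 4, 4, 1]       # TT-ranks; r_0 = r_d = 1 by definition

dense_params = math.prod(in_modes) * math.prod(out_modes)

# Each TT-core G_k has shape (r_{k-1}, m_k, n_k, r_k), so the total
# TT parameter count is the sum of the core sizes.
tt_params = sum(ranks[k] * in_modes[k] * out_modes[k] * ranks[k + 1]
                for k in range(len(in_modes)))

print(dense_params, tt_params, round(dense_params / tt_params))
```

With these (hypothetical) ranks, the TT format stores 2,976 parameters in place of 14,745,600, a compression of roughly three orders of magnitude, at the cost of restricting the matrix to low TT-rank structure.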

Inspired by such approaches and by my strong interest in time series analysis, I plan to investigate the use of TT-RNNs for time series forecasting. Traditional forecasting approaches rely on autoregressive or similar models, which capture only linear dynamics and interactions between time series. Although linear dynamics are convenient to work with, many processes that arise in nature and society are highly non-linear and demand more sophisticated models for accurate analysis.
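For reference, the linear baseline mentioned above can be sketched as a classical AR(p) model fitted by ordinary least squares; the process parameters here are made up for the example:

```python
import numpy as np

rng = np.random.default_rng(1)

def fit_ar(series, p):
    """Estimate AR(p) coefficients by ordinary least squares."""
    # Column k holds the lag-(k+1) values aligned with the targets.
    X = np.column_stack([series[p - k - 1 : len(series) - k - 1] for k in range(p)])
    y = series[p:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def forecast_ar(series, coeffs, steps):
    """Roll the fitted linear recurrence forward `steps` times."""
    p = len(coeffs)
    history = list(series)
    for _ in range(steps):
        history.append(coeffs @ np.array(history[-1 : -p - 1 : -1]))
    return np.array(history[len(series):])

# Simulate an AR(2) process and recover its coefficients.
true = np.array([0.6, -0.2])
x = np.zeros(500)
x[:2] = rng.standard_normal(2)
for t in range(2, 500):
    x[t] = true[0] * x[t - 1] + true[1] * x[t - 2] + 0.1 * rng.standard_normal()

coeffs = fit_ar(x, 2)
preds = forecast_ar(x, coeffs, 5)
print(np.round(coeffs, 2), preds.shape)
```

The forecast at every step is a fixed linear combination of past values, which is precisely the restriction that non-linear models such as RNNs are meant to lift.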

Ideally, a model that attempts to forecast the temporal evolution of a time series should take into account its own past as well as the behaviour of other series that may directly or indirectly interact with it. I believe that the computational advantage provided by the tensor approach, coupled with the ability of RNNs to capture process dynamics and inter-process dependencies, can deliver state-of-the-art long-term forecasting performance on very high-dimensional multivariate non-linear time series. The aim of my PhD project is therefore to investigate this combination of techniques in depth, in order to produce highly efficient computational methods that yield optimal prediction accuracy for various types of time series. The project aligns with the Information and Communication Technologies theme and with the Artificial Intelligence Technologies research area.



Studentship Projects

Project Reference | Relationship | Related To   | Start      | End        | Student Name
EP/R513052/1      |              |              | 01/10/2018 | 30/09/2023 |
2283842           | Studentship  | EP/R513052/1 | 01/10/2019 | 31/03/2023 | Kriton Konstantinidis