Information geometric theory of neural information processing and disorder

Lead Research Organisation: Coventry University
Department Name: Ctr for Fluid and Complex Systems

Abstract

Information processing is a feature shared by seemingly different nonlinear complex dynamical systems. In particular, there has been growing interest in information geometry, which applies differential geometry to probability and statistics by defining a notion of metric (distance) on statistical manifolds. This provides a powerful method for understanding stochastic processes for both theoretical and practical purposes. Conceptually, assigning a metric to probability density functions (PDFs) allows us to quantify the difference between PDFs and thus to establish an elegant link between stochastic processes, complexity, and geometry. This project aims to develop a new model-free information geometric theory of neural information processing, with the practical goal of improved disorder diagnosis, by overcoming the current challenges described below.
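To illustrate the notion of metric referred to above (a standard construction in information geometry, given here for orientation rather than as the specific formulation this project will develop), the Fisher information defines a Riemannian metric on a family of PDFs p(x; θ) parametrised by θ, and hence an infinitesimal distance between nearby PDFs:

```latex
g_{ij}(\theta) = \int p(x;\theta)\,
  \frac{\partial \ln p(x;\theta)}{\partial \theta^{i}}\,
  \frac{\partial \ln p(x;\theta)}{\partial \theta^{j}}\,\mathrm{d}x,
\qquad
\mathrm{d}s^{2} = \sum_{i,j} g_{ij}(\theta)\,\mathrm{d}\theta^{i}\,\mathrm{d}\theta^{j}.
```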

Brains are complex information-processing organs whose proper function across different regions is indispensable for our well-being. Many critical tasks, such as understanding neural information processing and diagnosing neurological disorders, require identifying not only regional activation but also the causal connectivity among different regions of the brain and the simplest possible circuit that can explain observed responses. In particular, causality (effective connectivity) analysis offers new diagnostic opportunities for a whole range of neurological disorders. The study of classical structural connectivity is therefore complemented by functional connectivity and, crucially, by causal analysis through statistical modelling of neurophysiological signals (e.g., functional magnetic resonance imaging and electroencephalography (EEG)).
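As a minimal illustration of the distinction between functional and effective (causal) connectivity drawn above, the following Python sketch contrasts zero-lag correlation with a simple Granger-style prediction comparison on synthetic signals. This is not the information geometric method the project proposes; the signals, coupling strength, and lag are assumptions made purely for illustration, not features of any clinical data.

```python
# Toy contrast between functional connectivity (zero-lag correlation) and a
# Granger-style effective-connectivity check (does the past of y reduce the
# prediction error of x?). Purely illustrative; signals are synthetic.
import numpy as np

rng = np.random.default_rng(2)

# Synthetic coupled signals: y drives x with a one-step delay.
n = 5000
y = rng.normal(size=n)
x = np.empty(n)
x[0] = rng.normal()
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + 0.4 * y[t - 1] + 0.1 * rng.normal()

# Functional connectivity: instantaneous correlation between x and y.
print("zero-lag correlation:", np.corrcoef(x, y)[0, 1])

# Granger-style check: compare one-step prediction of x from its own past
# with prediction from its own past plus the past of y.
X_own = np.column_stack([x[:-1], np.ones(n - 1)])
X_full = np.column_stack([x[:-1], y[:-1], np.ones(n - 1)])
target = x[1:]

res_own = target - X_own @ np.linalg.lstsq(X_own, target, rcond=None)[0]
res_full = target - X_full @ np.linalg.lstsq(X_full, target, rcond=None)[0]

print("residual variance, x-past only      :", res_own.var())
print("residual variance, x-past and y-past:", res_full.var())
# A clear drop in residual variance when y's past is included indicates that
# y carries predictive (Granger-causal) information about x, even though the
# zero-lag correlation is negligible.
```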

The main challenges in neurological signal analysis stem from the uncertain, time-varying nonlinear dynamics of the human brain. Data are generally non-stationary and non-Gaussian, and the mean, variance, and higher moments can change abruptly in time. Such data cannot be adequately quantified by the traditional formulations of transfer entropy and Granger causality, which assume stationary or Gaussian data. Furthermore, underlying mathematical models are not always available to fit the data, and a low signal-to-noise ratio often hampers accurate analysis. It is therefore critical to develop a model-free method that can effectively quantify dynamic changes in data.
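To make the non-stationarity problem concrete, the following sketch (with purely illustrative, assumed parameter values rather than real EEG data) shows how a single stationary Gaussian fit blurs an abrupt variance change that simple sliding-window statistics reveal:

```python
# Minimal illustration (not project code) of why a single stationary Gaussian
# fit misrepresents a signal whose variance changes abruptly in time.
# All parameter values below are arbitrary choices for the sketch.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic non-stationary signal: the noise amplitude doubles halfway through.
n = 4000
x = np.concatenate([rng.normal(0.0, 1.0, n // 2),
                    rng.normal(0.0, 2.0, n // 2)])

# A global (stationary) Gaussian fit blurs the two regimes into one variance.
print(f"global mean = {x.mean():+.3f}, global std = {x.std():.3f}")

# Sliding-window moments reveal the abrupt change that the global fit hides.
win = 500
for start in range(0, n - win + 1, win):
    seg = x[start:start + win]
    print(f"t in [{start:4d}, {start + win:4d}): "
          f"mean = {seg.mean():+.3f}, std = {seg.std():.3f}")
```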

To address these challenges, we will take our leading-edge research on information geometry as our starting point and develop the method further to quantify non-stationary time-varying effects, nonlinearity, and non-Gaussian stochasticity as effectively as possible. To this end, we propose a one-year, synergistic programme of theoretical and computational studies and data analysis that harnesses the complementary skills of our team. Specifically, we will: i) extend our theory to nonlinear/multiple variables; ii) numerically simulate simple neural activity models (see the sketch below). In parallel, we will: iii) apply our new methods to analyse anonymised EEG data from healthy control groups and from patients with certain neurological disorders (e.g., epilepsy, Parkinson's disease with normal cognitive function, Alzheimer's disease). In particular, we will compare information processing and brain connectivity among key regions of the brain in healthy control groups and patients and identify their similarities and differences. We will then develop biomarkers to diagnose neurological disorders, such as seizures, and to track disease progression while exploring clinical implications. This project will be a stepping stone for future proposals addressing other practical challenges, given the increasingly important role of information theory across disciplines.
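As one possible starting point for step ii) (a minimal sketch under assumed settings; the model, parameters, ensemble size, and binning are illustrative choices rather than the project's actual design), an ensemble of stochastic FitzHugh-Nagumo neurons can be integrated with the Euler-Maruyama scheme so that the time-dependent PDF of the membrane variable, the basic object of an information geometric analysis, is estimated from histograms:

```python
# Euler-Maruyama integration of an ensemble of stochastic FitzHugh-Nagumo
# neurons, followed by histogram estimates of the time-dependent PDF of the
# membrane variable v. Model and parameter choices are illustrative only.
import numpy as np

rng = np.random.default_rng(1)

# FitzHugh-Nagumo parameters (common textbook values, chosen for illustration)
a, b, eps = 0.7, 0.8, 0.08      # recovery parameters and time-scale separation
I_ext = 0.5                     # constant external drive
sigma = 0.2                     # additive noise strength on v

dt, n_steps, n_ens = 0.01, 10000, 5000   # time step, step count, ensemble size

# Ensemble state: membrane variable v and recovery variable w
v = rng.normal(-1.0, 0.05, n_ens)
w = rng.normal(-0.5, 0.05, n_ens)

bins = np.linspace(-2.5, 2.5, 101)
snapshots = {}                  # time step -> histogram estimate of p(v, t)

for step in range(n_steps + 1):
    if step % 2500 == 0:
        # Density-normalised histogram approximates the PDF p(v, t)
        snapshots[step], _ = np.histogram(v, bins=bins, density=True)
    dv = (v - v**3 / 3.0 - w + I_ext) * dt \
         + sigma * np.sqrt(dt) * rng.normal(size=n_ens)
    dw = eps * (v + a - b * w) * dt
    v, w = v + dv, w + dw

# The evolving PDFs p(v, t) are the raw input for metric-based (information
# geometric) comparisons between different times or different conditions.
dv_bin = bins[1] - bins[0]
for step, p in snapshots.items():
    mass = p[bins[:-1] > 0.5].sum() * dv_bin
    print(f"t = {step * dt:6.1f}: P(v > 0.5) approx {mass:.3f}")
```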