Information geometric theory of neural information processing and disorder

Lead Research Organisation: Coventry University
Department Name: Ctr for Fluid and Complex Systems

Abstract

Information processing is shared among seemingly different nonlinear complex dynamics. There has been a growing interest in information geometry, which refers to the application of differential geometry to probability and statistics by defining the notion of a metric (distance) on statistical manifolds. It provides us with a powerful method for understanding stochastic processes for theoretical and practical purposes. Conceptually, assigning a metric to probability density functions (PDFs) enables us to quantify the difference between PDFs and thus to make an elegant link between a stochastic process, complexity, and geometry. This project aims to develop a new model-free information geometric theory of neural information processing for the practical purpose of improved disorder diagnosis by overcoming the current challenges described below.
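For illustration, the kind of metric quantity involved can be sketched with the information rate and information length used in our earlier information geometry work (the project's formal definitions may differ in detail):

    \Gamma^2(t) = \int \frac{1}{p(x,t)} \left( \frac{\partial p(x,t)}{\partial t} \right)^2 dx , \qquad \mathcal{L}(t) = \int_0^t \Gamma(t_1) \, dt_1 .

Here \Gamma(t) measures how quickly the time-dependent PDF p(x,t) changes (in units of inverse time), while the information length \mathcal{L}(t) measures the cumulative number of statistically distinguishable states the system passes through up to time t.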

Brains are complex information-processing organs whose proper function in different parts is indispensable for our optimal well-being. Many critical issues, such as understanding neural information processing and diagnosing neurological disorders, require the identification not only of regional activation but also of the causal connectivity among different regions of the brain and of the simplest possible circuit that explains observed responses. In particular, causality (effective connectivity) analysis offers new diagnostic opportunities for a whole range of neurological disorders. The study of classical structural connectivity is therefore complemented by functional connectivity and, crucially, by causal analysis through statistical modelling of neurophysiological signals (e.g., functional magnetic resonance imaging and electroencephalography (EEG)).

The main challenges in neurological signal analysis stem from the uncertain, time-varying nonlinear dynamics of the human brain. Data are generally non-stationary and non-Gaussian, and the mean, variance, or higher moments can change abruptly in time. Such data cannot be adequately quantified by the traditional formulations of transfer entropy and Granger causality, which assume stationary or Gaussian data. Furthermore, underlying mathematical models are not always available to fit the data, and a low signal-to-noise ratio often hampers accurate analysis. It is thus critical to develop a model-free method that can effectively quantify dynamic changes in data.

To meet these challenges, we will take our leading-edge research on information geometry as our starting point and develop the method further to quantify non-stationary time-varying effects, nonlinearity, and non-Gaussian stochasticity as effectively as possible. To this end, we propose a one-year, synergistic programme of theoretical and computational studies and data analysis that harnesses the complementary skills of our team. Specifically, we will: i) extend our theory to nonlinear/multiple variables; ii) numerically simulate simple models of neural activity. In parallel, we will: iii) apply our new methods to analyse anonymised EEG data from healthy control groups and from patients with certain neurological disorders (e.g., epilepsy, Parkinson's disease with normal cognitive function, Alzheimer's disease). In particular, we will compare information processing and brain connectivity among key regions of the brain in healthy control groups and patients and identify their similarities and differences. We will then develop biomarkers to diagnose neurological disorders, such as epileptic seizures, and to track disease progression while exploring the clinical implications. This project will be a stepping stone for future proposals addressing other practical challenges, given the increasingly important role of information theory across disciplines.
 
Description The main challenges in neurological signal analysis stem from the uncertain, time-varying nonlinear dynamics of the brain. Data from the brain are in general non-stationary and non-Gaussian, while the mean, variance, or higher moments change in time. To deal with such data, we developed time-dependent probability density functions and information geometry to identify interesting differences in neural information processing between control and patient groups. Specifically, from electroencephalography (EEG) signals simulated for both healthy subjects and Alzheimer's disease (AD) patients under eyes-closed and eyes-open conditions, we calculated information rates and causal information rates to quantify, respectively, the time evolution of the PDFs of the EEG signals and one signal's instantaneous influence on another signal's information rate. The results revealed significant differences between healthy subjects and AD patients. These distinctions can be related to differences in the neural information processing activities of the corresponding brain regions and to differences in the connectivity among these brain regions. We also found that the information rate and the causal information rate are superior to their more traditional information-theoretic counterparts, i.e., differential entropy and transfer entropy, respectively.
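As an illustration of how such a measure can be estimated from data, the following is a minimal Python sketch of the information rate computed from sliding-window histogram PDFs of a single signal. The function name, window parameters, and the toy signal are hypothetical choices for illustration only; the causal information rate additionally requires joint/conditional PDFs of two signals and is not shown here.

    import numpy as np

    def information_rate(signal, fs, win=256, step=32, bins=32):
        # Estimate Gamma(t) of a 1-D signal from sliding-window histogram PDFs
        # (illustrative sketch; parameter choices are arbitrary).
        edges = np.linspace(signal.min(), signal.max(), bins + 1)
        starts = np.arange(0, len(signal) - win, step)
        pdfs = []
        for s in starts:
            counts, _ = np.histogram(signal[s:s + win], bins=edges)
            pdfs.append((counts + 1e-12) / (counts.sum() + bins * 1e-12))  # regularised PDF
        pdfs = np.asarray(pdfs)
        dt = step / fs                                 # time between successive PDFs
        dpdt = np.diff(pdfs, axis=0) / dt              # finite-difference dp/dt
        gamma2 = np.sum(dpdt**2 / pdfs[:-1], axis=1)   # Gamma^2 = sum_x (dp/dt)^2 / p
        times = (starts[:-1] + win / 2) / fs
        return times, np.sqrt(gamma2)

    # Toy non-stationary test signal (not real EEG data)
    fs = 256.0
    t = np.arange(0, 30, 1 / fs)
    x = np.sin(2 * np.pi * (8 + 0.2 * t) * t) + 0.5 * np.random.randn(t.size)
    times, gamma = information_rate(x, fs)

A larger Gamma(t) indicates a faster change of the windowed PDF at that time; in practice the window length and bin count control the trade-off between temporal resolution and PDF estimation noise.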
Exploitation Route Since our novel information geometric measures are model-free, they can be applied to different EEG signals without requiring an underlying model and will form an important and powerful tool-set both for understanding neural information processing in the brain and for diagnosing different neurological disorders. I will promote clinical research and practice in complex brain processes, ranging from sensory integration to motor planning and control, including all components of cognitive functioning.
Furthermore, since information processing is shared among seemingly different nonlinear complex dynamics, our new interdisciplinary method can be utilised across a broad spectrum of EPSRC portfolios (e.g. Mathematical biology, Mathematical physics, Non-linear systems, Statistics, Complexity science, Fluid dynamics in Mathematical Sciences) as well as in other disciplines where information theory plays an increasingly important role. In particular, it will help analyse emerging big data, with the potential to support data-driven technologies. Also, since causality plays a crucial role in predicting, diagnosing, or controlling harmful events (e.g. diseases, extreme weather) in physics, neuroscience, biology, the environment, economics, etc., it will stimulate future research that can address key UK societal challenges and contribute to economic success.
Sectors Digital/Communication/Information Technologies (including Software), Healthcare, Other

 
Title Non-perturbative, time-varying statistical methods, information geometry 
Description Information geometry refers to the application of differential geometry to probability and statistics by defining the notion of a metric (distance) on statistical manifolds. In particular, it provides us with a powerful method for understanding stochastic processes for theoretical and practical purposes. Conceptually, assigning a metric to probability density functions (PDFs) enables us to quantify the difference between PDFs and thus to make an elegant link between a stochastic process, complexity, and geometry. We developed a new model-free information geometric theory of neural information processing for the practical purpose of improved disorder diagnosis by overcoming various current challenges. 
Type Of Material Improvements to research infrastructure 
Year Produced 2023 
Provided To Others? Yes  
Impact Elucidate complex brain processes, ranging from sensory integration to motor planning and control, including all components of cognitive functioning. In particular, the causality (effective connectivity) technique will contribute to the development of future research tools in clinical practice for understanding various elusive brain network functions. Benefit researchers working on laboratory and astrophysical fluids/plasmas (e.g. magnetically confined fusion plasmas), where causality quantification is crucial for understanding self-organisation, explosive events, and other important observed phenomena. Provide a new interdisciplinary method to be utilised in a broad spectrum of EPSRC portfolios (e.g. Mathematical biology, Mathematical physics, Non-linear systems, Statistics, Complexity science, Fluid dynamics in Mathematical Sciences) as well as other disciplines where information theory plays an increasingly important role. Benefit other researchers working on stochastic processes, mathematical modelling, non-equilibrium phenomena, intermittency, time-series analysis, information theory, data analysis, statistical analysis, and turbulence in diverse disciplines including physics, biomedical science, economics, engineering, and social science. The results of the project will be disseminated effectively through publication in scientific journals and presentations at scientific conferences. 
 
Description Engaging with academics and industry by a presentation 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Other audiences
Results and Impact We disseminated the latest outcomes of our research at the IEEE International Conference on Bioinformatics and Biomedicine (BIBM), a premier research conference in bioinformatics and biomedicine. IEEE BIBM is a leading forum for disseminating the latest research in bioinformatics and health informatics, bringing together academic and industrial scientists from computer science, biology, chemistry, medicine, mathematics, and statistics.
Year(s) Of Engagement Activity 2023