Inferring the evolution of functional connectivity over learning in large-scale neural recordings using low-tensor-rank recurrent neural networks

Lead Research Organisation: University of Edinburgh
Department Name: Sch of Informatics

Abstract

Humans and other animals can learn to perform complex and adaptive behaviours based on limited experience. Understanding the neural basis of learning is a key challenge in systems neuroscience and artificial intelligence (AI) that could lead to novel treatments for neurological disorders and enable the development of AI systems that learn with human-like efficiency. Thus, significant effort and funding are currently being invested, both in academic research organisations and in private companies such as Google DeepMind, to understand how neural circuits reorganise during learning to improve performance in cognitive, perceptual, and motor tasks.
Recent advances in neural recording technologies enable the activity of thousands of neurons to be tracked simultaneously at millisecond precision, and stably over days, so that neural activity can be surveyed over the entire course of learning. Through careful analysis of these recordings, scientists hope to determine how changes in the underlying neural circuit support improvements in task performance. In particular, learning is thought to modify the strength of connections between neurons, which leaves a functional signature that can be detected in the coordinated activity of interconnected groups of neurons. However, many other changes also take place during learning, including changes in motor behaviour, attention, and sensory input, all of which may influence the activity of the recorded neurons. A key challenge is therefore to disentangle learning-related changes in recorded neural activity from those arising from sensory, motor, and internal state variables that covary with learning. The proposed project will develop novel methodologies for the analysis of large-scale neural recordings to address this need.
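The low-rank RNN framework named in the project title can be sketched in a few lines. In such models the recurrent connectivity matrix is constrained to be a sum of a small number of rank-one components, so that network activity is driven into a correspondingly low-dimensional subspace; a low-*tensor*-rank model would additionally let these components drift slowly over learning, capturing changes in functional connectivity across sessions. The following is a minimal, hypothetical illustration of the static low-rank case only; all dimensions, scales, and dynamics are illustrative assumptions, not details of the project's actual methodology.

```python
import numpy as np

rng = np.random.default_rng(0)
N, R, T = 50, 2, 200  # neurons, rank, simulation timesteps (illustrative values)

# Low-rank connectivity: W = (1/N) * sum_r m_r n_r^T, with rank R << N.
# In a low-tensor-rank model, the factors m and n would additionally vary
# slowly over learning, making the weights a low-rank tensor across sessions.
m = rng.standard_normal((N, R))
n = rng.standard_normal((N, R))
W = (m @ n.T) / N

# Simple rate dynamics: x_{t+1} = x_t + dt * (-x_t + W @ tanh(x_t))
dt = 0.1
x = 0.1 * rng.standard_normal(N)
traj = np.empty((T, N))
for t in range(T):
    x = x + dt * (-x + W @ np.tanh(x))
    traj[t] = x

# Because W has rank R, the recurrent input to every neuron lies in the
# R-dimensional column space of m, confining the driven activity to a
# low-dimensional subspace that analyses like PCA/SVD can recover.
U, s, Vt = np.linalg.svd(traj - traj.mean(axis=0), full_matrices=False)
```

Fitting models of this form to recordings, rather than simulating them forward as above, is what allows changes in the inferred factors to be read out as changes in functional connectivity.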
