Novel analytical and data-sharing tools for rich neuronal activity datasets obtained with a 4096-electrode array

Lead Research Organisation: Newcastle University
Department Name: Institute of Neuroscience

Abstract

The functional intricacy of the central nervous system (CNS) arises from the complex anatomical and dynamic interactions between the different types of neurones involved in specific networks. Hence, the encoding of information in neural circuits occurs as a result of interactions between individual neurones, as well as through the interplay within both microcircuits (made of a few neurones) and large scale networks involving thousands to millions of cells. One of the great challenges of modern neuroscience is to understand how these neural networks are formed and how they operate. Such a challenge can be met only through simultaneous recording from the thousands of neurones that become active during specific neuronal tasks. One experimental approach to fulfil this goal is to use multielectrode arrays (MEAs), which consist of several channels (electrodes) that can each record from (and/or stimulate) a few adjacent neurones within a particular area of the CNS. MEAs can be used in vitro to record from dissociated neuronal cultures, from brain slices, or from isolated retinas. These MEAs consist of assemblies of electrodes embedded in planar substrates. Typical commercial MEAs consist of 60-128 electrodes with a spacing of 100-200 um. Considering that a generic neurone in the mammalian CNS has a diameter of about 10 um, it is clear that such MEAs cannot convey information on the activity of all neurones involved in a specific network, but only on a sample of these cells. To overcome this under-sampling of activity, in this project we will use the Active Pixel Sensor (APS) MEA, a novel type of MEA platform developed in a NEST-EU project by our collaborator Luca Berdondini (Italian Institute of Technology, Genova). This MEA consists of 4,096 electrodes with near cellular resolution (21x21 um electrodes, 42 um centre-to-centre separation, covering an active area of 2.5 mm x 2.5 mm), and can record from all channels at the same time.
We will use the APS MEA to record the spontaneous waves of activity that are present in the neonatal vertebrate retina. These waves occur during a short developmental period around the perinatal weeks, and they are known to play an important role in guiding the precise wiring of neural connections in the visual system, at both the retinal and extra-retinal levels. The APS MEA, thanks to its unmatched size and resolution, will give us unprecedented insight into the precise dynamics of these waves. Recordings from such large scale networks at near cellular resolution generate extremely rich datasets, with the drawback that these datasets are very large and difficult to handle; this necessitates new, powerful analytical tools that can decode, in a fast, efficient and user-friendly way, how cellular elements interact in the network. The development of such computational tools is the central goal of this project, while the experimental work on the retina provides a challenging and unique scientific context. The tools we plan to develop will yield parameters that will help us reach a better understanding of network function, from the temporal firing patterns of individual neurones to how activity propagates within the network. We will also develop novel tools for easier visualisation of the dynamical behaviour of activity within the network. These tools will be developed in a language that can be easily used by other investigators working with the same recording system or other platforms of their choice. Finally, to ensure that these tools are accessible to the wider neurophysiology community, they will be deployed on CARMEN (Code Analysis, Repository and Modelling for e-Neuroscience), a new internet-based neurophysiology sharing resource designed to facilitate worldwide communication between collaborating neurophysiologists.

Technical Summary

The complexity of neuronal communication arises from the exquisite precision of anatomical and functional connectivity within neuronal assemblies. To understand how neural connectivity is formed and operates, it is crucial to record simultaneously, at high spatiotemporal precision, from large scale neuronal networks. Multielectrode array (MEA) recordings have become one of the best experimental approaches for this purpose. Although MEAs offer excellent temporal resolution, their spatial resolution is poor: typical commercial MEAs consist of 60-128 electrodes with 30 um diameter and 200 um spacing, which is insufficient to study fine-grain spatiotemporal cellular interactions. In this project we will use the novel APS MEA platform developed by L. Berdondini and collaborators. The APS MEA is unique in terms of spatiotemporal resolution: it consists of 4,096 channels with near cellular resolution (21 um electrode diameter and separation) that can record simultaneously at a full frame rate of 7.8 kHz, high enough to reliably discriminate single spikes. We will use the APS MEA to record neonatal mouse retinal waves. Retinal waves undergo substantial changes in their spatiotemporal properties as the retina develops, and the APS MEA will enable us to investigate these properties with unprecedented precision. The generation of such large and rich datasets necessitates the development of powerful new computational analysis tools, and this will be the central goal of this project. We will develop user-friendly new statistical approaches to decode large neuronal networks, along with new computational and visualisation tools to quantify fine-grain spatiotemporal properties of neural networks. To allow community-wide access to these novel tools, they will be deployed on CARMEN, a new UK-based neurophysiology code development and data sharing facility developed over the past 3 years.
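As a rough illustration of why these recordings produce unusually large datasets, the raw acquisition rate can be estimated from the figures above (4,096 channels at 7.8 kHz). The 2-byte sample size below is an assumption for illustration, not a specification from the proposal:

```python
# Back-of-envelope estimate of the raw APS MEA data rate.
# BYTES_PER_SAMPLE is an assumed storage size, not a documented value.
N_CHANNELS = 4096        # electrodes recorded simultaneously
FRAME_RATE_HZ = 7800     # full frame rate of 7.8 kHz
BYTES_PER_SAMPLE = 2     # assumed 2 bytes per sample

bytes_per_second = N_CHANNELS * FRAME_RATE_HZ * BYTES_PER_SAMPLE
mb_per_second = bytes_per_second / 1e6
gb_per_hour = bytes_per_second * 3600 / 1e9

print(f"{mb_per_second:.1f} MB/s, about {gb_per_hour:.0f} GB per hour")
```

Even under this conservative assumption, a single hour of recording approaches a quarter of a terabyte, which is why dedicated analysis and sharing infrastructure is central to the project.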

Planned Impact

Although our project will be the first to use the APS MEA to study an intact neural network (the retina) rather than interactions between dissociated neurones, we have no doubt that the APS MEA (or similar developments) will soon become sought after by many neuroscientists seeking a deeper understanding of precise interactions within large neuronal assemblies. From that point of view, this project will provide a strong proof of concept for the development and use of large scale MEAs. An entire session on MEAs at the last Society for Neuroscience meeting in Chicago (October 17-21 2009, ~30,000 participants) revealed the fast growing interest of the neuroscience community in these large arrays, which are becoming increasingly sophisticated thanks to new developments in nanotechnology and microfabrication. At the same time, the APS MEA is clearly the best performing platform currently available (in terms of the number of channels that can be used at any one time at a high acquisition rate). The system was highly praised at the Chicago meeting, and we are in the very fortunate position of being at the forefront of research and development of the APS MEA through our collaboration with Luca Berdondini. We believe that we will be able to generate data of superior quality that will attract strong interest amongst neurophysiologists and computational neuroscientists. Because recordings with the APS MEA generate data without precedent in terms of spatiotemporal resolution, our results will undoubtedly shed new light on how to analyse, visualise and quantify neural network function, and this will be of great interest for the development of new software resources for systems neuroscientists. Because our experimental data and new analytical tools will be deployed on an open access data sharing facility, the impact of our research will extend much more widely and will facilitate the dissemination of important scientific knowledge. 
All aspects of our research fall within the remit of research supported by the BBSRC, with special emphasis on the Tools and Resources Development Fund.
 
Title Retinal wave picture 
Description I have created a composition of retinal waves recorded with the 4096-electrode array system, inspired by Andy Warhol. This work has been selected to be showcased in the new entrance lobby of the Institute of Neuroscience at Newcastle University 
Type Of Art Artwork 
Year Produced 2017 
Impact The artwork has only just been hung in the lobby, so it is too early to measure impact 
 
Description A large-scale, high density multielectrode array platform recording neural activity simultaneously with 4,096 electrodes was successfully installed at Newcastle, and the recordings were analysed at Edinburgh.
We found the array to be an excellent system for recording spontaneous neuronal activity from the neonatal retina (retinal waves) at a near cellular resolution.

The grant enabled us, as planned, to develop a suite of tools for the analysis of these large scale MEA recordings. In particular, we devised a new, improved detection method for neural spikes. This was necessary because we found that conventional methods were not suitable for these systems: the densely integrated electronic circuits produce a very different noise profile. Furthermore, we developed several methods for quantitative analysis of the spatio-temporal activity patterns observed in the developing retina.
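The report does not detail the detection algorithm itself. As a minimal illustrative sketch (not the project's actual method), one common approach sets the threshold from the median absolute deviation (MAD), a robust noise estimate that is less distorted by heavy-tailed, spike-contaminated noise than the standard deviation; the function name and parameters below are hypothetical:

```python
import numpy as np

def detect_spikes(trace, fs, thresh_mads=5.0, refractory_ms=1.0):
    """Threshold-crossing spike detection on one channel (illustrative).

    The MAD-based noise estimate copes better than the standard
    deviation with the non-Gaussian noise of densely integrated arrays.
    """
    # robust noise estimate: MAD scaled to match sigma for Gaussian noise
    noise = np.median(np.abs(trace - np.median(trace))) / 0.6745
    threshold = -thresh_mads * noise  # detect negative-going peaks
    # indices where the trace crosses the threshold downwards
    crossings = np.flatnonzero(
        (trace[1:] < threshold) & (trace[:-1] >= threshold)) + 1
    # enforce a refractory period so each spike is counted once
    min_gap = int(refractory_ms * fs / 1000)
    spikes, last = [], -min_gap
    for idx in crossings:
        if idx - last >= min_gap:
            spikes.append(idx)
            last = idx
    return np.array(spikes, dtype=int)
```

A per-channel threshold like this adapts automatically to the noise level of each electrode, which matters when thousands of channels have heterogeneous noise profiles.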

The analysis of retinal waves with these methods revealed several previously unknown features. Most strikingly, we found that late stage waves appear in spatially constrained hot-spots, which move over the course of hours (in vitro). Our analysis also enabled us to publish the most comprehensive characterisation available in the literature of retinal wave ontogeny during the first two postnatal weeks in mice.

The complete data set generated during the project is available for download for further analysis.
Exploitation Route We have pioneered a powerful approach to investigate neural networks in unprecedented detail.
Sectors Healthcare,Pharmaceuticals and Medical Biotechnology,Other

 
Description The 4,096-electrode recording platform was successfully installed at Newcastle and used to record retinal waves at near cellular resolution. The grant delivered a suite of analysis tools, including an improved spike detection method suited to the noise profile of densely integrated arrays, revealed previously unknown features of retinal waves (including slowly moving hot-spots of late stage waves), and produced a publicly available dataset.
Sector Digital/Communication/Information Technologies (including Software),Education,Other
Impact Types Societal,Economic

 
Description Crack It Challenge
Amount £1,100,000 (GBP)
Organisation National Centre for the Replacement, Refinement and Reduction of Animals in Research (NC3Rs) 
Sector Private
Country United Kingdom
Start 01/2017 
End 10/2021
 
Description FP7 FET NBIS
Amount € 2,200,000 (EUR)
Funding ID 600847 
Organisation European Commission 
Sector Public
Country European Union (EU)
Start 03/2013 
End 02/2016
 
Description MICA: An iPSC based screen for candidate pain modulating compounds
Amount £571,447 (GBP)
Funding ID MR/R011338/1 
Organisation Medical Research Council (MRC) 
Sector Academic/University
Country United Kingdom
Start 04/2018 
End 03/2021
 
Description Project Grant
Amount £274,437 (GBP)
Funding ID RPG-2016-315 
Organisation The Leverhulme Trust 
Sector Academic/University
Country United Kingdom
Start 04/2017 
End 09/2020
 
Title New method for spike sorting 
Description We recently developed a powerful new analytical tool to isolate signals originating from different neurones recorded with the high density system used in the original project 
Type Of Material Physiological assessment or outcome measure 
Year Produced 2017 
Provided To Others? Yes  
Impact It is too early to know how it will impact other researchers (our first paper came out this week, in Cell Reports). For our research group, this new tool has vastly improved the accuracy and quality of our data analysis 
URL https://github.com/martinosorb/herding-spikes
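The repository above implements the full method. As a loose illustrative sketch of the underlying idea on a high density array (localising each spike by the amplitude-weighted centre of mass of its footprint across neighbouring electrodes, then clustering those locations so that each cluster corresponds to one putative neurone), one might write the following; function names, the bandwidth value, and the mean-shift choice are illustrative assumptions, not the repository's exact algorithm:

```python
import numpy as np
from sklearn.cluster import MeanShift

def localise_spikes(amplitudes, electrode_xy):
    """Estimate a 2-D source location for each detected spike.

    amplitudes: (n_spikes, n_electrodes) positive peak amplitude of
                each spike on each nearby electrode.
    electrode_xy: (n_electrodes, 2) electrode coordinates in um.
    Returns the amplitude-weighted centre of mass per spike.
    """
    weights = amplitudes / amplitudes.sum(axis=1, keepdims=True)
    return weights @ electrode_xy

def sort_spikes(locations, bandwidth=20.0):
    """Cluster spike locations; each cluster ~ one putative neurone.

    bandwidth is in um; on a 42 um pitch array, ~half a pitch is a
    plausible (assumed) starting point.
    """
    return MeanShift(bandwidth=bandwidth).fit_predict(locations)
```

Clustering in space rather than in waveform-feature space exploits exactly what the high density array adds: each neurone's spikes leave a reproducible spatial footprint across many electrodes.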
 
Title new method for cell classification in the retina 
Description This new method enables classification of retinal ganglion cells into different functional classes without having to use very specific visual stimuli. It is based simply on the fact that cells belonging to the same class tend to fire with similar firing patterns, quantified as distances between the spike trains of pairs of cells. The method can easily be applied to different sensory modalities. 
Type Of Material Physiological assessment or outcome measure 
Year Produced 2018 
Provided To Others? Yes  
Impact It is a bit too early to say, as it was published in December 2018. It is drawing a lot of interest (judging from paper metrics on the Frontiers in Neuroscience website). 
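The published method rests on pairwise spike-train distances. As an illustrative sketch only (a simple correlation-based distance on binned spike counts rather than the authors' exact metric, with hypothetical function and parameter names), the pipeline of "pairwise distance matrix, then clustering into functional classes" could look like:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def classify_cells(spike_trains, duration, bin_ms=50.0, n_classes=2):
    """Group cells by firing-pattern similarity (illustrative sketch).

    spike_trains: list of 1-D arrays of spike times in seconds.
    Bins each train, uses 1 - correlation between binned trains as a
    pairwise distance, then cuts an average-linkage hierarchical
    clustering of that distance matrix into n_classes groups.
    """
    n_bins = int(np.ceil(duration / (bin_ms / 1000.0)))
    edges = np.linspace(0.0, duration, n_bins + 1)
    binned = np.array(
        [np.histogram(st, bins=edges)[0] for st in spike_trains],
        dtype=float)
    dist = 1.0 - np.corrcoef(binned)   # small for similar firing patterns
    np.fill_diagonal(dist, 0.0)
    condensed = squareform(dist, checks=False)
    return fcluster(linkage(condensed, method="average"),
                    t=n_classes, criterion="maxclust")
```

The appeal of this family of approaches, as the description notes, is that the distance is computed from spontaneous or arbitrary activity, so no stimulus tailored to each cell class is required.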
 
Title retinal waves data repository Gigascience 
Description database of retinal wave recordings from our laboratory, stored on the CARMEN repository for public sharing 
Type Of Material Database/Collection of data 
Year Produced 2014 
Provided To Others? Yes  
Impact Several computational scientists have accessed the data and used in their models/simulations 
 
Description APS recordings collaborations 
Organisation Italian Institute of Technology (Istituto Italiano di Tecnologia IIT)
Country Italy 
Sector Public 
PI Contribution we developed the use of the high density array system to record from the retina
Collaborator Contribution They contributed the technology: the high density system as well as some analytical software. At INRIA, they help us with the analysis of the very complex data generated with the system
Impact Multidisciplinary: neuroscience, electronic and software engineering. Outcomes: one paper, several conference proceedings, new funding, acquisition of the hardware at a great discount, and two more papers in the review process
Start Year 2012
 
Description APS recordings collaborations 
Organisation The National Institute for Research in Computer Science and Control (INRIA)
Country France 
Sector Public 
PI Contribution we developed the use of the high density array system to record from the retina
Collaborator Contribution They contributed the technology: the high density system as well as some analytical software. At INRIA, they help us with the analysis of the very complex data generated with the system
Impact Multidisciplinary: neuroscience, electronic and software engineering. Outcomes: one paper, several conference proceedings, new funding, acquisition of the hardware at a great discount, and two more papers in the review process
Start Year 2012
 
Description Giving support (talking to potential customers at company booth at conferences, providing example recordings for website) to 3Brain, the manufacturer of the recording device used in this project 
Form Of Engagement Activity A formal working group, expert panel or dialogue
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Industry/Business
Results and Impact 3Brain is the pioneer in high density multielectrode array technology, and I was the first to acquire their system commercially. I have since developed a very close relationship with them and regularly assist them at conferences, talking to potential customers from my biological, rather than technological, perspective. I also provided 3Brain with very high quality recording examples, which they showcase on their website.
Year(s) Of Engagement Activity 2011,2012,2013,2014,2015
URL http://www.3brain.com/home
 
Description Presenting the CARMEN project at various scientific/social gatherings 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Postgraduate students
Results and Impact The CARMEN project has been presented at special interest events, mostly as part of satellite activities at conferences.
Year(s) Of Engagement Activity 2014,2015,2016