A deep-network exploration of multi-modal spatial responses in the ferret brain

Lead Research Organisation: UNIVERSITY COLLEGE LONDON
Department Name: Cell and Developmental Biology

Abstract

The brain processes multi-modal sensory data from non-aligned reference frames,
such as retinotopic visual stimuli from the eye and tonotopic auditory stimuli from the
cochlea. These inputs contribute to the spatially constrained firing of hippocampal place
cells, which are thought to constitute a world-centred 'cognitive map'. While computational
models have explored visual data transformation into these maps, how auditory
information is processed remains unclear. Observations of world-centred spatial tuning
in auditory cortex suggest fundamental differences between vision and audition. This
project combines deep-network and computational modelling with analysis of
electrophysiological data from ferrets to study audio and audio-visual spatial
representation formation.
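
The reference-frame problem described above can be illustrated with a minimal sketch. This is an illustrative assumption, not the project's actual model: it reduces the transformation to a 2D bearing change, converting a head-centred (egocentric) stimulus direction, as the auditory system might initially encode it, into a world-centred (allocentric) direction of the kind a cognitive map would require, given the animal's head direction.

```python
def egocentric_to_allocentric(stimulus_bearing_deg: float,
                              head_direction_deg: float) -> float:
    """Convert a head-centred stimulus bearing to a world-centred bearing.

    stimulus_bearing_deg: direction of the stimulus relative to the head
        (0 = straight ahead, positive = clockwise).
    head_direction_deg: the animal's head direction in world coordinates.
    Returns the stimulus direction in world coordinates, wrapped to [0, 360).
    """
    return (stimulus_bearing_deg + head_direction_deg) % 360.0


# A sound 30 degrees to the animal's right while the head faces 90 degrees
# lies at 120 degrees in world coordinates.
print(egocentric_to_allocentric(30.0, 90.0))  # → 120.0
```

In practice the transformation is far richer (3D geometry, binaural cues, and learned network weights rather than a fixed formula), but this captures the core point: the same world location maps to different sensory coordinates whenever the head moves.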

Publications


Studentship Projects

Project Reference | Relationship | Related To   | Start      | End        | Student Name
BB/T008709/1      |              |              | 30/09/2020 | 29/09/2028 |
2870014           | Studentship  | BB/T008709/1 | 30/09/2023 | 29/09/2027 | Sihao Liu