Non-canonical binocular pathways in human vision

Lead Research Organisation: University of York
Department Name: Psychology

Abstract

The human brain combines information from many sources, including across our two forward-facing eyes with overlapping visual fields. This results in a single image of the external world, and provides us with stereo (3D) vision. But the brain also combines information across the eyes for several other reasons. One example is the response of our pupils to changes in light levels - the brain must use light levels from both eyes to decide how much to dilate or constrict the pupils. This happens in a completely different network of brain regions from those involved in perception. Separately, studies in animals have found a direct pathway to motion-sensitive regions of the brain, which may govern automatic eye movements to fast-moving objects. Although we know about the anatomy of these other binocular pathways, we understand much less about precisely what they are doing. This project aims to understand the computations involved, and to compare them to those in the perceptual pathway, which are better understood, and for which mathematical models already exist.

To achieve this, we will perform experiments in human volunteers. Our first study will focus on the pathway that governs pupil diameter. We can measure how the pupils constrict and dilate in response to flickering lights by using an eye-tracker. These responses can be compared to the responses of the perceptual parts of the brain, which we measure at the same time using EEG (a technique that records electrical brain activity at the scalp). Our visual system involves multiple 'colour channels', which are fed by cells in the eye that respond to different wavelengths of light. Some of these appear to be more important than others in determining the pupil response, so we will compare the characteristics of binocular combination across the different channels. We will also look at whether the channels interact across the eyes: for example, if we show light of different wavelengths to the left and right eyes, how does activity in the different colour channels interact?
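The logic of reading out eye-specific contributions from a flicker response can be sketched in a few lines. This is an illustrative simulation only; the sample rate, tagging frequencies, and amplitudes below are assumptions, not project parameters:

```python
import numpy as np

# Sketch: each eye is 'tagged' with a different flicker frequency, so each
# eye's contribution to the pupil trace can be read out from the amplitude
# spectrum at its tagging frequency.
fs = 50.0                      # hypothetical pupillometry sample rate (Hz)
t = np.arange(0, 12, 1 / fs)   # 12 s trace
f_left, f_right = 1.0, 1.5     # hypothetical tagging frequencies (Hz)

rng = np.random.default_rng(0)
pupil = (0.30 * np.sin(2 * np.pi * f_left * t)    # left-eye response
         + 0.15 * np.sin(2 * np.pi * f_right * t)  # right-eye response
         + 0.05 * rng.standard_normal(t.size))     # measurement noise

# Single-sided amplitude spectrum; tagged frequencies fall on exact FFT bins.
spectrum = np.abs(np.fft.rfft(pupil)) / t.size * 2
freqs = np.fft.rfftfreq(t.size, 1 / fs)

amp_left = spectrum[np.argmin(np.abs(freqs - f_left))]
amp_right = spectrum[np.argmin(np.abs(freqs - f_right))]
```

The recovered amplitudes match the simulated per-eye responses, showing how a single physiological trace can separate the two eyes' inputs in the frequency domain.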

Next, we will investigate the operation of the pathway that governs automatic eye movements to moving objects. The anatomical pathways involved may be directly activated by specific wavelengths of light (blue light, which activates the S-cone pathway). By measuring eye movements in response to fast-moving stimuli presented to one or both eyes, targeted towards a particular colour pathway, we can directly assess the contribution of different channels to this computation. We will also measure brain activity directly, again using EEG, to help us understand the time course of the neural operations that govern binocular eye movement planning. In addition, we will repeat some of the above experiments in patients with amblyopia, a disorder of binocular vision in which one eye contributes much less to vision than the other. Although we know much about the consequences of amblyopia for perception, it may be that other binocular pathways remain unaffected. Understanding this will aid the development of treatments for this condition in the future.

A final study will use state-of-the-art brain scanners (MRI and MEG) to measure the response of the brain to images shown to either one or two eyes, using a 3D projector. In the perceptual regions of the brain, the increased response when both eyes are open is balanced by a process of suppression between the eyes. This means that the brain activity is about the same whether one or both eyes see the stimulus, consistent with our everyday observation that the world does not change in appearance when two eyes are open compared to one. But we suspect that in the other binocular pathways, there might be a much bigger increase when both eyes are stimulated. So, we will look for brain regions that give a larger response to binocular stimulation than to monocular stimulation. We will do this for a range of different stimuli that are designed to target specific pathways (e.g. the motion pathway).

Technical Summary

Binocular combination occurs in primary visual cortex (V1), and results in a cyclopean percept of the external world. But the brain also combines information across the eyes in other anatomically distinct pathways that bypass the canonical V1 route, and which govern other functions including pupil diameter and responses to rapid motion. Although the anatomical connections are known, the functional properties of these non-canonical binocular pathways have not been well established. This project will use an interdisciplinary, multimodal approach to investigate the computational algorithms involved in non-canonical binocular signal combination. We will use fMRI and MEG to compare monocular and binocular responses across the whole brain. In V1, gain control processes balance excitation and inhibition, and binocular and monocular responses are equal. But in the non-canonical pathways, we anticipate much greater binocular responses, in some areas more than twice the monocular response. Next, we will use a combination of pupillometry and steady-state EEG to measure the functional properties of the subcortical pathway that governs pupil diameter, and compare this to cortical responses. By targeting specific photoreceptor classes (e.g. melanopsin, S-cones) we can investigate binocular interactions within and between pathways. We will also explore the contributions of different chromatic pathways to reflexive eye movements, because a koniocellular pathway projects directly to MT and is not constrained by the combination rules of V1. Finally, we will test individuals with amblyopia on a subset of experimental paradigms to determine whether binocular combination remains intact in non-canonical binocular pathways; this may be key to developing effective treatments for this condition. All experiments will be informed by the use of contemporary computational models of binocular vision, which provide a single point of comparison across distinct anatomical pathways.
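The V1 gain-control account can be illustrated with a minimal divisive-normalisation sketch (parameter values are illustrative, not fitted to data): each eye's excitatory drive is normalised by the summed input from both eyes, so adding the second eye barely changes the output.

```python
# Minimal sketch of divisive gain control in binocular combination.
# The exponent m and saturation constant s are illustrative assumptions.
def binocular_response(c_left, c_right, m=1.3, s=0.05):
    drive = c_left**m + c_right**m      # summed excitation across eyes
    return drive / (s + c_left + c_right)  # interocular suppression

mono = binocular_response(0.5, 0.0)   # one eye stimulated
bino = binocular_response(0.5, 0.5)   # both eyes stimulated
```

Here the binocular response exceeds the monocular one by only a few percent rather than doubling, capturing the "ocularity invariance" described above; the non-canonical pathways are predicted to lack this suppression and show much larger binocular facilitation.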
 
Title Binocular facilitation of the BOLD response to melanopsin stimulation in the suprachiasmatic area 
Description In a recent analysis of archival data, Spitschan and Cajochen (2019) identify what appears to be substantial binocular facilitation of melatonin suppression due to melanopic light stimulation. This putative effect likely originates in the melanopsin-containing intrinsically photosensitive retinal ganglion cells (ipRGCs) which project directly to the suprachiasmatic nucleus (SCN) of the hypothalamus. We asked whether we could measure a direct physiological correlate of this binocular facilitation using a binocular, MRI-compatible, 10-primary spectral stimulation device. We present preliminary findings from a functional magnetic resonance imaging (fMRI) study designed to explore the blood oxygen level dependent (BOLD) response to monocular and binocular melanopic light stimulation. The study used a 30 s on/off design with three 'ocularity' conditions (binocular-low, monocular-high, binocular-high) and two classes of targeted photoreceptors (melanopsin and LMS cones). Throughout each scan, subjects (N=18) also responded to brief, cone-directed sinusoidal modulations of varying intensity. We report that binocular vs. monocular melanopsin stimulation induced significant BOLD activation in SCN but that this effect was not seen for cone-directed stimulation. This is consistent with the binocular facilitation effect described by Spitschan and Cajochen (2019) and provides the first direct evidence of melanopsin-driven activation and binocular facilitation in human subcortical nuclei. 
Type Of Art Film/Video/Animation 
Year Produced 2023 
URL https://figshare.com/articles/poster/Binocular_facilitation_of_the_BOLD_response_to_melanopsin_stimu...
 
Title PySilSub: a toolbox for silent substitution
Description A normal human retina contains several classes of photosensitive cells: rods for low-light vision, three types of cones for daylight vision, and the intrinsically photosensitive retinal ganglion cells (ipRGCs) expressing melanopsin for controlling non-image-forming functions (e.g., pupil size, circadian rhythms). The spectral sensitivities of the photoreceptors overlap significantly, meaning most lights will stimulate all photoreceptors, but to varying degrees. The method of silent substitution (Estévez & Spekreijse, 1982, Vision Research, 22[6], 681-691) provides a principled basis for stimulating individual photoreceptor classes selectively, which is useful in research and clinical settings. The main hardware requirement for silent substitution is a spectrally calibrated light stimulation system with at least as many primaries as there are photoreceptors under consideration. Device settings that will produce lights to selectively stimulate the photoreceptor(s) of interest can be found using a variety of analytic and algorithmic approaches. Here we present PySilSub, a novel Python package for silent substitution featuring object-oriented support for individual colorimetric observer models, multi-primary stimulation devices, and solving silent substitution problems with linear algebra and constrained numerical optimisation. The software is registered with the Python Package Index (pip install pysilsub) and includes example data sets from various multi-primary systems. We hope that PySilSub will further encourage the application of silent substitution in research and clinical settings.
Type Of Art Film/Video/Animation 
Year Produced 2022 
URL https://figshare.com/articles/poster/PySilsub_a_toolbox_for_silent_substitution/21711830/2
 
 
Description We developed a greater understanding of how the human brain combines signals from the left and right eyes. In particular, we focussed on brain areas that are not traditionally associated with binocular combination, including the pathway responsible for controlling the pupil, and those associated with circadian responses to light. By carefully controlling the inputs, and precisely measuring responses, we were able to derive accurate computational models of the relevant processes.
Exploitation Route The primary application of this work is in the development of novel near-eye display systems, including virtual reality headsets and smart glasses. Building such systems is technically challenging, and requires detailed knowledge of human binocular vision. The computational models we developed in this work (and in previous UKRI-funded projects) are currently being applied in industrial settings.

We also developed a toolbox for solving silent substitution problems, which is freely available and we anticipate will be widely used in research on colour science and circadian rhythms.
Sectors Digital/Communication/Information Technologies (including Software)

Healthcare

 
Description BBSRC IAA: Imaging human circadian rhythm networks
Amount £9,500 (GBP)
Organisation Biotechnology and Biological Sciences Research Council (BBSRC) 
Sector Public
Country United Kingdom
Start 06/2022 
End 07/2023
 
Title PySilSub toolbox 
Description PySilSub is a Python toolbox for performing the method of silent substitution in vision and nonvisual photoreception research. With PySilSub, observer- and device-specific solutions to silent substitution problems are found with linear algebra or numerical optimisation via a configurable, intuitive interface. 
Type Of Technology Software 
Year Produced 2023 
Open Source License? Yes  
Impact The toolbox is described in the following output: Martin, J.T., Boynton, G.M., Baker, D.H., Wade, A.R. & Spitschan, M. (2023). PySilSub: An open-source Python toolbox for implementing the method of silent substitution in vision and nonvisual photoreception research, Journal of Vision, 23(7): 10, 1-16. 
URL https://github.com/PySilentSubstitution/pysilsub
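The linear-algebra core of a silent substitution problem can be sketched as follows. The sensitivity matrix below uses toy numbers, not a real device calibration, and this sketch bypasses the PySilSub API itself, which adds observer models, device classes, gamut constraints, and numerical optimisation on top of this idea:

```python
import numpy as np

# Rows: photoreceptor classes (L, M, S cones, melanopsin); columns: device
# primaries. Each entry is a primary's effective drive to a receptor class.
# Values are illustrative only.
A = np.array([
    [0.10, 0.50, 0.90, 0.30, 0.20],   # L cones
    [0.20, 0.70, 0.60, 0.40, 0.10],   # M cones
    [0.80, 0.30, 0.05, 0.10, 0.05],   # S cones
    [0.30, 0.60, 0.40, 0.90, 0.15],   # melanopsin
])

# Target: 20% melanopsin contrast while all three cone classes are silenced.
target = np.array([0.0, 0.0, 0.0, 0.2])

# With more primaries (5) than receptor classes (4), least squares returns
# the minimum-norm primary modulation satisfying the constraints exactly.
x, *_ = np.linalg.lstsq(A, target, rcond=None)
```

The resulting modulation `x` drives melanopsin at the target contrast while leaving the cones unstimulated; in practice the solution must also respect device gamut limits, which is where constrained optimisation comes in.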
 
Description CrowdScience episode broadcast on the BBC world service 
Form Of Engagement Activity A press release, press conference or response to a media enquiry/interview
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Public/other audiences
Results and Impact Dr Lauren Welbourne, employed as a postdoctoral researcher on the project, was interviewed for an episode entitled "Do we all see the same colour?".
Year(s) Of Engagement Activity 2024
URL https://www.bbc.co.uk/programmes/w3ct4y5h
 
Description Soapbox science, "Two eyes, one image: how do our brains do it?" 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach Local
Primary Audience Public/other audiences
Results and Impact Postdoc Annie Morsi presented at a Soapbox Science event in York on 8th June 2024, raising public awareness of the research project.
Year(s) Of Engagement Activity 2024