Vision for the Future

Lead Research Organisation: University of Bristol
Department Name: Electrical and Electronic Engineering

Abstract

Approximately half the cortical matter in the human brain is involved in processing visual information, more than for all of the other senses combined. This reflects the importance of vision for function and survival, but also explains its role in entertaining us, training us and informing our decision-making processes. However, we still understand relatively little about visual processes in naturalistic environments, which is why this is such an important research area across such a broad range of applications.

Vision is important: YouTube video accounts for 25% of all internet traffic and, in the US, Netflix accounts for 33% of peak traffic; by 2016, video is predicted by Cisco to account for 54% of all traffic (86% if P2P video distribution is included), where total IP traffic is predicted to reach 1.3 zettabytes. Mobile network operators predict a 1000-fold increase in demand over the next 10 years, driven primarily by video traffic. At the other extreme, the mammalian eye is used by cheetahs to maintain stable locomotion over natural terrain at over 80 km/h, and by humans to thread a needle with sub-millimetre accuracy or to recognise subtle changes in facial expression. The mantis shrimp uses 12 colour channels (humans use only three) together with polarisation, and it possesses the fastest and most accurate strike in the animal kingdom.

Vision is thus central to the way animals interact with the world. A deeper understanding of the fundamental aspects of perception and visual processing in humans and animals, across the domains of immersion, movement and visual search, coupled with innovation in engineering solutions, is therefore essential in delivering future technology related to consumer, internet, robotic and environmental monitoring applications.

This project will conduct research across three interdisciplinary strands: Visual Immersion, Finding and Hiding Things, and Vision in Motion. These are key to understanding how humans interact with the visual world. By drawing on knowledge and closely coupled research across computer science, electronic engineering, psychology and biology we will deliver radically new approaches to, and solutions in, the design of vision based technology.

We recognise that it is critical to balance high-risk research with the coherence of the underlying programme. We will thus instigate a new sandpit approach to ideas generation, in which researchers can develop their own mini-projects. This will be aligned with a risk management process using peer review to ensure that the full potential of the grant is realised. The management team will, periodically and when needed, seek independent advice through a BVI Advisory Panel.

Our PDRAs will benefit in ways beyond those on conventional grants. They will for example be mentored to:
i) engage in ideas generation workshops, defining and delivering their own mini-projects within the programme;
ii) develop these into full proposals (grants or fellowships) if appropriate;
iii) undertake secondments to international collaborator organisations, enabling them to gain experience of different research cultures;
iv) lead the organisation of key events such as the BVI Young Researchers' Colloquium;
v) be trained as STEM ambassadors to engage in outreach activities and public engagement; and
vi) explore exploitation of their intellectual property.
Finally, we will closely link BVI's doctoral training activities to this grant, providing greater research leverage and experience of research supervision for our staff.

Planned Impact

Vision is central to the way humans interact with the world. A deeper understanding of the fundamental aspects of perception and visual processing in humans and animals will lead to innovation in engineering solutions. Our programme will therefore be instrumental in delivering future technology related to consumer, internet, robotic and environmental monitoring applications.

Through a closely coupled research programme across engineering, computer science, psychology and biology, this grant will deliver in each of these areas. Firstly, the research will be relevant to research communities across disciplines: it will benefit psychologists by generating realistic real-world scenarios, data sets and results that help us to understand the way humans interact with the visual world; it will benefit biologists by providing visual models for understanding the evolution and ecology of vision; and it will benefit engineers and computer scientists by providing radically new approaches to solving technology problems.

The research in Visual Immersion will be of great commercial significance to the ICT community in terms of future video acquisition formats, new compression methods, new quality assessment methods and immersive measurements. This will inform the future of immersive consumer products - 'beyond 3D'. In particular, the project will deliver an understanding of the complex interactions between video parameters in delivering a more immersive visual experience. This will be relevant not only to entertainment, but also to visual analytics, surveillance and healthcare. Our results are likely to inform future international activity in video format standardisation in film, broadcast and internet delivery, moving thinking from 'end-to-end solutions' to the 'creative continuum', where content creation, production, delivery, display, consumption and quality assessment are all intimately interrelated. Our work will also help us to understand how humans interact with complex environments, or are distracted by environmental changes - leading to better design of interfaces for task-based operations and hence improved situational awareness.

In terms of Finding and Hiding Things, impact will be created in areas such as visual camouflage patterns, offering a principled design framework that takes account of environmental factors and mission characteristics. It will also provide enhanced means of detecting difficult targets, through better understanding of the interactions between task and environment. It will provide benefits in application areas such as situational awareness and stealthy operation - highly relevant to surveillance applications. The work will also contribute in related areas such as the environmental visual impact of entities such as windfarms, buildings or pylons, making the research relevant to energy providers and civil engineers. Finally, visual interaction with complex scenes is a key enabler for the 'internet of things'.

In the case of Vision in Motion, the research will deliver impact in the design of truly autonomous machines, exploiting our understanding of the way in which animals and humans adapt to the environment. The beneficiaries in this case will be organisations in the commercial, domestic and surveillance robotics or UAV sectors. Furthermore, understanding the interactions between motion and camouflage has widespread relevance to environmental applications and to anomaly detection. Through a better understanding of the effects of motion we can design improved visual acquisition methods, better consumer interfaces, displays and content formats. This will be of broad benefit across the ICT sector, with particular relevance to designers of visual interfaces and to content providers in the entertainment sector. Furthermore, the research will benefit those working in healthcare - for example in rehabilitation, or in the design of point-of-care systems incorporating exocentric vision systems.
 
Description i) Spatio-temporal resampling combined with superresolution upsampling as a basis for perceptual video compression; ii) the benefits of using polarization imagery for feature extraction; iii) how video content features can be used to predict rate-quality performance; iv) new methods for B-line extraction from lung ultrasound; v) the use of polarization vision in humans as an indicator of Age-Related Macular Degeneration; vi) the limits of temporal resolution in high frame rate video acquisition.
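The first finding above - resampling plus superresolution as a basis for perceptual compression - can be illustrated with a minimal sketch. A real system (such as the ViSTRA codec mentioned in this record) would use a learned superresolution network and a perceptual decision metric; here, plain block-averaging and nearest-neighbour upsampling stand in for both, and all function names are illustrative, not taken from the project's software.

```python
# Sketch of resolution-resampling-based compression: on smooth content,
# encode a downsampled frame and reconstruct at full resolution with an
# upsampler, saving bits with little perceptual loss.

def downsample2x(frame):
    """Average non-overlapping 2x2 blocks (simulates pre-encoding downsampling)."""
    h, w = len(frame), len(frame[0])
    return [[(frame[y][x] + frame[y][x + 1] + frame[y + 1][x] + frame[y + 1][x + 1]) / 4.0
             for x in range(0, w, 2)] for y in range(0, h, 2)]

def upsample2x(small):
    """Nearest-neighbour upsampling (stand-in for superresolution at the decoder)."""
    return [[small[y // 2][x // 2] for x in range(2 * len(small[0]))]
            for y in range(2 * len(small))]

def mse(a, b):
    """Mean squared error between two equal-sized frames."""
    n = len(a) * len(a[0])
    return sum((pa - pb) ** 2 for ra, rb in zip(a, b) for pa, pb in zip(ra, rb)) / n

# A smooth 4x4 test "frame": a gradient, where resampling loses little,
# so a resampling-based codec would choose the low-resolution path.
frame = [[float(x + y) for x in range(4)] for y in range(4)]
recon = upsample2x(downsample2x(frame))
print(len(recon), len(recon[0]))    # reconstructed at the original 4x4 size
print(mse(frame, recon))            # small residual error on smooth content
```

In a practical codec the decision to downsample is itself content-adaptive, driven by features that predict rate-quality performance (finding iii above).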
Exploitation Route The ViSTRA codec has been submitted for consideration by MPEG; polarisation vision in AMD is the basis for the start-up Azul Optics.
Sectors Aerospace, Defence and Marine; Creative Economy; Digital/Communication/Information Technologies (including Software); Education; Healthcare; Leisure Activities, including Sports, Recreation and Tourism; Manufacturing, including Industrial Biotechnology; Retail; Security and Diplomacy; Transport

 
Description Exploited in spin-out Azul Optics
First Year Of Impact 2016
Sector Healthcare
Impact Types Societal, Economic

 
Description BBSRC / RSE Innovation Fellowship
Amount £45,000 (GBP)
Organisation Biotechnology and Biological Sciences Research Council (BBSRC) 
Sector Public
Country United Kingdom
Start 04/2016 
End 04/2017
 
Description EPSRC IAA Immersive Measurements
Amount £20,000 (GBP)
Organisation University of Bristol 
Sector Academic/University
Country United Kingdom
Start 08/2017 
End 12/2017
 
Description EPSRC IAA ViSTRA
Amount £20,000 (GBP)
Organisation University of Bristol 
Sector Academic/University
Country United Kingdom
Start 12/2017 
End 10/2018
 
Description FTMA2
Amount £15,000 (GBP)
Organisation University of Bristol 
Sector Academic/University
Country United Kingdom
Start 06/2019 
End 09/2019
 
Description Impact Acceleration
Amount £150,000 (GBP)
Organisation University of Bristol 
Sector Academic/University
Country United Kingdom
Start 09/2015 
End 09/2016
 
Description Leverhulme early career fellowship - A. Katsenou
Amount £90,000 (GBP)
Organisation The Leverhulme Trust 
Sector Academic/University
Country United Kingdom
Start 03/2018 
End 02/2021
 
Description Netflix
Amount £50,000 (GBP)
Organisation Netflix, Inc. 
Start 03/2018 
End 03/2019
 
Description YouTube Faculty Research Award
Amount £40,000 (GBP)
Organisation YouTube 
Sector Private
Country United States
Start 07/2017 
End 09/2020
 
Title BVI High Frame Rate database 
Description Collection of high frame rate clips with associated metadata for testing and developing future immersive video formats 
Type Of Material Database/Collection of data 
Year Produced 2015 
Provided To Others? Yes  
Impact None at present 
URL http://data.bris.ac.uk/data/dataset/k8bfn0qsj9fs1rwnc2x75z6t7
 
Title BVI Texture database 
Description Collection of static and dynamic video textures for compression testing 
Type Of Material Database/Collection of data 
Year Produced 2015 
Provided To Others? Yes  
Impact Used by several groups around the world 
URL http://data.bris.ac.uk/datasets/1if54ya4xpph81fbo1gkpk5kk4/
 
Title HomTex 
Description A database of homogeneous texture video clips 
Type Of Material Database/Collection of data 
Year Produced 2017 
Provided To Others? No  
Impact Led to feature-based content coding methods and the secondment of Mariana Afonso and Felix Mercer Moss to Netflix. Contributed to a new strategic relationship with Netflix. 
URL https://data.bris.ac.uk/data/dataset/1h2kpxmxdhccf1gbi2pmvga6qp
 
Description BBC Immersive Technology Laboratory 
Organisation British Broadcasting Corporation (BBC)
Department BBC Research & Development
Country United Kingdom 
Sector Public 
PI Contribution High dynamic range coding optimisation for HEVC; perceptual video compression results; REDUX database analytics
Collaborator Contribution Provision of REDUX; support for PhD students; collaboration on perceptual quantisation; secondment of BBC employees
Impact New method of perceptual quantisation for HDR HEVC; analysis of BBC archive in terms of feature classification
Start Year 2012
 
Description BBC Strategic Research Partnership in Data Science 
Organisation British Broadcasting Corporation (BBC)
Department BBC Research & Development
Country United Kingdom 
Sector Public 
PI Contribution Partner in DSRP with BBC - contributing academic inputs, collaborative research and steering board membership.
Collaborator Contribution Currently 2 x iCASE awards to Bull and Gilchrist; data sets and expertise.
Impact 2 x iCASE awards to Bull and Gilchrist; engagement in programme grant submission; first-ever evaluation of the added value of video format over narrative.
Start Year 2017
 
Description Google Faculty Research Award 
Organisation YouTube
Country United States 
Sector Private 
PI Contribution Development and enhancement of the ViSTRA codec: intelligent and perceptual resampling and superresolution.
Collaborator Contribution Financial contribution.
Impact ViSTRA patent application and MPEG submission
Start Year 2017
 
Description Immersive Assessments 
Organisation Aarhus University
Country Denmark 
Sector Academic/University 
PI Contribution Collaboration with Aarhus Univ on development of Immersive assessment methods.
Collaborator Contribution Ongoing collaboration
Impact None yet - ongoing
Start Year 2016
 
Description Netflix collaboration 
Organisation Netflix, Inc.
PI Contribution Video codec research, perceptual metrics and dynamic optimisation
Collaborator Contribution Data set access, shared resources and expertise.
Impact Characterisation and enhancement of perceptual metrics; performance comparisons of AV1 vs HEVC.
Start Year 2018
 
Title Video Processing Method (ViSTRA) 
Description Optimisation of a video codec using in-loop perceptual metrics and superresolution upscaling 
IP Reference P123219GB 
Protection Patent application published
Year Protection Granted
Licensed No
Impact Submitted to MPEG Beyond HEVC
 
Company Name Azul Optics 
Description Exploiting polarisation vision in humans to detect Age Related Macular Degeneration. Established by BVI Platform Grant researcher Shelby Temple based on work partially completed under the grant. 
Year Established 2016 
Impact None yet - product under development
Website http://azuloptics.com/
 
Description AHRC Beyond Conference Keynote 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Policymakers/politicians
Results and Impact Keynote Lecture at AHRC Beyond Conference to launch the Creative Industries ISCF collaboration.
Year(s) Of Engagement Activity 2018
 
Description Keynote lecture at conference 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Postgraduate students
Results and Impact Invited keynote lecture, Chinese Ornithological Congress, Xian, China, 22-25 September, 2015. "What camouflage tells us about avian perception and cognition"
Year(s) Of Engagement Activity 2017
 
Description Keynote: EPSRC VIHM Workshop 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact Keynote lecture EPSRC Vision in Humans and Machines Workshop Bath 2016
Year(s) Of Engagement Activity 2016
 
Description Keynote: IET ISP 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact Keynote Lecture IET ISP- Perceptual Video coding
Year(s) Of Engagement Activity 2015
 
Description Public engagement activity - Festival of Nature 2017 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach Regional
Primary Audience Public/other audiences
Results and Impact "Nature expert" at the 2017 Festival of Nature, a 2-day free public event organised by the Bristol Natural History Consortium (http://www.bnhc.org.uk/festival-of-nature/). I took part in "Nature Roulette", talking about animal coloration.
Year(s) Of Engagement Activity 2017
URL http://www.bnhc.org.uk/nature-roulette-will-meet/
 
Description Talk at local school 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach Local
Primary Audience Schools
Results and Impact Talk to GCSE and lower 6th form students on animal camouflage, followed by presentation and discussion on careers in biology.
Year(s) Of Engagement Activity 2017
 
Description Talk on animal defensive coloration at the University of Groningen, The Netherlands 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Postgraduate students
Results and Impact Invited research talk to graduate students, undergraduates and postdocs at the School of Life Sciences, University of Groningen, The Netherlands
Year(s) Of Engagement Activity 2018