Rank based spectral estimation

Lead Research Organisation: University College London
Department Name: Institute of Ophthalmology

Abstract

The colours, or RGB pixels, recorded by a digital camera are the result of the prevailing light in the scene striking and being reflected by objects, combined with the characteristics of the camera itself. The complexity is such that different cameras see differently and no camera sees the world exactly as we do. You will have noticed this when looking at photos where sometimes the colours don't look right, or the pictures captured by one camera look 'better' than another's. Moreover, sometimes we see colours change dramatically. We have all probably observed that white clothes can look bluish under ultraviolet light (say, in a night club). But, in fact, the colours we see change subtly, all the time, as we move from one light to another (which is why it is always a good idea to check the colour of your clothes outside the shop). In imaging, even small changes can lead to poor customer satisfaction or, potentially, to the wrong diagnosis in a medical application.

Good pictures, by which we mean accurate 'colour measurement', are possible if we know the spectral colour characteristics of a camera and/or the spectrum of the light in a scene. While we can, in principle, measure these quantities, the measurement is neither easy nor cheap: it requires considerable (physics) lab time, and spectral measurement devices cost many thousands of pounds. When measurement is not feasible, methods do exist for estimating (say) the spectrum of the light in a scene. Yet these methods only tend to work if the camera is accurately calibrated first (a chicken-and-egg situation). Our 'Rank Based Spectral Estimation' project aims to make it much easier to calibrate a camera or measure the illuminant in situ (and, as such, to make it easier to measure reflectance too).

So, how does our method work? Well, suppose we gave you 50 grey tiles, all of which appeared to have a different brightness. It would be an easy task for you to rank them from darkest to brightest. But now suppose we change the colour of the light. Depending on the spectral shape of the grey reflectances, the ranking order can change (sometimes considerably). No problem: it is a simple matter to reorder the tiles. Remarkably, for specially chosen reflectances, the rank order will strongly correlate with the spectral shape of the light. Thus a simple ranking experiment gives us a strong clue to the colour of the light. (And, if we knew the colour of the light, we could, for example, predict whether the colour of our clothes might change when we go outdoors.)
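The ranking idea above can be sketched in a few lines of code. Everything here is illustrative: the reflectance spectra, the two candidate lights and the single broadband sensor are made-up stand-ins, not the project's actual target design. The point is only that the rank order of sensor responses depends on the illuminant, so an observed ranking can be matched against the rankings each candidate light would predict.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy spectral world: 31 wavelength samples (nominally 400-700 nm, 10 nm steps).
n_wavelengths = 31
n_tiles = 50

# Hypothetical reflectance spectra for 50 "grey" tiles (random, illustrative only).
reflectances = np.abs(rng.normal(0.5, 0.15, (n_tiles, n_wavelengths)))

# Two candidate illuminants: a flat light and a "bluer" light
# with more power at short wavelengths.
candidates = {
    "flat": np.ones(n_wavelengths),
    "blue": np.linspace(2.0, 0.5, n_wavelengths),
}

# A single broadband sensor (e.g. a camera luminance channel), assumed flat here.
sensor = np.ones(n_wavelengths)

def ranking(light):
    # Sensor response to each tile = sum over wavelengths of
    # light * reflectance * sensor; return tile order, darkest to brightest.
    responses = reflectances @ (light * sensor)
    return np.argsort(responses)

# The rank order under the unknown light is the observable "clue"
# (here we simulate it with the blue light).
observed = ranking(candidates["blue"])

# Identify the light: pick the candidate whose predicted ranking
# agrees with the observed ranking in the most positions.
best = max(candidates, key=lambda name: np.sum(ranking(candidates[name]) == observed))
print(best)
```

With these toy spectra the two candidates predict noticeably different tile orderings, so the observed ranking alone is enough to pick out the bluer light.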

The Rank Based Spectral Estimation project aims to take this simple ranking idea and provide simple, and accurate, estimation tools for deriving the spectral shape of the prevailing light, the spectral characteristics of a camera and the spectral reflectances of surfaces. At the heart of our method is a specially designed reflectance target containing many reflectances (whose design is part of the proposed research). Ranking these reflectances will allow us to accurately estimate the light spectrum and the spectral attributes of a camera. Accurate spectral estimates are required in many applications, from photography, through visual inspection, to forensic imaging and telepresence (e.g. remote diagnosis).

Remarkably, we believe the methods we develop will also prove useful in understanding how we see. Indeed, it is very likely that you see the world a little differently from me. Yet estimating an individual's spectral response is notoriously difficult. To the extent it can be done at all, it requires many hours of (tedious) detailed visual experiments. Through ranking, it will be possible to uncover an observer's spectral response (technically called 'colour matching curves') quickly and simply: we just ask the observer to carry out a ranking of the kind described above.

Publications

 
Description Human colour perception depends initially on the responses of the long(L-), middle(M-) and short(S-) wavelength-sensitive cones. These signals are then transformed post-receptorally into cone-opponent (L-M and S-(L+M)) and colour-opponent (red/green and blue/yellow) signals and perhaps at some later stage into categorical colour signals.
Here, we investigate the transformations from the cone spectral sensitivities to the hypothetical internal representations of 8 colour categories by applying a novel technique known as "Rank-Based Spectral Estimation". Stimuli comprised circular patches of 32 colours, presented on a CRT monitor and chosen to cover as large a volume of LMS colour space as possible. In separate blocks, 12 observers judged which of a pair of colours appeared more representative of one of eight colour categories: red, green, blue, yellow, pink, purple, brown or orange. Pairs for which no judgement could be made, because neither colour appeared like the reference, were recorded but not used.
To derive the spectral sensitivities of the colour categories (the 8 "colour sensors") using the rank-based technique, we assumed that the relationship between cone responses and colour appearance can be described by a linear transform followed by a rank-preserving non-linearity. The estimated sensor transformations could account for over 85% of the rank orders. Sensor shapes were generally plausible; those for red and green were consistent across observers, while the yellow and blue ones showed more variability in spectral position. Other sensors, such as brown, pink and purple, showed large inter-observer variability, which might be due in part to cultural differences in colour naming. Sensors were generally restricted to limited regions of colour space. As expected from colour opponent theory, the red and green sensors formed relatively distinct regions with limited overlap as did the yellow and blue ones. Other sensors were spectrally shifted or bimodal.
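The fitting step described above can be illustrated with a toy version. Nothing here uses the study's actual data: the LMS values, the hidden opponent-like 'red' weighting and the crude random-search fit are hypothetical stand-ins for the real stimuli and optimisation. The sketch shows the key property exploited by the method: since only pairwise rank orders are scored, any rank-preserving non-linearity applied after the linear transform leaves the fit unchanged.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical LMS cone responses for 32 stimuli (illustrative values only).
lms = rng.random((32, 3))

# A hidden "red sensor" generates the observed pairwise judgements
# (an L-M opponent-like weighting, assumed for this demo).
true_w = np.array([1.0, -0.8, 0.1])
scores = lms @ true_w
pairs = [(i, j) for i in range(32) for j in range(i + 1, 32)]
judgements = [scores[i] > scores[j] for (i, j) in pairs]

def rank_agreement(w):
    # Fraction of observed pairwise rank orders reproduced by weights w.
    s = lms @ w
    correct = sum((s[i] > s[j]) == judged for (i, j), judged in zip(pairs, judgements))
    return correct / len(pairs)

# Crude random search over unit weight vectors: the best one is the
# linear "sensor" that explains the most rank orders.
best_w, best_score = None, 0.0
for _ in range(2000):
    w = rng.normal(size=3)
    w /= np.linalg.norm(w)
    score = rank_agreement(w)
    if score > best_score:
        best_w, best_score = w, score

print(f"rank orders explained: {best_score:.0%}")
```

In this noiseless toy the recovered weights explain nearly all rank orders; with real observer judgements the attainable fraction is lower, as in the 85% figure reported above.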
Exploitation Route We are still writing up and modelling these data. One fruitful new area of research will be to vary colours in the surround or on a background to see how they alter colour sensors operating on a central patch of colour.
Sectors Other

 
Title CVRL database 
Description This web resource provides an annotated database of downloadable standard functions and data sets relevant to colour and vision research and to colour technology, as well as providing information about the research outputs of our group. Updated frequently. 
Type Of Material Database/Collection of data 
Year Produced 2006 
Provided To Others? Yes  
Impact Widely used in science and industry, the site started at UC San Diego in 1995 and moved to UCL with the PI in 2001. 
URL http://www.cvrl.org
 
Description BBC World Service, CrowdScience participant. 
Form Of Engagement Activity A broadcast e.g. TV/radio/film/podcast (other than news/press)
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Public/other audiences
Results and Impact CrowdScience participant as an expert on colour vision.
Year(s) Of Engagement Activity 2018
 
Description PI was chair and co-chair of the Colour Group GB 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Public/other audiences
Results and Impact The Colour Group GB organizes public meetings, school lectures and events on the broad topic of colour.

Wider interest and appreciation of the scientific and artistic aspects of colour.
Year(s) Of Engagement Activity 2009,2010,2011,2012,2013,2014
URL http://www.colour.org.uk
 
Description Participation in Bloomsbury 2020 Arts festival and production of online visual illusions. Replayed 2021 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Public/other audiences
Results and Impact Production of three visual illusion films for the festival, presented online because of COVID-19.
Year(s) Of Engagement Activity 2020,2021
URL https://bloomsburyfestival.org.uk/2020vision/
 
Description Public lecture on Human Colour Vision IOP Canterbury 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach Regional
Primary Audience Professional Practitioners
Results and Impact Public lecture on Human Colour Vision IOP Canterbury
Year(s) Of Engagement Activity 2016
URL http://www.iop.org
 
Description Public lecture on Human Colour Vision IOP London 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach Regional
Primary Audience Professional Practitioners
Results and Impact Colour Vision
Public lecture at the Institute of Physics London
Year(s) Of Engagement Activity 2016
URL http://www.iop.org/
 
Description Public lecture on Human Colour Vision IOP Open University 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach Regional
Primary Audience Professional Practitioners
Results and Impact Colour vision
Invited public lecture, Institute of Physics, Open University, Milton Keynes
Year(s) Of Engagement Activity 2016
URL http://www.iop.org/
 
Description Short course instructor, 26th Color and Imaging Conference, Vancouver. 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact Class and workshop in colour and colour vision for people in Colour and imaging. Important for forging links with industry.
Year(s) Of Engagement Activity 2018
 
Description TEDxUAL talk on Color Vision 
Form Of Engagement Activity A broadcast e.g. TV/radio/film/podcast (other than news/press)
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Media (as a channel to the public)
Results and Impact TEDxUAL speaker, University of the Arts London. Online.
Year(s) Of Engagement Activity 2016
URL http://www.tedxual.com/