Computational Lighting in Video
Lead Research Organisation:
University of Surrey
Department Name: Vision, Speech and Signal Processing (CVSSP)
Abstract
Computational Lighting in Video. This project will explore how advances in generative networks and deep-learning approaches to beautification and stylistic transfer can be combined to artificially recreate the appearance of professional lighting. The system will use recordings of a scene from one or more viewpoints under an unknown lighting setup, and will generate a video of the same scene that a viewer would judge to have been recorded under three-point lighting. The developed technology is of interest as a potential extension to the capabilities of the Ed system currently being developed by BBC R&D.
Studentship Projects
Project Reference | Relationship | Related To | Start | End | Student Name
---|---|---|---|---|---
EP/T517616/1 | | | 01/10/2019 | 30/09/2025 |
2455617 | Studentship | EP/T517616/1 | 01/09/2020 | 30/09/2024 | Nikolina Kubiak
Description | The most significant achievement to date is a self-supervised relighting system that outperforms state-of-the-art (SOTA) methods retrained on the same task and data (published at BMVC 2021). The system adjusts lighting colour and temperature well but does not handle shadows well. To address this, we developed a shadow removal model; its performance reached SOTA levels but did not surpass them. We are therefore working on a shadow detection solution that will produce shadow masks to provide extra guidance for the shadow removal process. Once this work is complete, we hope to combine all three components into a complete relighting solution. To summarise, 2.5 years into the PhD (with 4 years of funding) we have proposed a preliminary solution to the problem and are currently working to improve it.
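The three-stage pipeline described above (shadow detection, mask-guided shadow removal, then relighting) could be sketched as follows. This is a minimal illustrative mock-up, not the actual SILT implementation: every function name, threshold, and colour shift here is a placeholder assumption standing in for a learned network.

```python
import numpy as np

# Hypothetical sketch of the three-stage relighting pipeline.
# All components are simple placeholders for learned models.

def detect_shadows(image, threshold=0.3):
    """Predict a binary shadow mask (stand-in for a learned detector)."""
    luminance = image.mean(axis=-1)            # rough per-pixel brightness
    return (luminance < threshold).astype(np.float32)

def remove_shadows(image, mask, boost=1.8):
    """Brighten masked regions (stand-in for a learned removal model)."""
    corrected = image * (1.0 + (boost - 1.0) * mask[..., None])
    return np.clip(corrected, 0.0, 1.0)

def relight(image, colour_shift=(1.05, 1.0, 0.95)):
    """Shift colour temperature (stand-in for the relighting network)."""
    return np.clip(image * np.asarray(colour_shift), 0.0, 1.0)

def relight_pipeline(image):
    mask = detect_shadows(image)               # 1. find shadows
    deshadowed = remove_shadows(image, mask)   # 2. remove them, guided by the mask
    return relight(deshadowed)                 # 3. apply the target lighting

frame = np.random.default_rng(0).uniform(0.0, 1.0, (4, 4, 3))
out = relight_pipeline(frame)
print(out.shape)  # (4, 4, 3)
```

The key design point mirrored here is that the shadow mask is computed first and passed explicitly into the removal stage, rather than asking one model to handle detection, removal, and relighting jointly.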
Exploitation Route | Further work on automating relighting could be undertaken. This could include capturing and releasing a large-scale dataset of full-body humans in complex environments. The work could also explore the use of other generative models for the relighting task, e.g. diffusion models.
Sectors | Digital/Communication/Information Technologies (including Software), Electronics
Description | This PhD project is an iCASE studentship set up with a clear application in mind. Automating the relighting task means that the BBC (our industrial partner) can record a wider range of programmes: not only big profitable shows and games but also smaller performances or panel shows. Instead of lighting the events professionally, they can be captured under a very basic setup, or under whatever lighting is available in the venue, and then beautified in post-production using our model. This is not meant to replace current lighting crews (and impact their jobs) but rather to expand the range of cultural, educational and other local programmes delivered by the BBC.
First Year Of Impact | 2023 |
Sector | Creative Economy, Digital/Communication/Information Technologies (including Software)
Impact Types | Cultural, Societal
Description | Surrey + BBC partnership |
Organisation | British Broadcasting Corporation (BBC) |
Department | BBC Research & Development |
Country | United Kingdom |
Sector | Public |
PI Contribution | BBC R&D has carried out research on automating various aspects of post-production. One element of this was an auto-framing deep-learning system that takes full (i.e. unframed) scenes from a programme and cuts them into well-composed shots according to standard viewer preferences. My work on relighting is envisaged as an extension of the auto-framing project, meant to address another post-production task and, as a result, help expand the range of events that the BBC can capture and broadcast.
Collaborator Contribution | The industrial collaborators have provided insight into the world of media, e.g. information about filming or lighting preferences that we should keep in mind while developing the tool. These comments also help keep the work focused on the target domain of humans, rather than drifting towards an arbitrary solution. The collaboration also involves me spending 18 months of my PhD at the BBC offices to gain exposure to the type of work the R&D department of a large media company is responsible for.
Impact | See: publications + outreach activities submission boxes |
Start Year | 2020 |
Title | Code for the SILT: Self-supervised Lighting Transfer Using Implicit Image Decomposition paper |
Description | The code for training and evaluation of the model proposed in the "SILT: Self-supervised Lighting Transfer Using Implicit Image Decomposition" paper (BMVC 2021). |
Type Of Technology | Software |
Year Produced | 2021 |
Open Source License? | Yes |
Impact | I created a self-supervised model that beat existing supervised systems, which in turn allowed me to publish my first paper.
URL | https://github.com/n-kubiak/SILT |
Description | Automatic relighting blog post (BBC blog) |
Form Of Engagement Activity | A magazine, newsletter or online publication |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Public/other audiences |
Results and Impact | In June 2022 I published a post on the BBC blog. The post discusses the need for automating relighting and fixing illumination flaws in post-production. Unlike most of my work and presentations, this post was aimed at the general public (as opposed to R&D employees, academics or postgraduates). I therefore had to find a way of describing my work to a lay audience in enough detail to convey its complexity, yet without overwhelming the casual reader.
Year(s) Of Engagement Activity | 2022 |
URL | https://www.bbc.co.uk/rd/blog/2022-06-lighting-post-production-automation-machine-learning |
Description | BMVA Spring Symposium 2022 (Presentation) |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Postgraduate students |
Results and Impact | I was invited to talk about my BMVC 2021 paper (and my work to date) at the BMVA Spring Symposium in Manchester (April 2022). Since BMVC 2021 was held virtually, this was my first opportunity to present my work in front of a large audience. It was also the first time that people outside my supervision team and research group could ask questions about my work and offer comments on it. Further discussion took place during the poster session the next day, where I could talk to people about my work one-on-one. Seeing how others perceive my work helped me realise what is worth highlighting when describing my project, and what is perhaps too much detail for an academic working on an unrelated computer vision problem. Having presented my work once, I now feel more confident in other presentations.
Year(s) Of Engagement Activity | 2022 |
Description | DSRP talk |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Industry/Business |
Results and Impact | DSRP is a research partnership between the BBC and a number of universities. In December 2022 I gave a talk to DSRP members, both from within the BBC and from other academic circles, about my work to date. The talk focused predominantly on my BMVC 2021 paper but also covered more recent work on shadow detection and removal in less detail. The event took place on Zoom.
Year(s) Of Engagement Activity | 2022 |
Description | R&D Bites presentation |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Industry/Business |
Results and Impact | In January 2022 I gave a talk to members of BBC R&D over Zoom. The talk described the aims and motivation of our research project and its potential use cases for the BBC and beyond. The technical part focused on my work to date, mostly the relighting solution proposed in the BMVC 2021 paper, together with a brief overview of our future work plans. Before this talk I had only presented my work to people from within my university, so this was a valuable opportunity to speak with people from an industrial background. What surprised me was the completely different questions I received from the industrial participants: more focused on direct use cases and applications to similar problems than on technical details or model specifics.
Year(s) Of Engagement Activity | 2022 |