LILiR2 - Language Independent Lip Reading
Lead Research Organisation: University of Surrey
Department Name: Vision, Speech and Signal Processing (CVSSP)
Abstract
It is known that humans can, and do, lip-read, but little is known about exactly what visual information is needed for effective lip-reading, particularly in non-laboratory environments. This project will collect data for lip-reading and use it to build automatic lip-reading systems: machines that convert video of lip motion into text. To be effective, such systems must accurately track the head over a variety of poses, extract numbers (features) that describe the lips, and then learn which features correspond to which text. To tackle the problem we will also need information collected from audio speech, so the project will investigate how the extensive knowledge of audio speech can be used to recognise visual speech. The project is a collaboration between the University of East Anglia, who have previously developed state-of-the-art speech-reading systems; the University of Surrey, who have built accurate and reliable face and lip trackers; and the Home Office Scientific Branch, who wish to investigate the feasibility of this approach for crime fighting.
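To make the pipeline in the abstract concrete, the sketch below shows, in minimal form, the "track, extract features, map to text" idea. It is not the project's actual system: it assumes lip landmarks are already tracked by some external face tracker, uses a simple normalised-shape feature, and matches a test sequence to word templates with dynamic time warping purely for illustration.

```python
# Minimal illustrative sketch of a lip-reading pipeline (not the LILiR system).
# Assumes per-frame lip landmarks of shape (frames, points, 2) are supplied
# by an external tracker; feature choice and classifier are illustrative only.
import numpy as np

def lip_features(landmarks):
    """Per-frame lip-shape features, normalised for translation and scale."""
    centred = landmarks - landmarks.mean(axis=1, keepdims=True)
    scale = np.linalg.norm(centred, axis=(1, 2), keepdims=True) + 1e-8
    return (centred / scale).reshape(landmarks.shape[0], -1)

def dtw_distance(a, b):
    """Dynamic time warping distance between two feature sequences."""
    na, nb = len(a), len(b)
    cost = np.full((na + 1, nb + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, na + 1):
        for j in range(1, nb + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[na, nb]

def recognise(test_landmarks, templates):
    """templates: dict mapping a word label to a list of example landmark sequences."""
    query = lip_features(test_landmarks)
    scores = {word: min(dtw_distance(query, lip_features(ex)) for ex in examples)
              for word, examples in templates.items()}
    return min(scores, key=scores.get)
```

In practice, lip-reading systems replace the template matching above with statistical sequence models trained on large audio-visual corpora, which is where the audio-speech knowledge mentioned in the abstract comes in.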
Publications
Dowson N (2008) Estimating the joint statistics of images using nonparametric windows with application to registration using mutual information. In IEEE Transactions on Pattern Analysis and Machine Intelligence.
Lan Y (2009) Comparing visual features for lipreading. In AVSP-2009.
Moore S (2007) Analysis and Modeling of Faces and Gestures.
Moore S (2010) Image Analysis and Recognition.
Moore S (2011) Local binary patterns for multi-view facial expression recognition. In Computer Vision and Image Understanding.
Okwechime D (2009) Real-time motion control using pose space probability density estimation.
Okwechime D (2011) MIMiC: Multimodal Interactive Motion Controller. In IEEE Transactions on Multimedia.
Description | We developed methods for language identification, recognition of non-verbal cues, accurate facial feature tracking and person-dependent lip reading. |
Exploitation Route | We have already applied this work in a number of demonstration systems for both government and commercial clients. The tracking technology is currently being licensed. |
Sectors | Aerospace, Defence and Marine; Creative Economy; Digital/Communication/Information Technologies (including Software); Government, Democracy and Justice; Retail |
URL | http://www.ee.surrey.ac.uk/Projects/LILiR/index.html |
Description | We are in the process of licensing IPR generated from this project to a UK SME
First Year Of Impact | 2012 |
Sector | Creative Economy |
Impact Types | Economic |
Description | MoD Phase 1 |
Amount | £114,453 (GBP) |
Organisation | Ministry of Defence (MOD) |
Sector | Public |
Country | United Kingdom |
Start | 03/2010 |
End | 03/2011 |
Description | MoD Phase 2 |
Amount | £140,522 (GBP) |
Organisation | Ministry of Defence (MOD) |
Sector | Public |
Country | United Kingdom |
Start | 03/2011 |
End | 03/2012 |
Description | MoD Phase 3 |
Amount | £100,090 (GBP) |
Organisation | Ministry of Defence (MOD) |
Sector | Public |
Country | United Kingdom |
Start | 03/2012 |
End | 03/2013 |
Description | MoD Phase 4 |
Amount | £100,000 (GBP) |
Organisation | Ministry of Defence (MOD) |
Sector | Public |
Country | United Kingdom |
Start | 03/2013 |
End | 03/2014 |
Title | LILIR Datasets |
Description | Several audio-visual datasets were captured and released.
Type Of Material | Database/Collection of data |
Year Produced | 2010 |
Provided To Others? | Yes |
Impact | These datasets were, and still are, fundamental to our work for benchmarking techniques. They have also been used by the wider research community. |
URL | http://www.ee.surrey.ac.uk/Projects/LILiR/datasets.html |
Description | UEA |
Organisation | University of East Anglia |
Country | United Kingdom |
Sector | Academic/University |
PI Contribution | This was a joint research project between the two institutions. However, I only report outcomes from our half of the project.
Collaborator Contribution | This was a joint research project between the two institutions. |
Impact | We have published many joint papers. We have also received five years' worth of funding from the UK government to continue the work.
Start Year | 2007 |
Title | A new approach to rotoscoping using linear predictor tracking |
Description | Under a Knowledge Transfer Account (KTA) we extended the linear tracker code developed in this project into a tool for the post-production industry.
IP Reference | |
Protection | Copyrighted (e.g. software) |
Year Protection Granted | |
Licensed | Commercial In Confidence |
Impact | . |