Structured machine listening for soundscapes with multiple birds
Lead Research Organisation:
Queen Mary University of London
Department Name: Sch of Electronic Eng & Computer Science
Abstract
In this Early Career fellowship I will establish a world-leading capability in automatic inference about songbird communication via novel "machine listening" methods, working collaboratively not only with experts in machine listening but also with experts in bird behaviour and communication. Automatic analysis has already proved beneficial to researchers in efficiently characterising recorded bird sounds, but many limitations on its applicability remain, such as when many birds sing together. The techniques developed will be specifically designed to handle noisy multi-source audio recordings, and to infer not just the presence of birds but the structure of the signals and the interactions between them. Such methods will be a leap beyond the current state of the art in bioacoustics, allowing researchers to study not just sounds recorded in the lab under controlled conditions, but also field recordings and archive recordings found in public audio archives.
I will develop my techniques through specific application case studies. First, through collaboration with David Clayton, an international expert on zebra finch behaviour and genetics, who recently moved his lab to my proposed host institution. The zebra finch is an important "model organism" in biology, because its genome is fully sequenced and it is a useful bird for probing aspects of songbird vocal development. I will collaborate with the Clayton lab to develop methods for automatically inferring the social interactions implicit in audio recordings of zebra finch colonies. Second, I will conduct international research visits to collaborate with other research groups who analyse bird sounds and bird social interactions. Third, I will study the case of automatically detecting bird activity in arbitrary sound archives, such as the soundscape recordings held by the British Library Sound Archive.
Importantly, I will not only apply modern signal processing and machine learning techniques, but also develop new techniques inspired by this application area. This fellowship is not about contributing from one field to another, but about building up UK research strength in this cross-disciplinary research topic. To make the most of this possibility, I will host research workshops and an open data contest to serve as focal points for research attention, and I will also conduct a public engagement initiative to build the widest possible enthusiasm for this exciting field.
This fellowship directly aligns with the "Working Together" priority, which is EPSRC's current overriding priority for ICT fellowships.
Planned Impact
The prime beneficiaries outside my immediate field will be in research fields benefitting from the structured analysis of animal sounds and interactions. For example, the improved techniques in zebra finch analysis will complement ongoing research into songbird genetics and individual differences, or research into conversational interactions in linguistics: the availability of more structured naturalistic data about animal communication could provide a stronger empirical foundation for considerations of the evolution of communication systems. (This impact overlaps to some extent with that described in the Academic Beneficiaries section.)
The availability of these sound analysis techniques is also of interest to wildlife monitoring organisations such as the British Trust for Ornithology (BTO). They largely use manual surveying by volunteers and professionals to quantify the distributions of species; however, if high-quality automatic analysis were available their work could be made more efficient. Current academic and commercial software packages (examples include Raven, XBAT, Seewave, Praat and Sound Analysis Pro) allow users to inspect and detect bird sounds, but they can neither analyse communication networks nor use models of communication interactions to ensure high-quality detection. Analysing not just the presence of bird song and calls, but the networks of interaction between them, could serve as an indicator of population health, reflecting issues such as habitat fragmentation which can impact the viability of a bird population. Downstream, detailed analysis of animal sound can thus form a strong evidence base for ecological policy decisions.
The application to audio archives offers another direct route to impact. Large audio collections such as those in the British Library Sound Archive are highly valuable to society, yet much of their value remains locked away because little associated metadata exists to facilitate different types of query. This research will directly enable the unlocking of some of this value, helping people to discover the presence of birds in large audio archives which may not be annotated for their bird sounds, and indeed may have been collected for entirely different reasons.
The fellowship will also have an impact on the public understanding of bird sounds, bird social interactions, and signal processing and machine learning. This will be explicitly encouraged through public engagement activities: through engaging talks, articles and exhibits I will aim, not to place the technology between the public and the birds, but to enchant the public with both the wonders of technology and the wonders of bird vocal communication.
Organisations
- Queen Mary University of London (Lead Research Organisation)
- Queen Mary University of London (Collaboration)
- Charles University (Collaboration)
- Adam Mickiewicz University in Poznan (Collaboration)
- Max Planck Society (Collaboration)
- Wageningen University (Project Partner)
- University of Paris South (Paris XI) (Project Partner)
- University of Cambridge (Project Partner)
People
- Daniel Stowell (Principal Investigator / Fellow)
Publications
- Alvarado P (2016) Gaussian processes for music audio modelling and content analysis
- Day G (2022) The supply-side climate policy of decreasing fossil fuel tax profiles: can subsidized reserves induce a green paradox? In Climatic Change
- Ju T (2023) A new prediction method of industrial atmospheric pollutant emission intensity based on pollutant emission standard quantification. In Frontiers of Environmental Science & Engineering
- Morfi V (2018) Deep Learning on Low-Resource Datasets
- Morfi V (2019) NIPS4Bplus: a richly annotated birdsong audio dataset. In PeerJ Computer Science
- Morfi V (2018) NIPS4Bplus: a richly annotated birdsong audio dataset
Title | Birds Heard Through Bells
Description | Collaboration with composer/roboticist Sarah Angliss, to create a musical score to render the sound of birdsong through her robotic bells (carillon). First shown at the Listening in the Wild 2015 workshop; subsequently shown at "SoundCamp" in a park in London, on International Dawn Chorus Day.
Type Of Art | Artefact (including digital)
Year Produced | 2015
Impact | Public engagement with sound, computation and birds through a novel medium. Reached approx. 100 people in a public park.
URL | https://vimeo.com/256459737
Title | Music track "Egg" by Sarah Angliss
Description | Music track "Egg" on the album "Air Loom" by Sarah Angliss. As quoted in the credits: "Using an adaptation of the automatic birdsong transcriber created by Dan Stowell, QMUL (used with thanks)"
Type Of Art | Composition/Score
Year Produced | 2019
Impact | The artist has performed this live to various audiences - see e.g. review of Kings Place (London) event: https://www.icareifyoulisten.com/2019/04/uncanny-resonances-sarah-angliss-at-kings-place-london/
URL | https://sarahangliss.bandcamp.com/track/egg
Description | We characterised, for the first time, the fine detail of vocal interactions in groups of zebra finches. Zebra finches are an important species because they are used for many studies worldwide and are considered a "model species" for vocal learning. We demonstrated that some of the interaction details are reproducibly related to social relationships. The technical method we used for performing this characterisation is novel to the field of animal communication and is published as open source code. We also introduced new and improved machine learning methods for animal sound detection and classification, in particular for working with "low resource" data scenarios with limited training data. Dan also led the establishment of a research community in computational bioacoustics. Dan has published a tutorial chapter on the topic, led "data challenges" and "special sessions" at conferences to bring people together on the topic, and coordinated a special issue of a journal. Dan has also collaborated with junior and senior researchers from many different countries (FR, DE, PL, NL, GR, UK, FI, CZ), helping to spread the expertise and build bridges between computational audio analysis and bioacoustics/ethology.
Exploitation Route | Many of the methods developed are published as open source code, and many of the datasets are published as open data. The approaches developed can be used for monitoring other birds or animals. |
Sectors | Environment Other |
URL | http://mcld.co.uk/research/ |
Description | Warblr bird recognition app: Dan, together with an external business partner, secured a £10K grant from QMUL's EPSRC Innovation Fund in summer 2014, which allowed them to work with developers to build a social enterprise involving the British public in a citizen science project to identify and collect bird sound recordings. Following that, the Warblr team launched a Kickstarter campaign, founded a spinout company, and in Summer 2015 launched the smartphone app. It has amassed over 5,000 paying users. To date, Warblr has had more than 45,000 submissions to its database, and an average of 80 submissions per day. |
First Year Of Impact | 2015 |
Sector | Environment |
Impact Types | Societal Economic |
Description | Lead author of evidence submission to UK parliament inquiry "Algorithms in decision making" |
Geographic Reach | National |
Policy Influence Type | Contribution to a national consultation/review |
URL | http://data.parliament.uk/writtenevidence/committeeevidence.svc/evidencedocument/science-and-technol... |
Description | 2nd International Workshop on Vocal Interactivity in-and-between Humans, Animals and Robots |
Amount | £3,000 (GBP) |
Organisation | Alan Turing Institute |
Sector | Academic/University |
Country | United Kingdom |
Start | 01/2019 |
End | 08/2019 |
Description | Enabling worldwide solar PV nowcasting via machine vision and open data |
Amount | £50,000 (GBP) |
Funding ID | R-SPES-115 |
Organisation | Alan Turing Institute |
Sector | Academic/University |
Country | United Kingdom |
Start | 09/2019 |
End | 03/2020 |
Description | Machine Learning for Bird Song Learning |
Amount | £535,796 (GBP) |
Funding ID | BB/R008736/1 |
Organisation | Biotechnology and Biological Sciences Research Council (BBSRC) |
Sector | Public |
Country | United Kingdom |
Start | 05/2018 |
End | 06/2019 |
Description | QMUL Innovation Fund |
Amount | £10,000 (GBP) |
Organisation | Queen Mary University of London |
Sector | Academic/University |
Country | United Kingdom |
Start | 06/2014 |
End | 07/2015 |
Description | QMUL Innovation Fund Supplementary Award |
Amount | £10,000 (GBP) |
Organisation | Queen Mary University of London |
Sector | Academic/University |
Country | United Kingdom |
Start | 06/2015 |
End | 09/2015 |
Title | Bird Audio Detection: public data for DCASE Challenge 2018 Task 3 |
Description | Data labels for the public development data for Task 3 of the DCASE Challenge 2018 on "Bird Audio Detection". These annotations indicate the presence/absence of bird sounds in various datasets of 10-second audio clips. The label "hasbird" represents whether a sound clip contains any audible bird sound. Note that there may be other sounds such as weather or humans, occasionally loud - the label does not mean that bird sound predominates. For more info, and to download WAV audio data: http://dcase.community/challenge2018/task-bird-audio-detection |
Type Of Material | Database/Collection of data |
Year Produced | 2018 |
Provided To Others? | Yes |
Impact | DCASE 2018 Bird Audio Detection task (Task 3). http://dcase.community/challenge2018/task-bird-audio-detection |
URL | https://figshare.com/articles/Bird_Audio_Detection_public_data_for_DCASE_Challenge_2018_Task_3/60264... |
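As a minimal illustration of how such presence/absence labels might be consumed, the sketch below parses a small label table and computes the fraction of clips marked as containing bird sound. The `itemid`/`hasbird` column names and the sample rows are assumptions for illustration, not taken from the released files:

```python
import csv
import io

# Hypothetical excerpt of a label file: one clip identifier per row,
# with a binary "hasbird" label (column names assumed).
LABELS_CSV = """itemid,hasbird
clip_0001,1
clip_0002,0
clip_0003,1
"""

def positive_rate(csv_text):
    """Fraction of clips labelled as containing audible bird sound."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    positives = sum(int(row["hasbird"]) for row in rows)
    return positives / len(rows)

print(positive_rate(LABELS_CSV))  # 2 of 3 sample clips are positive
```

In practice one would read the real CSV from disk; the positive rate is a useful sanity check before training a detector, since heavily imbalanced labels affect evaluation.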
Title | Datasets for automatic acoustic identification of individual birds |
Description | Bird individual audio recordings (foreground and background) to accompany the work: "Automatic acoustic identification of individuals: Improving generalisation across species and recording conditions" by Dan Stowell, Tereza Petrusková, Martin Šálek, Pavel Linhart. This dataset contains labelled recordings of individuals from three bird species: Little Owl, Chiffchaff and Tree Pipit. For more information, please see the README.txt file and the research article. The dataset takes approx. 11 GB of disk space after the ZIP files have been uncompressed.
Type Of Material | Database/Collection of data |
Year Produced | 2018 |
Provided To Others? | Yes |
Impact | "Automatic acoustic identification of individuals: Improving generalisation across species and recording conditions" by Dan Stowell, Tereza Petrusková, Martin Šálek, Pavel Linhart
URL | https://doi.org/10.5281/zenodo.1413495 |
Title | Evaluation datasets for DCASE 2018 Bird Audio Detection |
Description | Evaluation data audio for the DCASE 2018 Bird Audio Detection task (Task 3). Crowdsourced dataset, UK ("warblrb10k") - a held-out set of 2,000 recordings from the same conditions as the Warblr development dataset. Remote monitoring dataset, Chernobyl ("Chernobyl") - 6,620 audio clips collected from unattended remote monitoring equipment in the Chernobyl Exclusion Zone (CEZ). This data was collected as part of the TREE (Transfer-Exposure-Effects) research project into the long-term effects of the Chernobyl accident on local ecology. The audio covers a range of birds and includes weather, large mammal and insect noise sampled across various CEZ environments, including abandoned village, grassland and forest areas. Remote monitoring night-flight calls, Poland ("PolandNFC") - 4,000 recordings from Hanna Pamuła's PhD project of monitoring autumn nocturnal bird migration. The recordings were collected every night, from September to November 2016 on the Baltic Sea coast, Poland, using Song Meter SM2 units with microphones mounted on 3-5 m poles. For this challenge, we use a subset derived from 15 nights with different weather conditions and background noise including wind, rain, sea noise, insect calls, human voice and deer calls.
Type Of Material | Database/Collection of data |
Year Produced | 2018 |
Provided To Others? | Yes |
Impact | DCASE 2018 Bird Audio Detection task (Task 3). http://dcase.community/challenge2018/task-bird-audio-detection |
URL | https://figshare.com/articles/Evaluation_datasets_for_DCASE_2018_Bird_Audio_Detection/6722846 |
Title | NIPS4Bplus: Transcriptions of NIPS4B 2013 Bird Challenge Training Dataset |
Description | Created by: Veronica Morfi (1), Dan Stowell (1) and Hanna Pamuła (2). (1): Machine Listening Lab, Centre for Digital Music (C4DM), Queen Mary University of London (QMUL), UK. (2): AGH University of Science and Technology, Department of Mechanics and Vibroacoustics, Kraków, Poland. The zip file contains 674 individual recording temporal annotations for the training set of the NIPS4B 2013 dataset in the birdsong classification task (the original dataset comprises 687 recordings).
Type Of Material | Database/Collection of data |
Year Produced | 2018 |
Provided To Others? | Yes |
Impact | NIPS4Bplus: a richly annotated birdsong audio dataset (Morfi et al 2018) https://arxiv.org/abs/1811.02275 |
URL | https://figshare.com/articles/Transcriptions_of_NIPS4B_2013_Bird_Challenge_Training_Dataset/6798548 |
Title | Zebra finch group calling zf4f |
Description | Calling patterns of four female zebra finches, recorded in an indoor aviary in London in 2015. Two CSV files are included: each one describes a one-hour session. The CSV columns are: time (seconds), duration (seconds), microphone number (0-3). Note that the order of birds is swapped between the two days, so microphone number 0,1,2,3 in session2 corresponds to the same birds as microphone number 3,2,1,0 in session3. (We refer to this dataset as "zf4f".) |
Type Of Material | Database/Collection of data |
Year Produced | 2016 |
Provided To Others? | Yes |
Impact | Detailed temporal structure of communication networks in groups of songbirds. Dan Stowell, Lisa Gill, and David Clayton. Published: 01 June 2016. https://doi.org/10.1098/rsif.2016.0296
URL | https://figshare.com/articles/Zebra_finch_group_calling_zf4f/1613791 |
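A short sketch of how one might load a zf4f session and apply the described microphone-to-bird remapping (mic 0,1,2,3 in one session corresponding to mic 3,2,1,0 in the other). The inline sample rows and the assumption of a headerless CSV are illustrative, not taken from the released files:

```python
import csv
import io

# Hypothetical sample rows in the described zf4f format:
# time (seconds), duration (seconds), microphone number (0-3)
SAMPLE_SESSION = """12.50,0.21,0
13.02,0.18,3
13.40,0.25,1
"""

def load_calls(csv_text, remap=False):
    """Parse call events; optionally remap mic i -> bird 3-i so that
    bird identities are consistent across the two sessions."""
    calls = []
    for row in csv.reader(io.StringIO(csv_text)):
        if not row:
            continue
        t, dur, mic = float(row[0]), float(row[1]), int(row[2])
        bird = 3 - mic if remap else mic
        calls.append((t, dur, bird))
    return calls

print(load_calls(SAMPLE_SESSION, remap=True)[0])  # mic 0 -> bird 3
```

Applying the remap to one session (but not the other) yields a single consistent bird numbering, which matters when aggregating calling statistics per individual across both recording days.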
Title | freefield1010bird |
Description | Annotated dataset of audio clips from the "freefield1010" collection, sourced from Freesound. Annotations indicate the presence/absence of birds. |
Type Of Material | Database/Collection of data |
Year Produced | 2016 |
Provided To Others? | Yes |
Impact | Facilitated the "Bird Audio Detection challenge", which directly stimulated 30 research teams from around the world to develop and test new algorithms for detecting bird sounds. |
URL | http://machine-listening.eecs.qmul.ac.uk/bird-audio-detection-challenge/ |
Title | warblrb10k |
Description | Annotated dataset of audio clips from the Warblr birdsong app. Annotations indicate the presence/absence of birds. |
Type Of Material | Database/Collection of data |
Year Produced | 2016 |
Provided To Others? | Yes |
Impact | Facilitated the "Bird Audio Detection challenge", which directly stimulated 30 research teams from around the world to develop and test new algorithms for detecting bird sounds. |
URL | http://machine-listening.eecs.qmul.ac.uk/bird-audio-detection-challenge/ |
Description | Automatic acoustic identification of individuals |
Organisation | Adam Mickiewicz University in Poznan |
Country | Poland |
Sector | Academic/University |
PI Contribution | Automatic acoustic identification of individuals - classification methods |
Collaborator Contribution | Automatic acoustic identification of individuals - bird expertise and data |
Impact | https://arxiv.org/abs/1810.09273 |
Start Year | 2016 |
Description | Automatic acoustic identification of individuals |
Organisation | Charles University |
Country | Czech Republic |
Sector | Academic/University |
PI Contribution | Automatic acoustic identification of individuals - classification methods |
Collaborator Contribution | Automatic acoustic identification of individuals - bird expertise and data |
Impact | https://arxiv.org/abs/1810.09273 |
Start Year | 2016 |
Description | Clayton Lab zebra finch recordings |
Organisation | Queen Mary University of London |
Country | United Kingdom |
Sector | Academic/University |
PI Contribution | Provided recording equipment, my time for running the recording sessions, and paid annotator time to annotate the data. |
Collaborator Contribution | Provided access to zebra finch facility, advice on study design, and practical support in setting up the recording sessions. |
Impact | One journal publication, introducing and evaluating a new method to animal communication analysis. One conference paper, on a method for detecting overlapping audio events. Dataset of audio recordings and annotations. |
Start Year | 2014 |
Description | MPIO Seewiesen |
Organisation | Max Planck Society |
Department | Max Planck Institute for Ornithology |
Country | Germany |
Sector | Academic/University |
PI Contribution | Data, collaboration time, hosting research visit, and novel analysis methodology |
Collaborator Contribution | Data, collaboration time, hosting research visit. |
Impact | 2 journal articles and 1 peer-reviewed conference paper. Collaboration is multi-disciplinary, across computer science and animal behaviour / ornithology. |
Start Year | 2015 |
Title | Warblr |
Description | Warblr is an app that automatically recognises birds by their song. It works simply: make a recording with your iPhone, and Warblr will identify the species of bird(s) that can be heard in that recording, providing you with images and descriptions to help you learn more about our feathered friends.
Type Of Technology | Webtool/Application |
Year Produced | 2015 |
Impact | In using Warblr, you will be contributing to a citizen science project, and hopefully one day helping to protect endangered species. We are making the recordings and the data collected on species identification freely available to be used for research and conservation. |
URL | http://warblr.net/ |
Company Name | Warblr |
Description | Warblr develops a mobile app to recognise bird species based on sound recording and provide educational content on birds. |
Year Established | 2013 |
Impact | In using Warblr, you will be contributing to a citizen science project, and hopefully one day helping to protect endangered species. We are making the recordings and the data collected on species identification freely available to be used for research and conservation. |
Website | http://www.warblr.co.uk |
Description | "Auditory illusions" BBC Radio 4 documentary |
Form Of Engagement Activity | A press release, press conference or response to a media enquiry/interview |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Media (as a channel to the public) |
Results and Impact | BBC Radio 4: lead interviewee on "Auditory illusions" BBC Radio 4 documentary (with Prof Trevor Cox) (Sep 2019) |
Year(s) Of Engagement Activity | 2019 |
URL | https://www.bbc.co.uk/programmes/m00082dr |
Description | Animal Diplomacy Bureau at Soundcamp 2018 |
Form Of Engagement Activity | Participation in an activity, workshop or similar |
Part Of Official Scheme? | No |
Geographic Reach | Local |
Primary Audience | Public/other audiences |
Results and Impact | This was Animal Diplomacy Bureau, a game created by Kaylene Kau. Since my research fellowship is all about birds and sounds and how they interact, and the game is all about exploring those topics, I was really pleased to commission Kaylene to develop an expanded version of the game and show it as part of SoundCamp 2018. The game consisted of participants taking on the role of either a parakeet, a goldfinch or a sparrowhawk while searching for red discs representing berries. The food location was indicated by recorded goldfinch calls heard nearby. The sparrowhawks had to wear small loudspeakers, which would play back sparrowhawk alarm calls, giving the prey a chance to react. The prey players could resist the sparrowhawks by getting together and "mobbing" the bird, which is what many birds do in real life. After the game, Kaylene hosted discussions where she talked about how the bird species they'd been playing as interact in real life, and how living in the city affects them. She invited participants to discuss what cities would be like if they were designed for animals as well as for humans.
Year(s) Of Engagement Activity | 2018 |
URL | http://machine-listening.eecs.qmul.ac.uk/2018/05/soundcamp-2018-animal-diplomacy-bureau/ |
Description | BBC Radio 4 - Costing The Earth - Acoustic Ecology |
Form Of Engagement Activity | A press release, press conference or response to a media enquiry/interview |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Media (as a channel to the public) |
Results and Impact | 2016-03 BBC Radio 4 "Costing The Earth" programme, with a feature interview with me about our "Warblr" birdsong app, and sound recognition |
Year(s) Of Engagement Activity | 2016 |
URL | http://www.bbc.co.uk/programmes/b071tgby |
Description | Live on BBC Radio 4 Today programme; also interviewed on BBC World Service "World Update" |
Form Of Engagement Activity | A press release, press conference or response to a media enquiry/interview |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Media (as a channel to the public) |
Results and Impact | I was interviewed live on the BBC Radio 4 Today programme, in a prime-time 6:48 morning interview slot. I spoke about birdsong, technology, and my research work. On the same BBC visit I prerecorded an interview with the BBC World Service "World Update". These radio programmes have a combined audience of many millions worldwide.
Year(s) Of Engagement Activity | 2017 |
URL | http://www.bbc.co.uk/programmes/b08j997h |
Description | Media coverage of Warblr app kickstarter and launch |
Form Of Engagement Activity | A press release, press conference or response to a media enquiry/interview |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Media (as a channel to the public) |
Results and Impact | While launching the Warblr app, which is both a spinout company and a citizen-science data gathering initiative, we were featured in many national and international press outlets, through TV/radio/newspaper interviews as well as second-hand coverage. Warblr has been featured across the BBC (including 5 Live, BBC London News, Radio 4, and the BBC News homepage), on Sky News, in print and online through newspapers such as The Telegraph, The Times, The Sun, The Guardian, The Metro and The Daily Mail, and in publications such as Stylist, Shortlist, Wired, Stuff.TV and Engadget. The project has gained support from the likes of Stephen Fry, Chris Packham, the Urban Birder and the Royal Society of Arts (RSA), as well as thousands of tweets, posts, mentions and shares across social media, blogs and forums. The Warblr team have also spoken at conferences including Digital Shoreditch, London National Park, UnLtd Living It Festival, and Stylist Live. Warblr has won a Queen Mary University of London Public Engagement Award, and was shortlisted for the TechCityinsider's TechCities awards and the IAB (Interactive Advertising Bureau) Creative Showcase. Warblr is one of TechRadar's "Best iPhone apps of 2015" and The Next Web's "Apps of the year". |
Year(s) Of Engagement Activity | 2015,2016 |
URL | http://warblr.net/ |
Description | Media coverage of automatic bird classification results (Summer 2014) |
Form Of Engagement Activity | A press release, press conference or response to a media enquiry/interview |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Media (as a channel to the public) |
Results and Impact | Issued a press release and participated in media interviews (live on-air discussion on Irish national radio RTE1; print interviews for BBC and Science Magazine), plus secondary press resulting from those. Coverage included: Science, "Computer becomes a bird enthusiast" by Kelly Servick, http://news.sciencemag.org/plants-animals/2014/07/computer-becomes-bird-enthusiast (tweeted at least 40 times); BBC, "Software can decode bird songs" by Claire Marshall, BBC environment correspondent, http://www.bbc.co.uk/news/science-environment-28358123 (tweeted at least 519 times); The Sun, "A little bird told me.."; Europa Press, "Un 'shazam' para identificar qué pájaro está cantando" ("A 'Shazam' to identify which bird is singing"), http://www.europapress.es/ciencia/laboratorio/noticia-shazam-identificar-pajaro-cantando-20140717173434.html (tweeted at least 105 times); El Economista, the same article, http://ecodiario.eleconomista.es/ciencia/noticias/5949497/07/14/Un-shazam-para-identificar-que-pajaro-esta-cantando.html (tweeted at least 22 times); Live Science, "'Voice Recognition' System for Birds Can Tell Two Chirps Apart", http://www.livescience.com/46840-bird-songs-decoded.html (tweeted at least 26 times); Science 2.0, "Birdsongs Decoded", http://www.science20.com/news_articles/birdsongs_decoded-140731 (tweeted at least 2 times). I have received various email/Twitter contacts, both from members of the public and from academics/industry, enquiring about the state of the art and possible future deployments, spinouts, etc.
Year(s) Of Engagement Activity | 2014 |
URL | http://www.bbc.co.uk/news/science-environment-28358123 |
Description | Public bird detection event at Drapers Field, Stratford |
Form Of Engagement Activity | Participation in an activity, workshop or similar |
Part Of Official Scheme? | No |
Geographic Reach | Regional |
Primary Audience | Public/other audiences |
Results and Impact | Ran a stall in a local community area as part of a "Birdbox" Sunday event for families and others: engaged with 60-100 passersby, took 5 on a bird walk around Stratford, discussed bird sounds and technology. |
Year(s) Of Engagement Activity | 2017 |
Description | PyData 2018 |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Industry/Business |
Results and Impact | Talk at "PyData" event, i.e. to an audience of data scientists and Python software developers. |
Year(s) Of Engagement Activity | 2018 |
URL | https://www.youtube.com/watch?v=pzmdOETnhI0 |
Description | Science Magazine Warblr coverage |
Form Of Engagement Activity | A press release, press conference or response to a media enquiry/interview |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Public/other audiences |
Results and Impact | Media coverage of our bird recognition work, in the influential Science magazine |
Year(s) Of Engagement Activity | 2018 |
URL | http://www.sciencemag.org/news/2018/07/watch-out-birders-artificial-intelligence-has-learned-spot-bi... |
Description | The Conversation article |
Form Of Engagement Activity | Engagement focused website, blog or social media channel |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Media (as a channel to the public) |
Results and Impact | Wrote article published in The Conversation long-form news website. Received over 100 Twitter shares, over 300 Facebook shares, and more: https://theconversation.com/we-made-an-app-to-identify-bird-sounds-and-learned-something-surprising-about-people-65742 |
Year(s) Of Engagement Activity | 2016 |
URL | https://theconversation.com/we-made-an-app-to-identify-bird-sounds-and-learned-something-surprising-... |
Description | The Conversation article |
Form Of Engagement Activity | A magazine, newsletter or online publication |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Public/other audiences |
Results and Impact | Article "We made an app to identify bird sounds - and learned something surprising about people" aimed for general public, based on sound recognition work |
Year(s) Of Engagement Activity | 2016 |
URL | https://theconversation.com/we-made-an-app-to-identify-bird-sounds-and-learned-something-surprising-... |
Description | interviewed in Nature article "AI empowers conservation biology" |
Form Of Engagement Activity | A press release, press conference or response to a media enquiry/interview |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Media (as a channel to the public) |
Results and Impact | Interviewed in Nature article "AI empowers conservation biology", very high impact scientific magazine. "AI empowers conservation biology" by Roberta Kwok 4th March 2019 |
Year(s) Of Engagement Activity | 2019 |
URL | https://www.nature.com/articles/d41586-019-00746-1 |