ACCENT: ACoustic Control of ENTomological pests

Lead Research Organisation: Kingston University
Department Name: F: Engineering Computing & Environment

Abstract

This project will develop signal processing and machine-learning based methods to detect flying insect pest species, as well as their predators, as they are recorded entering traps. The methods will be developed in the laboratory using recorded signals provided by the industrial partners, and can then be employed in the smart traps the partners are developing. The work is important in helping to monitor insect pest attacks on crops of agricultural importance, and in informing mitigation strategies that can involve the deployment of biological methods such as predatory insects, thereby reducing the use of insecticides and delivering many positive environmental and economic benefits.

The research team will develop methods based on detecting the main tones and overtones present in the sound of beating insect wings, which are highly species-dependent and can therefore serve as excellent markers for identifying target species. Part of the challenge is to detect these tones and overtones against a background of other ambient noises, which can include bird calls, agricultural machinery and vehicles, rain and other weather-related sounds, and even occasional conversations. This will be achieved by dividing the recorded sounds into short time chunks of approximately 1 second (around the length of time the sound of the insect is heard as it flies into the trap), and then adopting a two-fold approach. One strand will use classical filtering methods to pick out the tones and overtones in each chunk, and develop detection methods based on the relative contributions of these components to the overall sound level. The other will present the recorded signals to Artificial Intelligence (AI) systems that use machine learning to distinguish samples containing the target signals. Part of the challenge here will be to 'tune' the variables that control the learning of the AI systems through 'training', so that the highest successful detection rates are achieved while false detections are simultaneously minimised.
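As an illustration, the classical strand could extract spectral features along the following lines. This is a minimal sketch only, assuming a mono recording at 16 kHz; the function name, frequency limits and tolerances are illustrative and do not reflect the project's actual design.

```python
import numpy as np

def wingbeat_features(signal, sample_rate=16000, f_max=2000):
    """Estimate the dominant wingbeat tone and the relative strength
    of its overtones from a ~1 second audio chunk (illustrative only)."""
    # Magnitude spectrum of the chunk; a Hann window reduces leakage
    windowed = signal * np.hanning(len(signal))
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)

    # Restrict to the band where wingbeat tones and overtones lie
    band = freqs <= f_max
    spectrum, freqs = spectrum[band], freqs[band]

    # Fundamental: strongest peak above a low-frequency cutoff (~80 Hz)
    valid = freqs >= 80
    f0 = freqs[valid][np.argmax(spectrum[valid])]

    # Energy near each overtone, relative to the fundamental
    def energy_near(f, tol=10.0):
        sel = np.abs(freqs - f) <= tol
        return spectrum[sel].sum()

    e0 = energy_near(f0)
    ratios = [energy_near(k * f0) / e0 for k in (2, 3, 4)]
    return f0, ratios
```

The returned fundamental frequency and overtone ratios could then feed a species-detection rule based on the relative contributions of these components, as described above.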

The industrial partner Agrisound Ltd will help by providing samples of field recordings as well as laboratory recordings of target insect species. They will advise on the selection of target species, as well as regarding future incorporation and deployment of these methods in the smart traps that they are currently developing. The team at Kingston University London (KUL) will develop the methods for signal analysis and target species detection, and will liaise with Agrisound in detail regarding this.

Although not forming part of the current application, Agrisound has a wide range of links with growers and end users that can enable real-world deployment and utilisation of the technology being developed.

Technical Summary

This project will develop signal processing and machine-learning based methods to detect flying insect pest species and their predators as they are recorded entering traps. Laboratory development will be based on recorded signals provided by Agrisound, and the methods can then be employed in the smart traps the company is currently developing. The work is important in helping to monitor insect pest attacks on agriculturally important crops, and can inform biological control strategies such as the use of predatory insects and more timely intervention, thereby reducing the use of insecticides and delivering many positive environmental and economic benefits.

The research team will develop methods based on detecting the fundamental and harmonics in the frequency spectrum of beating insect wings. These are highly species-dependent and can therefore serve as excellent markers for identifying target species. Part of the challenge is to distinguish these features against a time-varying background of other ambient noises. This will be achieved by dividing the recordings into samples of approximately 1 second (around the duration of the sound as the insect flies into the trap), and then adopting a two-fold approach. One strand will optimise filtering methods to pick out the required features in each sample, and develop detection methods based on the relative peak positions and sizes in the filtered spectrum. The other will use deep learning systems based on LSTM networks to distinguish samples containing the target signals; the training procedure and learning rates will be optimised.
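A deep-learning strand of this kind could, for example, take a sequence of short-time spectra from each 1-second sample and pass it through an LSTM. The sketch below uses PyTorch with illustrative layer sizes; the actual architecture, feature dimensions and class names are assumptions, not details specified in the proposal.

```python
import torch
import torch.nn as nn

class WingbeatLSTM(nn.Module):
    """Binary detector: does a ~1 s sample contain the target species?

    Input is a sequence of short-time spectra (time_frames x n_bins),
    e.g. from an STFT of the 1-second chunk. Sizes are illustrative.
    """
    def __init__(self, n_bins=64, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=n_bins, hidden_size=hidden,
                            batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):
        # x: (batch, time_frames, n_bins)
        _, (h_n, _) = self.lstm(x)   # final hidden state per sample
        return torch.sigmoid(self.head(h_n[-1])).squeeze(-1)  # (batch,)
```

Training such a network would typically minimise a binary cross-entropy loss, with the learning rate and other hyperparameters tuned on held-out recordings, in line with the optimisation described above.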

The industrial partner Agrisound Ltd will provide field recording samples and laboratory recordings of target insect species. They will advise on the selection of target species, as well as regarding future incorporation/deployment of these methods in their smart traps. The team at Kingston University London will develop signal processing and target species detection techniques.
