Analysing Videos of Fish in the Field

Lead Research Organisation: University of East Anglia
Department Name: Computing Sciences

Abstract

The Colour & Imaging Lab at UEA has been conducting research on the automatic analysis of videos captured on fishing vessels equipped with Catch Quota Monitoring Systems (CQMS). A view of the conveyor belt bringing fish to the sorting and discard areas is critical to any CQMS. Manually reviewing the video footage to classify, count and measure the catch is a tedious and costly task that represents a bottleneck in CQMS. Our past and current research focuses on developing computer vision algorithms to automate this process.
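As an illustration of the kind of processing involved (not the project's actual algorithm), the sketch below shows how a fixed-camera belt view could be segmented into candidate fish regions using classical background subtraction in OpenCV; the file name, kernel size and minimum blob area are hypothetical placeholders.

import cv2

# Model the static belt as background; moving fish become foreground.
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16)
capture = cv2.VideoCapture("belt_footage.mp4")  # hypothetical CQMS clip

while True:
    ok, frame = capture.read()
    if not ok:
        break
    mask = subtractor.apply(frame)
    # Remove speckle noise before extracting connected regions.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for contour in contours:
        if cv2.contourArea(contour) > 2000:  # placeholder minimum blob size
            x, y, w, h = cv2.boundingRect(contour)
            # With known belt geometry and camera position, the pixel extent
            # (w, h) could be mapped to an approximate fish length.
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)

capture.release()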

In this project we propose to take this research a step further. In the scenario described above, although fish are imaged in a very challenging environment, the fixed geometry of the belt, the fixed camera position and relatively stable indoor lighting provide constraints that can be exploited in subsequent video processing. Here, we propose to analyse videos captured in significantly less constrained environments. A typical scenario would involve video analysis of fish in an outdoor setting such as a port, a market, or a sorting table on board a small fishing vessel. In contrast to the previous scenario, where fixed cameras with known positions are available, here videos would often need to be captured by handheld devices, typically mobile phones. One of the major challenges to be faced is varying illumination. To succeed in this project, we will build on the most recent research in areas including depth sensing, photogrammetry, image fusion and machine (deep) learning.
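One simple, classical baseline for the illumination problem is per-frame colour normalisation; the grey-world sketch below is only an illustrative assumption about the kind of preprocessing such footage might receive, not a method taken from the project itself.

import numpy as np

def grey_world_normalise(frame_bgr: np.ndarray) -> np.ndarray:
    # Scale each colour channel so its mean matches the overall mean
    # intensity, reducing colour casts caused by changing outdoor light.
    frame = frame_bgr.astype(np.float32)
    channel_means = frame.reshape(-1, 3).mean(axis=0)
    gains = channel_means.mean() / (channel_means + 1e-6)
    return np.clip(frame * gains, 0, 255).astype(np.uint8)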

Successful deep learning algorithms often require large datasets of images with ground-truth annotations - here, these would be fish species as well as weight and length annotations. These requirements are often so prohibitive that development of such algorithms must be initiated using synthetic data and later improved using captured data annotated by human experts. In this project, we plan to exploit this approach.
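A minimal sketch of this two-stage strategy, assuming a PyTorch species classifier and using random tensors as stand-ins for the synthetic and expert-annotated datasets (the class count, architecture and hyperparameters are all hypothetical):

import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset
from torchvision import models

NUM_SPECIES = 10  # placeholder number of fish species

# Stand-in datasets of random tensors; in practice these would be rendered
# synthetic fish images and expert-annotated real images respectively.
synthetic_dataset = TensorDataset(torch.randn(256, 3, 224, 224),
                                  torch.randint(0, NUM_SPECIES, (256,)))
real_dataset = TensorDataset(torch.randn(64, 3, 224, 224),
                             torch.randint(0, NUM_SPECIES, (64,)))

model = models.resnet18(weights=None)
model.fc = nn.Linear(model.fc.in_features, NUM_SPECIES)

def run_epochs(model, loader, epochs, lr):
    optimiser = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for images, labels in loader:
            optimiser.zero_grad()
            loss = loss_fn(model(images), labels)
            loss.backward()
            optimiser.step()

# Stage 1: initialise the network on the large synthetic dataset.
run_epochs(model, DataLoader(synthetic_dataset, batch_size=32, shuffle=True),
           epochs=2, lr=1e-3)
# Stage 2: fine-tune on the smaller expert-annotated dataset at a lower
# learning rate so the synthetic initialisation is refined, not overwritten.
run_epochs(model, DataLoader(real_dataset, batch_size=32, shuffle=True),
           epochs=2, lr=1e-4)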

The student will also look at other scenarios of fish video analysis as the need arises, e.g. systems that could be installed in fish slot passes or similar, depending on industrial sponsor interests.

The student will be part of the Colour & Imaging Lab at the School of Computing Sciences, UEA. The Colour & Imaging Lab comprises three faculty members and approximately a dozen researchers and PhD students. Notably, it has been involved in numerous industry collaborations, including in the Agritech sector, and has produced two successful spin-out companies.

UEA and the project's industrial partner Cefas are currently involved in a related project, the H2020 Smartfish project (smartfishh2020.eu), in which a large collection of data has been and is being gathered. This data will be available to the student, which will minimise the risks associated with the data collection stage of the project.

Planned Impact

The proposed CDT provides a unique vision of advanced RAS technologies embedded throughout the food supply chain, training the next generation of specialists and leaders in agri-food robotics and providing the underpinning research for the next generation of food production systems. These systems in turn will support the sustainable intensification of food production, the national agri-food industry, the environment, food quality and health.

RAS technologies are transforming global industries, creating new business opportunities and driving productivity across multiple sectors. The Agri-Food sector is the largest manufacturing sector in both the UK and the global economy. The UK food chain has a GVA of £108bn and employs 3.6m people. It is fundamentally challenged by global population growth, demographic changes, political pressures affecting migration and environmental impacts. In addition, agriculture has the lowest productivity of all industrial sectors (ONS, 2017). However, many RAS technologies are in their infancy - developing them within the agri-food sector will deliver impact but also provide a challenging environment that will significantly push the state of the art in the underpinning RAS science. Although the opportunity for RAS is widely acknowledged, a shortage of trained engineers and specialists has limited the delivery of impact. The proposed CDT directly addresses this need and will produce the largest global cohort of RAS specialists in Agri-Food.

The impacts are multiple and include:

1) Impact on RAS technology. The Agri-Food sector provides an ideal test bed to develop multiple technologies that will have application in many industrial sectors and research domains. These include new approaches to autonomy and navigation in field environments; complex picking, grasping and manipulation; and novel applications of machine learning and AI in critical and essential sectors of the world economy.

2) Economic Impact. In the UK alone, the Made Smarter Review (2017) estimates that automation and RAS will create £183bn of GVA over the next decade, £58bn of which will come from increased technology exports and the reshoring of manufacturing. Expected impacts within Agri-Food are demonstrated by the £3.0M of industry support, including the world's largest agricultural engineering company (John Deere), the multinational Syngenta, one of the world's largest robotics manufacturers (ABB), the UK's largest farming company, owned by James Dyson (one of the largest private investors in robotics), the UK's largest salads and fruit producer, and multiple SME RAS companies. These partners recognise the potential of and need for RAS (see NFU and IAgrE Letters of Support).

3) Societal impact. Following the EU referendum, there is significant uncertainty that seasonal labour employed in the sector will be available going forwards, while the demographics of an aging population further limits the supply of manual labour. We see robotic automation as a means of performing onerous and difficult jobs in adverse environments, while advancing the UK skills base, enabling human jobs to move up the value chain and attracting skilled workers and graduates to Agri-Food.

4) Diversity impact. Gender under-representation is also a concern across the computer science, engineering and technology sectors, with only 15% of undergraduates being female. Through engagement with the EPSRC ASPIRE (Advanced Strategic Platform for Inclusive Research Environments) programme, AgriFoRwArdS will become an exemplar CDT with an EDI impact framework that is transferable to other CDTs.

5) Environmental Impact. The Agri-Food sector accounts for 13% of UK carbon emissions and 70% of fresh water use, while diffuse pollution from fertilisers and pesticides causes environmental damage. RAS technology, such as robotic weeders and field robots with advanced sensors, will enable a paradigm shift in precision agriculture that will sustainably intensify production while minimising environmental impacts.

Publications


Studentship Projects

Project Reference   Relationship   Related To     Start        End          Student Name
EP/S023917/1                                      01/04/2019   30/09/2031
2458395             Studentship    EP/S023917/1   01/10/2020   30/09/2024   Mazvydas Gudelis