EPIC: An automated diagnostic tool for Potato Late Blight disease detection from images

Lead Research Organisation: Manchester Metropolitan University
Department Name: Sch of Computing, Maths and Digital Tech

Abstract

Crop yields are severely affected by various diseases. It is estimated that almost 25% of crops worldwide are lost to disease, which can cause devastating economic, social and ecological losses. In China, where potato is the fourth most important food crop, yield losses from late blight typically range from 20% to 40% in common years and can reach 50% to 100% in severe cases. The estimated annual economic loss due to this disease in China is around $5 billion. Early, accurate detection and identification of crop diseases therefore plays an important role in effectively controlling and preventing disease for sustainable agriculture and food security.

In our previously funded projects, we developed an innovative automated machine vision system for efficient crop disease diagnosis from images, which has proven the technical feasibility of using advanced image processing, machine learning, mobile and cloud computing approaches.

This project will take that work forward and develop a near-market product ready for commercialisation, providing more accurate, real-time information for crop disease surveillance. The tool runs on mobile devices, so farmers with only basic training can perform disease diagnosis immediately. Compared with current practice based on human visual observation (which is labour-intensive, costly and error-prone), this machine vision system can dramatically speed up diagnosis, give growers more accurate information on which to base their disease control strategies, and prevent crop yields from being reduced by infection. The technology can compensate for the lack of local expertise, make a significant impact on agricultural productivity and farmer incomes, help ensure food security, and deliver highly cost-effective, long-term economic and social impact in China.

To achieve real impact and demonstrable benefits in China, this project will work closely with Chinese partners from academia, industry and farming, including the project partner (Hebei Agriculture University (HEBAU)) and end users (Beijing Mengbangda Biotechnology Co. Ltd (BMB) and Guyuan County Potato Association (GCPA)). They will support the gathering of field data, the setting up of trial systems, domain knowledge input from plant pathologists for local agriculture, fine-tuning of the systems in the field, and the potential commercialisation of this technology in China (initially in Hebei province). The project focuses on three stages of translational/user engagement:
1) System requirement gathering from users;
2) System evaluation with input from users;
3) Potential impact and commercial exploitation with end users.

The tool will initially be deployed in real fields provided by the end users (BMB and GCPA) in Hebei province at the end of the project. This will initially help protect a potato-planting area of over 380,000 mu from infection, while reducing annual fungicide costs and environmental damage.

Other translational activities include the organisation of workshops, conference attendance and paper publication, which will be used for large-scale dissemination and engagement with a wide range of user groups.

This project will not only develop a near-market product but will also generate significant, measurable impact and promote long-term sustainable growth, economic development and welfare in China and beyond.

Technical Summary

This project seeks to exploit our previously funded work, develop a near-market automatic diagnostic machine vision tool for accurate detection of crop diseases using advanced image analysis and cloud/mobile computing approaches, and deliver benefits and impact in China, with an initial focus on potato late blight (the most severe potato disease in China and worldwide). Our cloud-based machine vision system can rapidly identify the disease from smartphone images. Farmers with only basic training can perform diagnosis immediately, with no need for experts in the field. In collaboration with our project partner Hebei Agriculture University (HEBAU) and end users (Beijing Mengbangda Biotechnology Co. Ltd (BMB) and Guyuan County Potato Association (GCPA)), the tool will initially be deployed in Hebei Province, protecting a planting area of over 380,000 mu. This will make a significant impact on agricultural productivity and farmer incomes, and generate robust social and commercial impact in China.

Planned Impact

N/A according to the call document
 
Description This project addresses food security and agricultural productivity. It has developed a near-market, automatic diagnostic machine vision tool for real-time, accurate detection of crop diseases using advanced image pattern recognition/AI and cloud/mobile computing approaches, delivering benefits and impact on agricultural productivity and farmer incomes and generating robust social and commercial impact in China (with an initial focus on potato late blight, the most severe potato disease in China and worldwide).

Thanks to GCRF,
• For the first time, we developed a new near-market machine vision app which can perform on-device, real-time diagnosis of potato diseases with high accuracy. Our product has been tested by our Chinese project partners in real fields on a small scale during the project period and will be tested and evaluated on a large scale in the coming growing seasons. Once it is tested on a large scale, we will release our mobile app to all farmers to generate wide impact.

• We also held two large workshops on "Precision Agriculture: Data Driven Approach to Crop Monitoring and Disease Diagnosis" in China. The workshops brought together a total of 44 attendees, including researchers and practitioners from both industry and academia. We have extended our network and built a number of new contacts.

• Through this project, we established further collaboration with Hebei Agriculture University, the Chinese Academy of Sciences and Langfang Dahuaxia Co. Ltd, and won an international collaborative grant (the BBSRC-funded UK-China Agritech Challenge project, BB/S020969/1).
Exploitation Route The project aims to translate our previously funded innovative research into practice by developing automatic crop disease diagnosis based on big data analytics, machine learning/AI, image processing, parallel processing and cloud computing. We have developed a near-market product (a mobile app) which can provide accurate and immediate diagnosis in real time without needing an internet connection; farmers can use it with only basic training. Please refer to our project website for more detail (https://epic337255300.wordpress.com/). Our product has been tested by our project partners in China on a small scale during the project period and will be tested and evaluated on a large scale by our project partners in the coming growing seasons. Once it is tested on a large scale, we will release our mobile app for all farmers to use. We will continue to work with farmers and companies to conduct field testing and evaluation, exploit potential commercialisation, and deliver significant impact in China. This product is now being tested by farmers in real fields.
Sectors Agriculture, Food and Drink; Communities and Social Services/Policy; Digital/Communication/Information Technologies (including Software); Education; Healthcare; Manufacturing, including Industrial Biotechnology; Transport

URL https://epic337255300.wordpress.com/
 
Description The output of this project has been trialled by real users (farmers). We also held a smart farming technology training workshop. Our outputs/findings have been used for skills training and capacity building (attracting over 120 attendees).
First Year Of Impact 2021
Sector Agriculture, Food and Drink; Digital/Communication/Information Technologies (including Software)
Impact Types Societal

 
Description Smart farming technology training workshop
Geographic Reach Multiple continents/international 
Policy Influence Type Influenced training of practitioners or researchers
 
Description Feasibility study for the development of an innovative Smart-farming app for smallholders in developing countries. (Agriculture-Productivity-Project) 'APP'
Amount £387,222 (GBP)
Funding ID 134039 
Organisation Innovate UK 
Sector Public
Country United Kingdom
Start 07/2020 
End 01/2022
 
Description UK-China Agritech Challenge: CropDoc - Precision Crop Disease Management for Farm Productivity and Food Security
Amount £449,193 (GBP)
Funding ID BB/S020969/1 
Organisation Biotechnology and Biological Sciences Research Council (BBSRC) 
Sector Public
Country United Kingdom
Start 02/2019 
End 02/2022
 
Title A Biologically Interpretable Two-Stage Deep Neural Network (BIT-DNN) for Vegetation Recognition From Hyperspectral Imagery 
Description Spectral-spatial deep learning models have recently proven effective in hyperspectral image (HSI) classification for various earth monitoring applications such as land cover classification and agricultural monitoring. However, due to the "black-box" nature of the model representation, how to explain and interpret the learning process and the model decisions, especially for vegetation classification, remains an open challenge. This study proposes a novel interpretable deep learning model, a biologically interpretable two-stage deep neural network (BIT-DNN), which incorporates a prior-knowledge-based spectral-spatial feature transformation (drawing on the biophysical and biochemical attributes of target entities and their hierarchical structures) into the framework, achieving both high accuracy and interpretability on HSI classification tasks. The model introduces a two-stage feature learning process: in the first stage, an enhanced interpretable feature block extracts the low-level spectral features associated with the biophysical and biochemical attributes of target entities; in the second stage, an interpretable capsule block extracts and encapsulates the high-level joint spectral-spatial features representing the hierarchical structure of those attributes, giving the model improved classification performance and intrinsic interpretability with reduced computational complexity. We tested and evaluated the model on four real HSI datasets for four separate tasks (plant species classification, land cover classification, urban scene recognition and crop disease recognition) and compared it with five state-of-the-art deep learning models. The results demonstrate that the proposed model has competitive advantages in terms of both classification accuracy and model interpretability, especially for vegetation classification. A brief illustrative code sketch of this two-stage design follows this entry.
Type Of Material Technology assay or reagent 
Year Produced 2021 
Provided To Others? Yes  
Impact The project is still ongoing; the impact will be described at a later stage.
URL https://ieeexplore.ieee.org/document/9362293
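The description above outlines a two-stage architecture: an interpretable spectral feature block followed by a capsule block over joint spectral-spatial features. The entry itself contains no code; the PyTorch sketch below is only a minimal illustration of that two-stage idea, and every layer size, class name and hyperparameter in it is a hypothetical choice rather than a value from the published BIT-DNN model.

```python
# Minimal, hypothetical sketch of a two-stage spectral-spatial classifier
# inspired by the BIT-DNN description above (interpretable spectral block +
# capsule block). All sizes and names are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


def squash(s, dim=-1, eps=1e-8):
    """Standard capsule squashing non-linearity."""
    norm_sq = (s ** 2).sum(dim=dim, keepdim=True)
    return (norm_sq / (1.0 + norm_sq)) * s / torch.sqrt(norm_sq + eps)


class TwoStageHSIClassifier(nn.Module):
    def __init__(self, n_bands=100, n_classes=4, n_caps=16, caps_dim=8):
        super().__init__()
        # Stage 1: per-pixel spectral features (1x1 convolutions only),
        # standing in for the "interpretable feature block".
        self.spectral = nn.Sequential(
            nn.Conv2d(n_bands, 64, kernel_size=1), nn.ReLU(),
            nn.Conv2d(64, 32, kernel_size=1), nn.ReLU(),
        )
        # Stage 2: joint spectral-spatial features grouped into capsules,
        # standing in for the "interpretable capsule block".
        self.spatial = nn.Conv2d(32, n_caps * caps_dim, kernel_size=3, padding=1)
        self.n_caps, self.caps_dim = n_caps, caps_dim
        self.classifier = nn.Linear(n_caps * caps_dim, n_classes)

    def forward(self, x):                           # x: (B, n_bands, H, W)
        f = self.spectral(x)                        # (B, 32, H, W)
        c = self.spatial(f)                         # (B, n_caps*caps_dim, H, W)
        c = F.adaptive_avg_pool2d(c, 1).flatten(1)  # (B, n_caps*caps_dim)
        c = squash(c.view(-1, self.n_caps, self.caps_dim), dim=-1)
        return self.classifier(c.flatten(1))        # class logits


if __name__ == "__main__":
    patches = torch.randn(2, 100, 9, 9)             # two dummy 9x9 HSI patches
    print(TwoStageHSIClassifier()(patches).shape)   # torch.Size([2, 4])
```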
 
Title Novel CropdocNet Model for Automated Potato Late Blight Disease Detection from Unmanned Aerial Vehicle-Based Hyperspectral Imagery 
Description This study proposes a novel end-to-end deep learning model (CropdocNet) for accurate and automated late blight disease diagnosis from UAV-based hyperspectral imagery. The method considers the potential disease-specific reflectance radiation variance caused by the canopy's structural diversity and introduces multiple capsule layers to model the part-to-whole relationship between spectral-spatial features and the target classes, representing the rotation invariance of the target classes in the feature space. We evaluated the method with real UAV-based HSI data under controlled and natural field conditions. The effectiveness of the hierarchical features was quantitatively assessed and compared with existing representative machine learning/deep learning methods on both testing and independent datasets. The experimental results show that the proposed model significantly improves accuracy when the hierarchical structure of spectral-spatial features is considered, with average accuracies of 98.09% on the testing dataset and 95.75% on the independent dataset. A brief illustrative code sketch of the capsule routing idea follows this entry.
Type Of Material Technology assay or reagent 
Year Produced 2022 
Provided To Others? Yes  
Impact ongoing 
URL https://www.mdpi.com/2072-4292/14/2/396
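The entry above attributes CropdocNet's performance to capsule layers that model part-to-whole relationships between spectral-spatial features and target classes. Purely as a hedged illustration, the sketch below shows a generic routing-by-agreement capsule layer of that kind; the capsule counts, dimensions and number of routing iterations are assumptions and are not taken from the published CropdocNet architecture.

```python
# Hypothetical sketch of a routing-by-agreement capsule layer, illustrating
# the "part-to-whole" modelling the CropdocNet description refers to.
import torch
import torch.nn as nn
import torch.nn.functional as F


def squash(s, dim=-1, eps=1e-8):
    norm_sq = (s ** 2).sum(dim=dim, keepdim=True)
    return (norm_sq / (1.0 + norm_sq)) * s / torch.sqrt(norm_sq + eps)


class RoutingCapsules(nn.Module):
    """Maps n_in lower-level capsules to n_out class capsules by dynamic routing."""

    def __init__(self, n_in=32, d_in=8, n_out=2, d_out=16, iters=3):
        super().__init__()
        self.iters = iters
        # One transformation matrix per (input capsule, output capsule) pair.
        self.W = nn.Parameter(0.01 * torch.randn(1, n_in, n_out, d_out, d_in))

    def forward(self, u):                              # u: (B, n_in, d_in)
        u = u[:, :, None, :, None]                     # (B, n_in, 1, d_in, 1)
        u_hat = (self.W @ u).squeeze(-1)               # (B, n_in, n_out, d_out)
        b = torch.zeros(u_hat.shape[:3], device=u.device)   # routing logits
        for _ in range(self.iters):
            c = F.softmax(b, dim=2)                    # coupling coefficients
            v = squash((c.unsqueeze(-1) * u_hat).sum(dim=1))  # (B, n_out, d_out)
            b = b + (u_hat * v.unsqueeze(1)).sum(dim=-1)      # agreement update
        # The length of each output capsule acts as the class score.
        return v.norm(dim=-1)                          # (B, n_out)


if __name__ == "__main__":
    primary_caps = torch.randn(4, 32, 8)               # dummy lower-level capsules
    print(RoutingCapsules()(primary_caps).shape)       # torch.Size([4, 2])
```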
 
Title a near-market mobile machine vision tool for crop disease detection 
Description We have developed an intuitive, mobile-first application capable of performing immediate, real-time, on-device disease detection using advanced machine learning/deep learning techniques without an internet connection, and providing treatment options for controlling the disease. Farmers with little to no training can use the application to perform disease diagnosis easily and rapidly in their own fields. This machine vision system can dramatically speed up diagnosis, give growers more accurate information on which to base their disease control strategies, and stop crop yields from being reduced by infection, as well as improving growers' economic gains by allowing more targeted controls and reducing the amount of chemical spray required. Our artificial intelligence models, currently capable of detecting three major potato diseases (late blight, early blight and June disease), are constantly refined using new images acquired from our users, ensuring ongoing accuracy. The result is a near-market mobile machine vision tool that provides rapid, on-device, expert-level disease detection on mobile devices without access to the internet. We tested and evaluated it in real fields in 2020. A brief illustrative code sketch of the on-device inference pattern follows this entry.
Type Of Material Technology assay or reagent 
Year Produced 2018 
Provided To Others? Yes  
Impact For the first time, we developed a new near-market machine vision app which can perform on-device, real-time diagnosis of potato diseases with high accuracy. Our product has been tested by our Chinese project partners in real fields on a small scale during the project period and will be tested and evaluated on a large scale in the coming growing seasons. Once it is tested on a large scale, we will release our mobile app to all farmers to generate wide impact.
URL https://epic337255300.wordpress.com
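The tool itself is a mobile app and no source code is included in this record. Purely as an illustration of the on-device inference pattern described above (a compact model bundled with the app and run locally, so no internet connection is needed at diagnosis time), the Python sketch below uses the TensorFlow Lite interpreter; the model file name, input preprocessing and label set are hypothetical assumptions, not artefacts of the actual app.

```python
# Hypothetical sketch of on-device inference with a bundled TFLite model.
# The model file, labels and preprocessing are illustrative assumptions only.
import numpy as np
import tensorflow as tf
from PIL import Image

LABELS = ["healthy", "late_blight", "early_blight", "june_disease"]  # hypothetical

# Load a (hypothetical) converted TensorFlow Lite model from local storage.
interpreter = tf.lite.Interpreter(model_path="potato_disease.tflite")
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]


def diagnose(image_path):
    """Classify a leaf photo entirely on-device (no network calls)."""
    h, w = inp["shape"][1], inp["shape"][2]
    img = Image.open(image_path).convert("RGB").resize((w, h))
    x = np.expand_dims(np.asarray(img, dtype=np.float32) / 255.0, axis=0)
    interpreter.set_tensor(inp["index"], x)
    interpreter.invoke()
    scores = interpreter.get_tensor(out["index"])[0]
    return LABELS[int(np.argmax(scores))], float(np.max(scores))


if __name__ == "__main__":
    print(diagnose("leaf_photo.jpg"))   # hypothetical local image path
```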
 
Title A Biologically Interpretable Two-stage Deep Neural Network (BIT-DNN) For Vegetation Recognition From Hyperspectral Imagery 
Description Spectral-spatial deep learning models have recently proven effective in hyperspectral image (HSI) classification for various earth monitoring applications such as land cover classification and agricultural monitoring. However, due to the "black-box" nature of the model representation, how to explain and interpret the learning process and the model decisions, especially for vegetation classification, remains an open challenge. This study proposes a novel interpretable deep learning model, a biologically interpretable two-stage deep neural network (BIT-DNN), which incorporates a prior-knowledge-based spectral-spatial feature transformation (drawing on the biophysical and biochemical attributes of target entities and their hierarchical structures) into the framework, achieving both high accuracy and interpretability on HSI classification tasks. The model introduces a two-stage feature learning process: in the first stage, an enhanced interpretable feature block extracts the low-level spectral features associated with the biophysical and biochemical attributes of target entities; in the second stage, an interpretable capsule block extracts and encapsulates the high-level joint spectral-spatial features representing the hierarchical structure of those attributes, giving the model improved classification performance and intrinsic interpretability with reduced computational complexity. We tested and evaluated the model on four real HSI datasets for four separate tasks (plant species classification, land cover classification, urban scene recognition and crop disease recognition) and compared it with five state-of-the-art deep learning models. The results demonstrate that the proposed model has competitive advantages in terms of both classification accuracy and model interpretability, especially for vegetation classification.
Type Of Material Computer model/algorithm 
Year Produced 2021 
Provided To Others? Yes  
Impact this project is still ongoing 
URL https://ieeexplore.ieee.org/document/9362293
 
Title a novel end-to-end deep learning model (CropdocNet) 
Description A novel end-to-end deep learning model (CropdocNet) for accurate and automated late blight disease diagnosis from UAV-based hyperspectral imagery. The method considers the potential disease-specific reflectance radiation variance caused by the canopy's structural diversity and introduces multiple capsule layers to model the part-to-whole relationship between spectral-spatial features and the target classes, representing the rotation invariance of the target classes in the feature space. We evaluated the method with real UAV-based HSI data under controlled and natural field conditions. The effectiveness of the hierarchical features was quantitatively assessed and compared with existing representative machine learning/deep learning methods on both testing and independent datasets. The experimental results show that the proposed model significantly improves accuracy when the hierarchical structure of spectral-spatial features is considered, with average accuracies of 98.09% on the testing dataset and 95.75% on the independent dataset.
Type Of Material Data analysis technique 
Year Produced 2022 
Provided To Others? Yes  
Impact ongoing 
URL https://www.mdpi.com/2072-4292/14/2/396
 
Description Academic partnership 
Organisation Agricultural University of Hebei
Country China 
Sector Academic/University 
PI Contribution We developed a near-market mobile app for potato disease detection that provides rapid, on-device, expert-level disease detection on mobile devices without access to the internet, delivering impact and benefits in China.
Collaborator Contribution Field data collection, testing and evaluation.
Impact Please refer to other sections for details. The collaboration is multi-disciplinary.
Start Year 2018
 
Description Academic partnership 
Organisation University of Chinese Academy of Sciences
Department Institute of Remote Sensing and Digital Earth
Country China 
Sector Charity/Non Profit 
PI Contribution We worked together to secure a collaborative research project.
Collaborator Contribution We worked together to secure a collaborative research project.
Impact still active
Start Year 2019
 
Title a mobile-based machine vision tool 
Description We have developed a mobile-based machine vision tool which can automatically identify potato late blight disease in real time. Detection runs on-device, without needing an internet connection, and users with only basic training can use it easily. Currently, this product is being tested in real fields.
Type Of Technology Software 
Year Produced 2019 
Impact For the first time, we developed a new near-market machine vision app which can perform on-device, real-time diagnosis of potato diseases with high accuracy. Our product has been tested by our Chinese project partners in real fields on a small scale during the project period and will be tested and evaluated on a large scale in the coming growing seasons. Once it is tested on a large scale, we will release our mobile app to all farmers to generate wide impact.
URL https://epic337255300.wordpress.com/our-product/
 
Title a novel end-to-end deep learning model (CropdocNet) for accurate and automated late blight disease diagnosis from UAV-based hyperspectral imagery 
Description A novel end-to-end deep learning model (CropdocNet), implemented as software, for accurate and automated late blight disease diagnosis from UAV-based hyperspectral imagery.
Type Of Technology New/Improved Technique/Technology 
Year Produced 2021 
Impact ongoing 
 
Description An invited article published in New Food Magazine: "Could AI ease food security fears?", New Food, 15/2/2021 
Form Of Engagement Activity A magazine, newsletter or online publication
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Media (as a channel to the public)
Results and Impact Prof. Liangxiu Han published an invited article in New Food Magazine: "Could AI ease food security fears?", New Food, 15/2/2021. https://www.newfoodmagazine.com/article/138114/ai-and-food-security/
Year(s) Of Engagement Activity 2021
URL https://www.newfoodmagazine.com/article/138114/ai-and-food-security/
 
Description An invited talk of "Precision Agriculture: A Big Data Driven, AI enabled Approach to Crop Disease Diagnosis" 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact Prof. Liangxiu Han was invited to give a talk, "Precision Agriculture: A Big Data Driven, AI enabled Approach to Crop Disease Diagnosis", at The 2nd Crop Pest and Diseases (P&D) Remote Sensing Conference, China, on 30 August 2021.
Year(s) Of Engagement Activity 2021
 
Description An invited talk on International workshop cum training programme titled "Green Growth Strategies for Climate Resilience and Disaster Risk Reduction (DRR): Policies, Pathways and Tools" 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact Prof. Liangxiu Han was invited to give a talk at the international workshop-cum-training programme "Green Growth Strategies for Climate Resilience and Disaster Risk Reduction (DRR): Policies, Pathways and Tools", held from 26th to 28th November 2020 and organised by the Centre for Ecological Economics and Natural Resources (CEENR), Institute for Social and Economic Change (ISEC), Bangalore, and the National Institute of Disaster Management (NIDM), Ministry of Home Affairs, New Delhi, India. http://www.isec.ac.in/2_Final%20Flyer%20With%20Programme%20Schedule.pdf
Year(s) Of Engagement Activity 2020
URL http://www.isec.ac.in/2_Final%20Flyer%20With%20Programme%20Schedule.pdf
 
Description Article in food magazine: "Could AI ease food security fears?" 
Form Of Engagement Activity A magazine, newsletter or online publication
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact Published an article in New Food Magazine:
"Could AI ease food security fears?", New food Magazine, 15/2/2021 https://www.newfoodmagazine.com/article/138114/ai-and-food-security/
Year(s) Of Engagement Activity 2021
URL https://www.newfoodmagazine.com/article/138114/ai-and-food-security/
 
Description British Science Week 2023 at Manchester Museum 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Schools
Results and Impact We will showcase our research work, AI for precision agriculture, at British Science Week 2023 at Manchester Museum. Around 500-600 pupils are expected to attend the event.
Year(s) Of Engagement Activity 2023
 
Description Keynote speaker for IEEE PES Women in Power and IEEE Women in Engineering 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact Keynote speaker, "Meeting Societal Challenges: Big Data Driven AI-enabled Approaches", at The Power of Data: Data Science Meeting organised by IEEE PES Women in Power and IEEE Women in Engineering. On 23rd June 2021, IEEE PES Women in Power and IEEE Women in Engineering in UK & Ireland celebrated International Women in Engineering Day with this engaging webinar. https://www.ieee-ukandireland.org/watch-again-the-power-of-data-data-science-meeting/
Year(s) Of Engagement Activity 2021
 
Description The International Workshop on "Precision Agriculture: Data Driven Approach to Crop Monitoring and Disease Diagnosis" 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Professional Practitioners
Results and Impact The workshop brought together researchers and practitioners from academia and industry in computer science, plant science and remote sensing. The workshop had several aims: 1) to showcase the novel precision agriculture solution, a data-driven decision support platform for crop disease detection using multiple data sources (such as images from satellite, drone and smartphone), developed by Prof. Liangxiu Han and her team; 2) to discuss and exchange ideas with the attendees, including researchers from academia and practitioners from industry, and to understand their needs; 3) to establish long-term collaboration with partners from industry and academia in China, exploit potential commercialisation opportunities for crop disease management, and deliver highly cost-effective, long-term economic and social impact in China and beyond.

The workshop included two sessions, a technical session and a networking session:
• In the technical session, there were 12 invited speakers. The PI (Prof. Han) and the collaborator (Prof. Hu) introduced the research strengths of the UK and China sides, paying particular attention to the automatic crop disease diagnostic machine vision system developed at Manchester Metropolitan University in the UK using big-data-driven approaches and to the Chinese partner's late blight DSS. The other speakers from industry and academia gave talks on crop disease and monitoring, the need for data-driven approaches, and how data-driven approaches can be applied to agriculture.
• In the networking session, we discussed potential funding opportunities and potential commercialisation.

There were 27 participants representing organisations from both industry and academia.
Companies:
1) SnowVally Agriculture group,
2) Sinochem Corporation,
3) Beijing Mengbangda Biotechnology Co. Ltd,
4) Hebei jiu'en agriculture co. Ltd
5) Langfang City Dahuaxia Shennong Information Technology Co., Ltd.
Research institutions:
6) Institute of Remote Sensing and Digital Earth, Chinese Academy of Sciences,
7) National Engineering Research Center for Information Technology in Agriculture in China
8) Key Laboratory for Information Technologies in Agriculture, Ministry of Agriculture
9) Anhui Zhongke Intelligent Sense and Big Data Industrial Technology Research Institute.
The workshop was very successful and productive.
Year(s) Of Engagement Activity 2018,2019
URL https://epic337255300.wordpress.com/category/dissemination-activities/
 
Description a workshop on "Precision Agriculture: Data Driven Approach to Crop Monitoring and Disease Diagnosis" 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact The workshop brought together researchers and practitioners from both industry and academia with backgrounds in plant science and computer science. There were 20 registrations and 17 actual attendees. The workshop included two sessions, a technical session and a networking session:
• In the technical session, there were 11 invited speakers. The PI (Prof. Han) and the collaborator (Prof. Hu) introduced the research strengths of the UK and China sides, paying particular attention to the automatic crop disease diagnostic machine vision system developed at Manchester Metropolitan University in the UK and to the Chinese partner's late blight DSS. The other speakers from industry and academia gave talks on crop disease and monitoring and how data-driven approaches can be applied to agriculture.
• In the networking session, we discussed potential funding opportunities and potential proposal ideas.
The outcomes of this workshop: 1) we presented our advanced technologies for automatic crop disease detection using image processing, machine learning and cloud computing, attracting a lot of attention from industry and academia; 2) we established a new partnership with HEBAU in China and a network of attendees from leading Chinese organisations working on data-driven approaches to crop monitoring and disease diagnosis; 3) we are currently in discussion with Hebei Agriculture University about potential collaboration on a student and staff exchange programme;
4) all of the above will help generate and promote long-term sustainable economic growth and welfare in China.
Year(s) Of Engagement Activity 2018,2019
URL https://epic337255300.wordpress.com/category/dissemination-activities/
 
Description An invited panel speech, "The Future of Digital Twins: The Role of Big Data/Artificial Intelligence" 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact Prof. Liangxiu Han was invited to give a panel speech, "The Future of Digital Twins: The Role of Big Data/Artificial Intelligence", at the 18th IEEE International Symposium on Parallel and Distributed Processing with Applications (ISPA-2020), the 10th IEEE International Conference on Big Data and Cloud Computing (BDCloud-2020), the 13th IEEE International Symposium on Social Computing and Networking (SocialCom-2020), and the 10th IEEE International Conference on Sustainable Computing and Communications (SustainCom-2020), 17-19 December 2020.
Year(s) Of Engagement Activity 2020
URL https://hpcn.exeter.ac.uk/socialcom2020/