UK-China Agritech Challenge: CropDoc - Precision Crop Disease Management for Farm Productivity and Food Security

Lead Research Organisation: Manchester Metropolitan University
Department Name: Sch of Computing, Maths and Digital Tech

Abstract

CropDoc seeks to exploit existing research on potato disease identification and outbreak management in the domains of precision agriculture, agricultural digitisation and decision management support. It will harness cutting-edge technologies (IoT, mobile devices, crowdsourced data, big data analytics and cloud computing) to build a decision support system that generates insight from multiple data sources, collected through remote sensing above the fields and IoT ground sensing within them, for real-time monitoring and prediction of disease. CropDoc will base its data service and analytics platform on open standards and will enable interoperability through open APIs. This will allow an end-platform ecosystem to emerge that consumes the data and analytics service, letting farmers use their platform of choice while enabling central authorities to identify and manage sector-wide outbreaks.

The initial focus will be on potato late blight, one of the most devastating crop diseases in China. In a typical blight-pressure season, crop protection chemicals cost the industry an estimated $10-20bn per annum. Late blight has been referred to as a 'community disease' because of its ability to spread rapidly from field to field under the right weather conditions. Asexual spores travel easily on the wind when the weather is cool and moist, and can rapidly infect neighbouring fields. As such, understanding the symptoms of the disease and what to do when it is detected is essential to preventing an outbreak from rapidly turning into an epidemic.
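The weather dependence described above underpins classical late blight forecasting rules. As an illustration only, the sketch below implements a Smith-Period-style rule (two consecutive days with minimum temperature at or above 10 °C and at least 11 hours of relative humidity at or above 90%); the thresholds follow the classic UK criteria and are not necessarily the risk model CropDoc itself uses.

```python
# Illustrative Smith-Period-style late blight risk rule (assumed
# thresholds; CropDoc's actual risk model may differ).

def humid_hours(hourly_rh, threshold=90.0):
    """Count hours in a day with relative humidity at or above threshold."""
    return sum(1 for rh in hourly_rh if rh >= threshold)

def smith_period(days):
    """days: consecutive (min_temp_c, hourly_rh_list) tuples.

    Returns True if any two consecutive days each have a minimum
    temperature >= 10 C and at least 11 hours of RH >= 90%.
    """
    def risky(day):
        min_temp, hourly_rh = day
        return min_temp >= 10.0 and humid_hours(hourly_rh) >= 11
    return any(risky(a) and risky(b) for a, b in zip(days, days[1:]))

# Example: two warm, humid days in a row trigger a blight alert.
warm_humid = (12.5, [95.0] * 12 + [80.0] * 12)   # 12 humid hours
cool_dry = (8.0, [70.0] * 24)
print(smith_period([cool_dry, warm_humid, warm_humid]))  # True
```

A real decision support system would feed such a rule with in-field IoT weather observations rather than hand-built tuples.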

Planned Impact

The potential beneficiaries can be broadly classified into two groups: specific users and wider users.

1.1 Specific users
1) Our collaborators in both China and the UK: plant disease control and management agencies, pathologists, and farmers/home growers. They will benefit from the project by using our decision support system to assist in the diagnosis of plant diseases and in outbreak management.
2) Researchers from Bioscience, Remote Sensing, and Computer Science. This project will advance new practices for such researchers by delivering an ICT-based solution using data-driven approaches (new algorithms and software). These new algorithms can be applied and adapted to other bioscience-related problems (e.g. not just crop diseases but other plant diseases too). The project will complement and strengthen current research activities in Food Security and Traceability at both the host and partner institutions, and will contribute to the international standing of UK research in this area and beyond. Most importantly, the outputs of this project will provide a solid basis for attracting further funding from different sources (e.g. Horizon 2020).
3) Skilled researchers, fluent in crop disease detection, image processing, remote sensing, machine learning, and big data processing and analytics, will be trained through the unique interdisciplinary nature of this project, which brings together researchers from Computer Science and Bioscience.
4) Research Councils and policy makers, who will be able to draw on the outcomes of the project to inform policy development and to advise decision makers on appropriate strategies for future investment in enhancing plant disease management and improving food safety and traceability.
5) Industry practitioners/farm companies that provide solutions for plant disease control and prevention. Our system could potentially be commercialised and transformed into new products for disease diagnosis and prevention. Additionally, our platform is a standards-based, open platform that can easily be plugged into existing plant disease control and management systems.
6) Users from the education sector. Students and lecturers in Bioscience, Remote Sensing, and Computer Science will be able to use the algorithms and the user-friendly tool for learning and teaching on topics related to plant health science and image pattern recognition.

1.2 Wider users
We anticipate that the project will generate wider user interest, in particular from the general public. Our ICT-enabled intelligent solution will increase operability, measurability and visibility for the general public, who have little knowledge of crop diseases. One potential impact of this research will be to educate broader communities, raise public awareness and understanding of plant pathology, and support the future of the discipline.

We will undertake a range of impact activities to ensure all potential beneficiaries have the opportunity to benefit from this research, including:
1) The dissemination of project deliverables and software through the project website hosted at MMU, and research publications in prestigious journals and at appropriate conferences (e.g. IEEE Transactions on Automation Science and Engineering, Precision Agriculture, The British Society for Plant Pathology, etc.).
2) Seminars and workshops for researchers across multiple disciplines, end users and industry partners.
3) Public engagement activities for general audiences, by distributing flyers and posters and participating in outreach activities (e.g. Science Festivals).
4) Dissemination through both internal publications (e.g. ManMetLife) and external publications to gain wider coverage of the project.
 
Description 1. We have developed deep learning methods based on UAV imagery that can be used for potato disease detection in this project.
2. We have developed a biologically interpretable two-stage deep neural network (BIT-DNN) for vegetation recognition from hyperspectral imagery, which incorporates a prior-knowledge-based spectral-spatial feature transformation (drawing on the biophysical and biochemical attributes of target entities and their hierarchical structures) into the framework, achieving both high accuracy and interpretability on HSI-based classification tasks. This work has been published in the leading journal IEEE Transactions on Geoscience and Remote Sensing, doi: 10.1109/TGRS.2021.3058782.
Exploitation Route Our project outcomes could be taken forward by:
1) Our collaborators in both China and the UK: plant disease control and management agencies, pathologists, and farmers/home growers. They will benefit from the project by using our decision support system to assist in the diagnosis of plant diseases and in outbreak management.
2) Researchers from Bioscience, Remote Sensing, and Computer Science. This project will advance new practices for such researchers by delivering an ICT-based solution using data-driven approaches (new algorithms and software). These new algorithms can be applied and adapted to other bioscience-related problems (e.g. not just crop diseases but other plant diseases too). The project will complement and strengthen current research activities in Food Security and Traceability at both the host and partner institutions, and will contribute to the international standing of UK research in this area and beyond. Most importantly, the outputs of this project will provide a solid basis for attracting further funding from different sources (e.g. Horizon Europe).
3) Skilled researchers, fluent in crop disease detection, image processing, remote sensing, machine learning, and big data processing and analytics, will be trained through the unique interdisciplinary nature of this project, which brings together researchers from Computer Science and Bioscience.
4) Research Councils and policy makers, who will be able to draw on the outcomes of the project to inform policy development and to advise decision makers on appropriate strategies for future investment in enhancing plant disease management and improving food safety and traceability.
5) Industry practitioners/farm companies that provide solutions for plant disease control and prevention. Our system could potentially be commercialised and transformed into new products for disease diagnosis and prevention. Additionally, our platform is a standards-based, open platform that can easily be plugged into existing plant disease control and management systems.
6) Users from the education sector. Students and lecturers in Bioscience, Remote Sensing, and Computer Science will be able to use the algorithms and the user-friendly tool for learning and teaching on topics related to plant health science and image pattern recognition.
Sectors Agriculture, Food and Drink; Digital/Communication/Information Technologies (including Software); Environment

URL https://www.mdpi.com/2072-4292/14/2/396
 
Description We held a smart farming technology training workshop. Our research findings have been used for skills training and capacity building in China (over 120 attendees).
First Year Of Impact 2022
Sector Agriculture, Food and Drink
 
Description Smart farming technology training workshop
Geographic Reach Multiple continents/international 
Policy Influence Type Influenced training of practitioners or researchers
 
Description Feasibility study for the development of an innovative Smart-farming app for smallholders in developing countries. (Agriculture-Productivity-Project) 'APP'
Amount £387,222 (GBP)
Funding ID 134039 
Organisation Innovate UK 
Sector Public
Country United Kingdom
Start 07/2020 
End 01/2022
 
Description UK-China Agritech Challenge: CropDoc - Precision Crop Disease Management for Farm Productivity and Food Security
Amount £449,193 (GBP)
Funding ID BB/S020969/1 
Organisation Biotechnology and Biological Sciences Research Council (BBSRC) 
Sector Public
Country United Kingdom
Start 02/2019 
End 02/2022
 
Title A Biologically Interpretable Two-Stage Deep Neural Network (BIT-DNN) for Vegetation Recognition From Hyperspectral Imagery 
Description Spectral-spatial deep learning models have recently proven effective in hyperspectral image (HSI) classification for various earth monitoring applications, such as land cover classification and agricultural monitoring. However, owing to the "black-box" nature of the model representation, how to explain and interpret the learning process and the model decisions, especially for vegetation classification, remains an open challenge. This study proposes a novel interpretable deep learning model, a biologically interpretable two-stage deep neural network (BIT-DNN), which incorporates a prior-knowledge-based spectral-spatial feature transformation (drawing on the biophysical and biochemical attributes of target entities and their hierarchical structures) into the framework, achieving both high accuracy and interpretability on HSI-based classification tasks. The proposed model introduces a two-stage feature learning process: in the first stage, an enhanced interpretable feature block extracts the low-level spectral features associated with the biophysical and biochemical attributes of target entities; in the second stage, an interpretable capsule block extracts and encapsulates the high-level joint spectral-spatial features representing the hierarchical structure of those attributes, giving the model improved classification performance and intrinsic interpretability with reduced computational complexity. We have tested and evaluated the model on four real HSI datasets across four separate tasks (plant species classification, land cover classification, urban scene recognition, and crop disease recognition) and compared it with five state-of-the-art deep learning models.
The results demonstrate that the proposed model has competitive advantages in terms of both classification accuracy and model interpretability, especially for vegetation classification.
Type Of Material Technology assay or reagent 
Year Produced 2021 
Provided To Others? Yes  
Impact The project is still ongoing; its impact will be described at a later stage.
URL https://ieeexplore.ieee.org/document/9362293
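The two-stage idea described above (interpretable low-level spectral features, then joint spectral-spatial features feeding a classifier) can be sketched in miniature as follows. The band groupings, the spatial pooling, and the linear head are hypothetical placeholders for illustration only; they are not the published BIT-DNN implementation, which uses learned feature blocks and capsule layers.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy hyperspectral patch: 5x5 pixels, 100 spectral bands.
patch = rng.random((5, 5, 100))

# Stage 1: interpretable low-level spectral features. Each feature is
# tied to a (hypothetical) biochemical band group, standing in for the
# prior-knowledge-based transform described in the paper.
band_groups = {"chlorophyll": slice(20, 40),
               "water": slice(60, 80),
               "structure": slice(0, 20)}
low_level = np.stack([patch[:, :, s].mean(axis=2)
                      for s in band_groups.values()], axis=-1)  # (5, 5, 3)

# Stage 2: joint spectral-spatial features. Here we simply pool each
# interpretable channel over the spatial neighbourhood; the real model
# uses a capsule block to encode the hierarchical structure.
spectral_spatial = low_level.mean(axis=(0, 1))                  # (3,)

# Linear classifier head over the interpretable features (4 classes).
W = rng.random((3, 4))
scores = spectral_spatial @ W
print("predicted class:", int(scores.argmax()))
```

Because each intermediate feature maps to a named attribute group, a prediction can be traced back to the band ranges that drove it, which is the interpretability property the full model formalises.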
 
Title Novel CropdocNet Model for Automated Potato Late Blight Disease Detection from Unmanned Aerial Vehicle-Based Hyperspectral Imagery 
Description This study proposes a novel end-to-end deep learning model (CropdocNet) for accurate and automated late blight disease diagnosis from UAV-based hyperspectral imagery. The proposed method accounts for the potential disease-specific reflectance radiation variance caused by the canopy's structural diversity, and introduces multiple capsule layers to model the part-to-whole relationship between spectral-spatial features and the target classes, representing the rotation invariance of the target classes in the feature space. We evaluate the proposed method with real UAV-based HSI data under controlled and natural field conditions. The effectiveness of the hierarchical features is quantitatively assessed and compared with existing representative machine learning/deep learning methods on both testing and independent datasets. The experimental results show that the proposed model significantly improves accuracy when the hierarchical structure of spectral-spatial features is considered, with average accuracies of 98.09% on the testing dataset and 95.75% on the independent dataset.
Type Of Material Technology assay or reagent 
Year Produced 2022 
Provided To Others? Yes  
Impact ongoing 
URL https://www.mdpi.com/2072-4292/14/2/396
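The capsule-style part-to-whole voting that the description above refers to can be sketched roughly as follows. The dimensions, the uniform coupling coefficients, and the single routing step are simplifications for illustration; the published CropdocNet uses learned weights and iterative routing within a full network.

```python
import numpy as np

def squash(v, axis=-1, eps=1e-9):
    """Capsule squashing non-linearity: keeps direction, bounds length < 1."""
    norm2 = np.sum(v * v, axis=axis, keepdims=True)
    return (norm2 / (1.0 + norm2)) * v / np.sqrt(norm2 + eps)

rng = np.random.default_rng(1)
n_parts, n_classes, d_in, d_out = 8, 2, 4, 6

# Low-level "part" capsules (e.g. local spectral-spatial features).
parts = rng.standard_normal((n_parts, d_in))
W = rng.standard_normal((n_parts, n_classes, d_in, d_out))

# Each part capsule "votes" for every class capsule's pose.
votes = np.einsum('pi,pcij->pcj', parts, W)          # (parts, classes, d_out)

# One routing-by-agreement step with uniform coupling, then squash.
coupling = np.full((n_parts, n_classes, 1), 1.0 / n_classes)
class_caps = squash((coupling * votes).sum(axis=0))  # (classes, d_out)

# A class capsule's length acts as the class score
# (e.g. healthy vs late blight).
lengths = np.linalg.norm(class_caps, axis=-1)
print("class scores:", lengths)
```

Because the class capsule is an agreement among part votes rather than a plain weighted sum, the representation is more tolerant of pose changes in the input, which is the rotation-invariance property the description highlights.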
 
Title A Biologically Interpretable Two-stage Deep Neural Network (BIT-DNN) For Vegetation Recognition From Hyperspectral Imagery 
Description Spectral-spatial deep learning models have recently proven effective in hyperspectral image (HSI) classification for various earth monitoring applications, such as land cover classification and agricultural monitoring. However, owing to the "black-box" nature of the model representation, how to explain and interpret the learning process and the model decisions, especially for vegetation classification, remains an open challenge. This study proposes a novel interpretable deep learning model, a biologically interpretable two-stage deep neural network (BIT-DNN), which incorporates a prior-knowledge-based spectral-spatial feature transformation (drawing on the biophysical and biochemical attributes of target entities and their hierarchical structures) into the framework, achieving both high accuracy and interpretability on HSI-based classification tasks. The proposed model introduces a two-stage feature learning process: in the first stage, an enhanced interpretable feature block extracts the low-level spectral features associated with the biophysical and biochemical attributes of target entities; in the second stage, an interpretable capsule block extracts and encapsulates the high-level joint spectral-spatial features representing the hierarchical structure of those attributes, giving the model improved classification performance and intrinsic interpretability with reduced computational complexity. We have tested and evaluated the model on four real HSI datasets across four separate tasks (plant species classification, land cover classification, urban scene recognition, and crop disease recognition) and compared it with five state-of-the-art deep learning models.
The results demonstrate that the proposed model has competitive advantages in terms of both classification accuracy and model interpretability, especially for vegetation classification.
Type Of Material Computer model/algorithm 
Year Produced 2021 
Provided To Others? Yes  
Impact This project is still ongoing.
URL https://ieeexplore.ieee.org/document/9362293
 
Title A novel end-to-end deep learning model (CropdocNet) 
Description A novel end-to-end deep learning model (CropdocNet) for accurate and automated late blight disease diagnosis from UAV-based hyperspectral imagery. The method accounts for the potential disease-specific reflectance radiation variance caused by the canopy's structural diversity, and introduces multiple capsule layers to model the part-to-whole relationship between spectral-spatial features and the target classes, representing the rotation invariance of the target classes in the feature space. We evaluate the method with real UAV-based HSI data under controlled and natural field conditions. The effectiveness of the hierarchical features is quantitatively assessed and compared with existing representative machine learning/deep learning methods on both testing and independent datasets. The experimental results show that the model significantly improves accuracy when the hierarchical structure of spectral-spatial features is considered, with average accuracies of 98.09% on the testing dataset and 95.75% on the independent dataset.
Type Of Material Data analysis technique 
Year Produced 2022 
Provided To Others? Yes  
Impact ongoing 
URL https://www.mdpi.com/2072-4292/14/2/396
 
Description Academic partnership 
Organisation University of Chinese Academy of Sciences
Department Institute of Remote Sensing and Digital Earth
Country China 
Sector Charity/Non Profit 
PI Contribution We worked together to secure a collaborative research project.
Collaborator Contribution We worked together to secure a collaborative research project.
Impact Still active
Start Year 2019
 
Title A novel end-to-end deep learning model (CropdocNet) for accurate and automated late blight disease diagnosis from UAV-based hyperspectral imagery 
Description A novel end-to-end deep learning model (CropdocNet) and accompanying software for accurate and automated late blight disease diagnosis from UAV-based hyperspectral imagery.
Type Of Technology New/Improved Technique/Technology 
Year Produced 2021 
Impact ongoing 
 
Title A novel interpretable deep learning model: a biologically interpretable two-stage deep neural network (BIT-DNN) 
Description A novel interpretable deep learning model, a biologically interpretable two-stage deep neural network (BIT-DNN), which incorporates a prior-knowledge-based spectral-spatial feature transformation (drawing on the biophysical and biochemical attributes of target entities and their hierarchical structures) into the framework, achieving both high accuracy and interpretability on HSI-based classification tasks.
Type Of Technology New/Improved Technique/Technology 
Year Produced 2022 
Impact ongoing 
URL https://ieeexplore.ieee.org/document/9362293
 
Description An invited article published in New Food Magazine: "Could AI ease food security fears?", New Food , 15/2/2021 
Form Of Engagement Activity A magazine, newsletter or online publication
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Media (as a channel to the public)
Results and Impact Prof. Liangxiu Han has an invited article published in New Food Magazine: "Could AI ease food security fears?", New Food , 15/2/2021 https://www.newfoodmagazine.com/article/138114/ai-and-food-security/
Year(s) Of Engagement Activity 2021
URL https://www.newfoodmagazine.com/article/138114/ai-and-food-security/
 
Description An invited talk, "Precision Agriculture: A Big Data Driven, AI-enabled Approach to Crop Disease Diagnosis" 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact Liangxiu Han was invited to give a talk, "Precision Agriculture: A Big Data Driven, AI-enabled Approach to Crop Disease Diagnosis", at The 2nd Crop Pest and Diseases (P&D) Remote Sensing Conference, China, 30 August 2021.
Year(s) Of Engagement Activity 2021
 
Description An invited talk at an international workshop-cum-training programme titled "Green Growth Strategies for Climate Resilience and Disaster Risk Reduction (DRR): Policies, Pathways and Tools" 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact Prof. Liangxiu Han was invited to give a talk at an international workshop-cum-training programme titled "Green Growth Strategies for Climate Resilience and Disaster Risk Reduction (DRR): Policies, Pathways and Tools", held from 26th to 28th November 2020 and organised by the Centre for Ecological Economics and Natural Resources (CEENR), Institute for Social and Economic Change (ISEC), Bangalore, and the National Institute of Disaster Management (NIDM), Ministry of Home Affairs, New Delhi, India. http://www.isec.ac.in/2_Final%20Flyer%20With%20Programme%20Schedule.pdf
Year(s) Of Engagement Activity 2020
URL http://www.isec.ac.in/2_Final%20Flyer%20With%20Programme%20Schedule.pdf
 
Description Article in food magazine: "Could AI ease food security fears?" 
Form Of Engagement Activity A magazine, newsletter or online publication
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact Published an article in New Food Magazine: "Could AI ease food security fears?", New Food Magazine, 15/2/2021. https://www.newfoodmagazine.com/article/138114/ai-and-food-security/
Year(s) Of Engagement Activity 2021
URL https://www.newfoodmagazine.com/article/138114/ai-and-food-security/
 
Description British Science Week 2023 at Manchester Museum 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Schools
Results and Impact We will showcase our research work, AI for precision agriculture, at British Science Week 2023 at Manchester Museum. Around 500-600 pupils are expected to attend the event.
Year(s) Of Engagement Activity 2023
 
Description Keynote speaker for IEEE PES Women in Power and IEEE Women in Engineering 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact Keynote speaker, "Meeting Societal Challenges: Big Data Driven AI-enabled Approaches", at The Power of Data: Data Science Meeting organised by IEEE PES Women in Power and IEEE Women in Engineering. On 23rd June 2021, the IEEE PES Women in Power and IEEE Women in Engineering in the UK & Ireland celebrated International Women in Engineering Day with this engaging webinar. https://www.ieee-ukandireland.org/watch-again-the-power-of-data-data-science-meeting/
Year(s) Of Engagement Activity 2021
 
Description An invited panel speech, "The Future of Digital Twins: The Role of Big Data/Artificial Intelligence" 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact Prof. Liangxiu Han was invited to give a panel speech, "The Future of Digital Twins: The Role of Big Data/Artificial Intelligence", at the 18th IEEE International Symposium on Parallel and Distributed Processing with Applications (ISPA-2020), held jointly with the 10th IEEE International Conference on Big Data and Cloud Computing (BDCloud-2020), the 13th IEEE International Symposium on Social Computing and Networking (SocialCom-2020), and the 10th IEEE International Conference on Sustainable Computing and Communications (SustainCom-2020), 17-19 December 2020.
Year(s) Of Engagement Activity 2020
URL https://hpcn.exeter.ac.uk/socialcom2020/