4D-HSI-4-Free: Integrating Sensors & Vision Process Engineering to Deliver a Tool for revealing 3D Hyperspectral Agri-data from 2D Images

Lead Research Organisation: University of Manchester
Department Name: Electrical and Electronic Engineering

Abstract

Hyperspectral and multispectral imaging for crop phenotyping and stress analysis has seen significant growth in recent years for wide area scanning, notably from satellites and aircraft. This has been principally based on variants of the Normalised Difference Vegetation Index, 'red edge', ratios and/or chlorophyll fluorescence bands. Though more subtle agri-analysis is being introduced within these remote monitoring technologies, notably with the launch of the ESA Sentinel family of satellites, the reliance on sunlit imaging and the physical engineering limits on resolving power of the optics, alongside atmospheric variations in absorbance and refraction, set hard boundaries on the spatial and spectral sensitivity.
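For reference, the NDVI named above is the standard per-pixel band ratio (NIR - red) / (NIR + red). A minimal NumPy sketch (the function name, the epsilon guard and the sample reflectance values are illustrative, not part of any specific instrument's pipeline):

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalised Difference Vegetation Index per pixel.

    nir, red: near-infrared and red reflectance values or arrays (0..1).
    eps guards against division by zero over very dark pixels.
    """
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)

# Healthy vegetation reflects strongly in NIR and absorbs red,
# giving NDVI towards +1; bare soil and water sit near or below 0.
print(ndvi(0.5, 0.08))  # dense canopy -> high NDVI (~0.72)
```

The same function applies unchanged to whole image bands, since NumPy broadcasts element-wise over arrays.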

In more recent years Close-Proximity Hyperspectral Imaging (CP-HSI) has been progressed by a number of research groups, both to ground-truth remotely collected images and to gain data with greater subtlety and differentiating power than can be obtained from the remote platforms, given the physical constraints described. Additionally, CP-HSI offers the potential for greater data availability by avoiding the issues of cloud cover and other obstructions, as well as the lack of access to images between satellite passes. These CP-HSI studies have tended to adopt the same high-cost passive HSI instruments as used by the geospatial imaging sector, and then take advantage of the variations in plant and soil data arising intra- and inter-crop canopy to identify the new agri-features.

The e-Agri research team at the University of Manchester have worked over recent years on translating the above concepts into lower-cost, miniaturised engineered systems by replacing the passive HSI instrumentation with active systems, based on broadband proprietary silicon imaging detectors coupled with narrowband LED sources. The resulting Active Close-Proximity (ACP) HSI systems are not only several orders of magnitude cheaper, and of higher spatial resolution, than their remote passive equivalents; they also offer spectral signal-to-noise that can far exceed what is possible from the passive instruments.

Encouraging as this is, there are engineering challenges associated with active CP-HSI, not least the need to compensate for orientation effects arising from the position of the biological specimen with respect to the sensor system, as well as to prevent corruption of the spectral data by variations in the lighting location and optical power of the LEDs. These are reconcilable by integrating the optical engineering of multiple sources alongside Photometric Stereo (PS) image reconstruction. The PS technique has arisen as an alternative to structured light for delivering surface normals and texture information at unprecedented spatial resolution, while using inherently low-cost equipment. PS involves capturing a number of images (3+) of an object under varying but known lighting, with each image providing a constraint on the surface orientation at each pixel. Work pioneered by the Centre for Machine Vision (CMV) has allowed the PS technique to move beyond a laboratory setting towards real-world applications, by relaxing prior assumptions, such as Lambertian surface reflection and collimated illumination, and by extending the technique to moving applications.
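The per-pixel constraint described above can be made concrete with the textbook Lambertian formulation: with n >= 3 known light directions, each image contributes one linear equation in the albedo-scaled surface normal, which is then recovered by least squares. The following is a minimal single-pixel sketch under the classical Lambertian assumption (synthetic values throughout; it does not represent the CMV's relaxed-assumption algorithms):

```python
import numpy as np

def photometric_stereo(intensities, light_dirs):
    """Classic Lambertian photometric stereo for a single pixel.

    intensities: (n,) observed brightness under each of n >= 3 lights.
    light_dirs:  (n, 3) unit vectors pointing towards each light source.
    Solves L @ g = I in the least-squares sense, where g = albedo * normal.
    """
    L = np.asarray(light_dirs, dtype=float)
    I = np.asarray(intensities, dtype=float)
    g, *_ = np.linalg.lstsq(L, I, rcond=None)
    albedo = np.linalg.norm(g)
    normal = g / albedo
    return normal, albedo

# Synthetic check: a surface patch facing straight up (normal = +z),
# albedo 0.8, imaged under three known light directions.
n_true = np.array([0.0, 0.0, 1.0])
lights = np.array([[0.5, 0.0, 0.866],
                   [0.0, 0.5, 0.866],
                   [-0.5, 0.0, 0.866]])
obs = 0.8 * lights @ n_true          # noise-free Lambertian shading
n_est, rho = photometric_stereo(obs, lights)
print(np.round(n_est, 3), round(rho, 3))
```

With noise-free data and three linearly independent light directions the system is exactly determined, so the true normal and albedo are recovered; real deployments add more lights and robust fitting to cope with shadows and specularities.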

The team will deliver a new class of 4-Dimensional imaging within an accessible tool which only marginally increases the component cost versus the 2D hyperspectral precursor. The project is achievable as it will be accelerated via the e-Agri UoM team partnering with the Centre for Machine Vision, at the Bristol Robotics Laboratory, to translate their PS algorithms so that they may be integrated within the ACP-HSI sensors research. The merger of these capabilities will prove the viability of a miniaturised 4D imaging system and deliver a proof-of-principle prototype system for characterisation against plant, animal and aquatic samples.

Technical Summary

Hyperspectral and multispectral imaging for crop phenotyping and stress analysis has seen significant growth in recent years for wide area scanning, notably from satellites and aircraft. This has been principally based on variants of the Normalised Difference Vegetation Index, 'red edge', ratios and/or chlorophyll fluorescence bands. Though more subtle agri-analysis is being introduced within these remote monitoring technologies, notably with the launch of the ESA Sentinel family of satellites, the reliance on sunlit imaging and the physical engineering limits on resolving power of the optics, alongside atmospheric variations in absorbance and refraction, set hard boundaries on the spatial and spectral sensitivity.

In more recent years Close-Proximity Hyperspectral Imaging (CP-HSI) has been progressed by a number of research groups, both to ground-truth remotely collected images and to gain data with greater subtlety and differentiating power than can be obtained from the remote platforms, given the physical constraints described. Additionally, CP-HSI offers the potential for greater data availability by avoiding the issues of cloud cover and other obstructions, as well as the lack of access to images between satellite passes.

The team will deliver a new class of 4-Dimensional imaging within an accessible tool which only marginally increases the component cost versus the 2D hyperspectral precursor. The project is achievable as it will be accelerated via the e-Agri UoM team partnering with the Centre for Machine Vision, at the Bristol Robotics Laboratory, to translate their PS algorithms so that they may be integrated within the ACP-HSI sensors research. The merger of these capabilities will prove the viability of a miniaturised 4D imaging system and deliver a proof-of-principle prototype system for characterisation against plant, animal and aquatic samples.

Planned Impact

The potential duties that are actively being pursued by the Manchester team currently fall into four areas. The first is the early detection of fungal disease symptoms on the leaf. The initial work in this area was in partnership with plant scientists at the University of Bonn, and demonstrated that the ACP-HSI approach could be delivered in a package costing hundreds of euros as opposed to hundreds of thousands. This then led on to a small proof-of-concept project showing how such a package could be exploited alongside a Smartphone handset to enable in-field plant diagnostics and autonomous advice services in a package costed appropriately for mass deployment to even the poorest smallholder farmers, enhancing their yields through more informed decision making. Incorporating the PS technique in this 'handset package' would add marginal cost but would give plant scientists access to texture information as well as quantifiable HSI images from what could be tens of thousands, and in time millions, of new data-points arising from farmers around the globe, in even the remotest areas. The potential to apply 'Big Data' mining approaches to this richer dataset, by virtue of the PS technique, is significant, with obvious implications for yield forecasting and crop-stress mapping that impact on governmental decision making all the way through to mitigating insurance risks.

The second area of activity is precision intra-row weed detection and eradication. The need arises both from growing resistance by weeds to existing herbicide treatments, or the lack of suitable products in the first place (e.g. ryegrass in cereals globally, and Amaranthus spp. and Conyza spp. in US maize), and from the withdrawal of older chemistries in a number of regions, notably the EU (ref. Directive 91/414/EEC), due to health concerns. The PS approach will become a key element of this autonomous weed identification technique for precision farming, as the orientation information it offers will open up the most subtle of classification challenges, such as blackgrass in post-emergence cereal crops, where the leaf angle to the sensor will be effectively random.

The third area of ongoing research that this PS approach will impact is protein prediction in pre-harvest crops, notably winter wheat. Unlike previous work on late-stage growth or protein measurement in post-harvest grain, the ACP-HSI approach exploits the image data and not the composite NIR spectra. This potentially enables protein development to be tracked at much earlier stages in the growth cycle, with correspondingly smaller and more precise control of nutrient additions to achieve acceptable post-harvest protein content, and a matching reduction in farm input costs and carbon footprint. The PS technique offers the potential to take this tool from a manual technique, where wheat-ear samples are clamped and scanned by a handheld device whilst field-walking, to one applicable to robotic autonomous scanning, as it would reduce the impact of ear topology on the HSI data.

The fourth area, in partnership with the Natural Resources Institute (University of Greenwich), is investigating the ACP-HSI approach for detecting insect disease vectors, notably whitefly as a carrier of mosaic virus in cassava. Without PS, the technique may be able to detect and, most importantly, speciate the adults (1-2 mm), and possibly the nymphs (0.1-1 mm), distinguishing potential disease carriers from benign insects. However, PS integrated into ACP-HSI offers the potential to reveal the eggs hidden within the leaf structure, and so determine the full viral infection probability within a crop and prevent contamination of the future season's seed-bank.

Additional applications for the PS technique are also potentially viable within: fish parasite detection, post-harvest produce quality traits in pack-houses (e.g. texture analysis of bagged salads) or soil organic carbon monitoring, amongst other duties.
 
Description New method for detecting meristems in plants. New approach to compensating for whole plant structures when applying high definition multispectral imaging, negating the leaf orientation effects. The latter has now led on to a patent filing (ref: 20170156/GB; HGF Ref. P246887GB / CAB; New UK Patent Application; Multispectral Diagnostic Device; The University of Manchester) and the formation of a new spin-out Instrumentation Company from Manchester University.

The patent filing has now been completed and as such an embargoed paper on the topic and underlying research has now been published:

Veys, C., Chatziavgerinos, F., Bin Ghaith Alsuwaidi, A. R. S. A., Hibbert, J., Hansen, M., Bernotas, G., Smith, M., Yin, H., Rolfe, S. & Grieve, B. (2019). Multispectral imaging for pre-symptomatic analysis of light leaf spot in oilseed rape. Plant Methods, 15, 40, pp. 1-12. DOI: 10.1186/s13007-019-0389-9

A portable variant of the technology is now being commercially manufactured, under license, for testing with research users.
Exploitation Route For use in the non-chemical management of weed infestation by the farming sector, precision nutrient application, in-field phenotyping and early detection of biotic and abiotic stresses in plants.
Sectors Agriculture, Food and Drink, Digital/Communication/Information Technologies (including Software), Electronics

 
Description Linked to supporting the weed control project with Syngenta PLC. A follow-on full proposal was submitted to BBSRC (ref: BB/R019770/1), and an ICURe Innovate-UK award was secured for commercial translation of the technology. The patent associated with this project has now resulted in the formation of a new instrumentation company, Fotenix Ltd (www.fotenix.tech, UK Company Number 11346942), to which the University has assigned the core IP upon which it is founded. Since formation, the company employs 1 full-time and 2 part-time managerial and technical staff and has secured first-phase funding and Innovate-UK grants. This currently comprises £158K of equity investment and 3 grants bringing an additional £197K of income to the business ('Feasibility of using 3D Multispectral Imaging for enhanced classification of fruit ripeness and disease', project number 15290, £49,717; 'Co-ordinated technology development to provide an optimised and integrated system of leading vertical farming technologies', project number 25959, £55,993; 'The First Fleet. The world's first fleet of multi modal soft fruit robots', project number 26836, £91,419). The University research group, led by Prof Grieve, has also been awarded a BBSRC Impact Acceleration Account (IAA) award to help deliver the next phase of the technology, incorporating machine learning and AI.
First Year Of Impact 2018
Sector Agriculture, Food and Drink, Electronics
Impact Types Economic

 
Description Input to UKRI Agri-Food Strategy Panel on future research policy
Geographic Reach National 
Policy Influence Type Participation in a national consultation
Impact The findings of the integrated biology, electronics engineering and machine learning technologies exemplified within the projects identified, and their impacts on the Agri-Food sector, have been provided as evidence for future UKRI strategy on UK Global Food Security and for supporting international development through the GCRF programme.
 
Description BBSRC IAA The University of Manchester
Amount £300,000 (GBP)
Funding ID BB/S506692/1 
Organisation Biotechnology and Biological Sciences Research Council (BBSRC) 
Sector Public
Country United Kingdom
Start 03/2018 
End 03/2021
 
Description IKnowFood: Integrating Knowledge for Food Systems Resilience
Amount £590,212 (GBP)
Funding ID BB/N020626/1 
Organisation Biotechnology and Biological Sciences Research Council (BBSRC) 
Sector Public
Country United Kingdom
Start 10/2016 
End 10/2020
 
Title Demonstration of multispectral kinematic stereo for precision agricultural duties 
Description The mechanism to deliver 3D multispectral topological data from a single imaging detector has been developed in such a manner that it was successfully deployed at field scale, using a tractor toolbar-mounted system operating at speeds of up to 10 km/h.

Kinematic Height Estimation: In order to image a field efficiently at high speed, it is essential that each frame captures the largest feasible area while maintaining sufficient detail to accurately identify and locate small plants. However, where plants are not located directly below the camera, variations in their height can result in an inaccurate determination of their position due to parallax error. In our system, the usable frame area (assuming a parallax error of less than 5% is acceptable) is approximately a 19% slice across the centre. While it is possible to disregard all but this area within the image, the camera would then need to capture five times more images in order to cover the same area. This effectively reduces the camera frame rate from 90 fps to ~17 fps, limiting the maximum speed of the system and the ability to incorporate additional wavelengths in the future.

A stereo imaging technique using a single camera in motion was therefore chosen to compensate for the error, eliminating the need to purchase additional equipment. Given that the imaging system is in constant motion and the camera has a high frame rate, a single camera can achieve kinematic depth estimation. As the camera moves between two consecutive images, taller objects appear to move faster than their shorter counterparts, so the height disparity between objects can be calculated using a dense optical flow algorithm. The system utilises the Farneback optical flow algorithm, chosen for its low processing time and the availability of a GPU kernel in the OpenCV library.
Tests were carried out to verify the reliability of the algorithm using objects of known height, imaged twice with 2.1 cm camera displacement between images (chosen from the 90 fps camera frame rate and the target 5 km/h system speed). A height disparity map produced from these images showed significant error in the height estimation algorithm. The error was attributed to the small number of features, and thus the low spatial frequency, of the image: given that neighbouring pixels had very similar values, the algorithm was not able to accurately calculate the pixel shift.

The error in the disparity values can be reduced by applying a Laplacian filter to the image and summing the filtered and non-filtered images. This improves the texture of the image, preserving the high spatial frequency components, and the error in disparity values across the image is reduced. Approximately 1 cm in height corresponds to 3 units in the height disparity map, showing a good estimation of the object height by the algorithm. However, some inconsistencies still exist at object edges, which can be mitigated by taking the disparity value in the middle of any given object. The real distance can then be calculated, allowing the system to provide accurate position information for individual plants across an entire frame. The distance between two frames is determined using a speed estimation algorithm and the time between images, and the distance Z to a point then follows from the stereo relation x - x' = bf / Z, where x and x' are the positions of the point of interest in the first and second frames respectively (their difference being the disparity); b is the distance travelled between frames; f is the focal length of the camera; and Z is the distance to the object, from which object height follows.
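The two numerical steps described above, texture enhancement via a Laplacian and conversion of per-pixel disparity to distance, can be sketched as follows. The Farneback flow step itself is omitted (OpenCV's cv2.calcOpticalFlowFarneback would supply the disparity); the 2.1 cm baseline comes from the text, while the focal length and pixel shift below are illustrative values only:

```python
import numpy as np

def enhance_texture(img):
    """Add a 4-neighbour Laplacian back onto the image to boost high
    spatial frequencies (a dependency-free stand-in for summing the
    cv2.Laplacian output with the original image, as described above)."""
    lap = (-4.0 * img
           + np.roll(img, 1, axis=0) + np.roll(img, -1, axis=0)
           + np.roll(img, 1, axis=1) + np.roll(img, -1, axis=1))
    return img + lap

def depth_from_disparity(disparity_px, baseline_m, focal_px):
    """Distance Z from the stereo relation d = b*f / Z, i.e. Z = b*f / d,
    where d is the pixel shift between consecutive frames."""
    return baseline_m * focal_px / disparity_px

# Illustrative example: 2.1 cm baseline between frames, 1000 px focal
# length, and a point that shifts 30 px between consecutive frames.
print(depth_from_disparity(30.0, 0.021, 1000.0))  # 0.7 m to the surface
```

Note that the Laplacian sum leaves featureless (constant) regions unchanged, which is why it helps only where genuine texture exists for the optical flow to lock onto.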
Type Of Material Technology assay or reagent 
Year Produced 2020 
Provided To Others? Yes  
Impact Precision weeding can significantly reduce or even eliminate the use of herbicides in farming. To achieve high-precision, individual targeting of weeds, high-speed, low-cost plant identification is essential. Our system using the red, green, and near-infrared reflectance, combined with a size differentiation method, is used to identify crops and weeds in lettuce fields. Illumination is provided by LED arrays at 525, 650, and 850 nm, and images are captured in a single-shot using a modified RGB camera. A kinematic stereo method is utilised to compensate for parallax error in images and provide accurate location data of plants. The system was verified in field trials across three lettuce fields at varying growth stages from 0.5 to 10 km/h. In-field results showed weed and crop identification rates of 56% and 69%, respectively. Post-trial processing resulted in average weed and crop identifications of 81% and 88%, respectively. 
URL https://www.mdpi.com/1424-8220/20/2/455/htm
 
Title Multispectral Diagnostic Device 
Description Embodiments of the present invention provide an apparatus for determining spectral information of a three-dimensional object, comprising: a cavity (110) for location in relation to the object; an imaging light source (120) located in relation to the cavity, wherein the imaging source is controllable to selectively emit light in a plurality of wavelength ranges; a structured light source (130) for emitting structured illumination toward the object, wherein the structured light source comprises a plurality of illumination devices arranged around the cavity; one or more imaging devices (140) for generating image data relating to at least a portion of the object; a control unit (1100) arranged to control the structured light source to emit the structured illumination and to control the imaging light source to emit light in a selected one or more of the plurality of wavelength ranges; a data storage unit (1120) arranged to store image data corresponding to the structured illumination and each of the selected one or more of the plurality of wavelength ranges; and processing means (1110) arranged to determine depth information relating to at least a portion of the object in dependence on the image data corresponding to the structured illumination stored in the data storage unit.
IP Reference WO2019122891 
Protection Patent application published
Year Protection Granted 2019
Licensed Yes
Impact Creation of a new company on the basis of the patent
 
Company Name FOTENIX LIMITED 
Description The Fotenix company was co-founded by the PI (Prof Bruce Grieve) and one of his ex-PhD students (Dr Charles Veys). The business has subsequently attracted ~£225K of Innovate-UK innovation grants (titles listed below) to develop and deploy 3D multispectral imaging instrumentation for tractor and robotic mounting, notably for ripening characteristics in soft fruits and biotic stress detection and identification directly on crop leaves. Innovate-UK Grants: (1) Co-ordinated technology development to provide an optimised and integrated system of leading vertical farming technologies, Project number: 25959, Competition: Productive and sustainable crop and ruminant agricultural systems (2) The First Fleet. The world's first fleet of multi modal soft fruit robots, Project number: 26836, Competition: Productive and sustainable crop and ruminant agricultural systems (3) Co-ordinated technology development to provide an optimised and integrated system of leading vertical farming technologies, Project number: 105141, Competition: Productive and sustainable crop and ruminant agricultural systems 
Year Established 2018 
Impact The business has just completed early stage sales of the instrumentation for laboratory use in plant phenotyping laboratories in IBERS (Aberystwyth) and P3 (Sheffield). It is now in the process of attracting Series-A venture funding, to grow the business beyond the current 1 full time and 3 part time employees. First field trials of the robotic mount technology will be undertaken in Spring 2020 with production-ready systems available by end of the same year.
Website http://www.fotenix.tech
 
Description KTN Emerging Imaging Technologies in Agri-Food Workshop in Birmingham 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Industry/Business
Results and Impact Wenhao Zhang presented our work on using machine vision / learning in agri-tech at the KTN Emerging Imaging Technologies in Agri-Food Workshop in Birmingham
Year(s) Of Engagement Activity 2018
URL https://ktn-uk.co.uk/news/emerging-imaging-technologies-in-agri-food
 
Description Phenom-UK Launch Conference 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Industry/Business
Results and Impact The 3D multispectral system research forms part of the original BBSRC Phenom-UK project proposal, of which the PI (Prof Bruce Grieve) was a co-author. The multispectral research and technology, as well as the subsequent commercial variant of the technology, in the form of products from the Fotenix Ltd spin-out, have secured a slot to be presented at the launch event on 11th February 2020.
Year(s) Of Engagement Activity 2020
URL https://www.phenomuk.net/event/phenomuk-annual-meeting-at-national-physical-laboratory-teddington-fe...
 
Description Royal Academy of Engineering: "Frontiers of Engineering for Development: From Feeding People to Nourishing People". Madagascar 2019 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact The project has resulted in an invitation from the Royal Academy of Engineering to chair and manage a 1-day workshop within their November 2019 International Conference on Agri-Food, associated with delivering impact to ODA recipient nations from smart technologies. The project will act as a central exemplar of how to deliver such impact.
Year(s) Of Engagement Activity 2019
 
Description Visit to potential collaborator, Ohio State University 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Study participants or study members
Results and Impact Following some chance discussions at the poster event at the US-Aid & Grand Challenges Seattle Convening back in May, associated with the Sentinel project, Prof Enrico Bonello and Dr Anna Conrad (Ohio State University) and Prof Bruce Grieve (University of Manchester, UK) made good on their promises at the event and met for more detailed discussions on Sept. 12-13, 2019 at Ohio State in Columbus, OH. This was catalysed by Enrico and Anna's work, alongside Chris Wiegman and Scott Shearer in the Dept. of Food, Agricultural and Biological Engineering at OSU, on drone-suspended single-point near-infrared sensors that can be lowered into the plant canopy. These are being developed as part of the GC 'Aerial Plant Disease Surveillance by Spectral Signatures' project at OSU. The overlap with the sensor engineering team at Manchester lay in the unique ability offered by the drone-suspended gripper at OSU to turn high-resolution handheld leaf-level and/or active sensors into drone-compatible devices. This offers very different crop surveillance opportunities to those typically provided by passive medium- or low-resolution sensor systems mounted directly on agricultural drones. Of particular interest was the potential to incorporate the next generation of close-proximity active multispectral imagers being developed at Manchester, which will also be equipped with combined fluorescence imaging and oscillatory modulation of the actinic light. Though computationally complex, the latter can be engineered from low-cost optical components, which then result in plants rapidly revealing their innermost metabolic processes. These techniques are particularly powerful for early identification of viral, fungal and bacterial crop stresses, but MUST be applied close to the leaf.
They also provide complementary information on the spatial spread of diseases, which makes them particularly useful for viruses, like those causing Cassava Mosaic and Brown Streak diseases, that are expressed in different ways throughout the plant. However, as low-cost imaging systems, they only work in the 380-1100 nm wavelength range, which again made a nice complement to the 1300-2500 nm spot measurements being delivered by the commercial spectrometer module being used at OSU. The conversation led to the idea of integrating the next-generation Manchester multispectral imaging modules into the drone-supported gripper / sensor module from OSU to create a very new range of crop disease sensing capabilities, especially in difficult-to-reach farming areas. We may be able to do that initially over the coming year, even if just in a crude way using undergraduate student resources in the first place.
Year(s) Of Engagement Activity 2019