13TSB_ACT: Tractor-mount sensing for precision application of Nitrogen and control of milling wheat protein content.

Lead Research Organisation: University of Manchester
Department Name: Electrical and Electronic Engineering

Abstract

The project brings together agronomy research on rapid protein assays for milling wheat with the engineering of photonic sensors, image recognition and mechatronic systems. The ultimate goal is to deliver a tractor-mounted scanning unit for autonomous mapping of protein content across wheat fields, at a spatial resolution better than 2 square metres and at full field application speeds (17 km/h), for precision application of nitrogen (N). N is the primary input cost and accounts for 80% of the carbon footprint in milling wheat production, yet it is over-applied in three out of four cases. The system will enable growers to dynamically map protein distribution in the crop canopy so that all areas attain the threshold 13% protein content. The Sainsbury's 'Camgrain Consortium' of 500 growers will act as early adopters of the retrofittable system, which will incorporate emerging technologies in high-speed infrared photonics, steered on a miniature robotic gimbal through rapid embedded image processing.


Planned Impact

Phase 1 of the exploitation strategy will be to gain initial market uptake through the 'Camgrain Consortium' of growers. This consortium of over 500 growers in the East Anglia area (UK) includes circa 30 identified 'technology-aware' farm managers, all of whom feed into the Camgrain Stores cooperative group for subsequent distribution of the grain. To give the project focus and ensure a tangible deliverable by the closure date, Finches Farm has been identified from this sub-grouping to provide the first beta-test facility for the tractor-mount sensor units. The farm incorporates an appropriate area of premium milling wheat production (70 hectares) on which to base the first commercial demonstration. Camgrain Stores will receive the harvested grain from Finches Farm, as well as from the Phase 2 adopters, and will undertake standardised QA laboratory analysis of grain protein and moisture content on representative samples from each delivered tanker batch. Drawing on Camgrain's previous records of grain protein-content variability at Finches Farm, the strategy will be to gain broader farmer uptake, and to provide evidence for longer-term adoption by Finches Farm, through quantitative analysis of the reduction in grain protein and moisture variability following introduction of the sensor-controlled approach. This will notionally require at least 3-5 seasons of reduced grain-variability data, alongside the corresponding field records of reduced total N application volumes, before adequate evidence may be gained to convince the wheat-farming industry to undertake significant investment in this smart N-sensing technology.

The Phase 2 project will position the technology at an appropriate degree of technology readiness for entering a Phase 3 programme of sensor deployment and further value-engineering of the unit, in partnership with the large machinery providers (Garfords, JCB, etc.), who will also form an element of the Phase 2 partnership. Phase 3 sales of the units will be to the broader Camgrain grower consortium, possibly at preferential rates, so that Camgrain Stores can fully exploit the benefits of offering lower-variability protein composition across the majority of silos supplied by the Camgrain Consortium farmers. This would complement Sainsbury's ultimate goal, which can be expressed on two levels. First, the higher-level ambition of reassuring customers that Sainsbury's in-store bakeries have used state-of-the-art engineering to guarantee a minimised carbon footprint, using metrics backed by quantified field data from the sole supply of the in-store bakery flour by Camgrain growers. Second, the more pragmatic ambition of ensuring that all of Sainsbury's in-store bakeries receive a uniform premium-grade flour, such that bakery products produced in the supermarket are consistently of high quality, give the same taste experience and are free from defects (cracking, charring, burning), irrespective of the national location of any given store and its farm-to-mill supply chain. The Phase 4 exploitation would then see the tractor-mount sensor technology, and its integrated patch field-spray systems for precision N application, marketed to non-Camgrain farmers at a full commercial rate. This would include export sales to wheat growers and supply chains outside the UK, the first of these being in farming regions of Northern Europe similar to East Anglia, notably in France and Germany.
Phase 4 would also see the sensor technology extended to other applicable protein crops, in particular those with greater relevance outside the UK, such as soybean and maize. Sainsbury's and the Camgrain growers will maintain their competitive edge in these developments by retaining their position as the first route for early adoption and beta-testing of any relevant new aspect of the on-tractor protein sensors.
 
Description The protein system has been demonstrated within a handheld unit which offers significantly greater flexibility of operation than the original tractor-mount concept.

A new handheld variant of the instrument is now being commercially produced, under license, for in-field deployment.
Exploitation Route The project has now led on to a submission with ADAS, Sainsbury's and Precision Decisions for a 2016 Industrial Award under Round 5 of the Agri-Tech Catalyst.
Sectors Agriculture, Food and Drink, Electronics, Pharmaceuticals and Medical Biotechnology

URL https://www.youtube.com/watch?v=S7Hiqc51c5U
 
Description To date the findings have been developed and shared with farm managers, principally in the East Anglia and Cambridgeshire area, as part of the Camgrain consortium, so as to develop the product concept following field trials of the proof-of-concept system in 2015. The work has subsequently led on to a new partnership between ADAS and Precision Decisions Ltd, which has supported the PhD CASE award of Mr Dingle Jose at the University of Manchester, in the PI's team, to deliver the hardware associated with a pre-production instrument and in-field trials. This is further underpinned by a November 2017 submission to the Innovate-UK 'Health and Lifesciences' programme (ref: TS/R020655/1) to deliver a high-speed system suitable for use by agronomists. Currently a handheld variant of the technology is under trial for nitrogen management in wheat at the Yorkshire test-farm site owned by Precision Decisions Ltd.
First Year Of Impact 2019
Sector Agriculture, Food and Drink
Impact Types Economic

 
Description Input to UKRI Agri-Food Strategy Panel on future research policy
Geographic Reach National 
Policy Influence Type Participation in a national consultation
Impact The findings from the integrated biology, electronics engineering and machine-learning technologies exemplified within the identified projects, and their impacts on the Agri-Food sector, have been provided as evidence for future UKRI strategy on UK Global Food Security and for supporting international development within the GCRF programme.
 
Description IKnowFood: Integrating Knowledge for Food Systems Resilience
Amount £590,212 (GBP)
Funding ID BB/N020626/1 
Organisation Biotechnology and Biological Sciences Research Council (BBSRC) 
Sector Public
Country United Kingdom
Start 10/2016 
End 10/2020
 
Title Demonstration of multispectral kinematic stereo for precision agricultural duties 
Description The mechanism to deliver 3D multispectral topological data from a single imaging detector has been developed in such a manner that it was successfully deployed at field scale, using a tractor toolbar-mounted system operating at speeds of up to 10 km/h.

Kinematic Height Estimation: In order to image a field efficiently at high speed, it is essential that each frame captures the largest feasible area while maintaining sufficient detail to accurately identify and locate small plants. However, where plants are not located directly below the camera, variations in their height can result in an inaccurate determination of their position due to parallax error. In our system, the usable frame area (assuming a parallax error of less than 5% is acceptable) is approximately a 19% slice across the centre. While it is possible to disregard all but this area within the image, the camera would then need to capture five times more images in order to cover the same area. This effectively reduces the camera frame rate from 90 fps to ~17 fps, limiting the maximum speed of the system and the ability to incorporate additional wavelengths in the future.

A stereo imaging technique using a single camera in motion was chosen to compensate for the error, eliminating the need to purchase additional equipment. Given that the imaging system is in constant motion and the camera has a high frame rate, a single camera can achieve kinematic depth estimation. As the camera moves between two consecutive images, taller objects will appear to move faster than their shorter counterparts, so the height disparity between objects can be calculated using a dense optical-flow algorithm. The system utilises the Farneback optical-flow algorithm, chosen for its low processing time and the availability of a GPU kernel in the OpenCV library.
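The principle above — that between two frames taken a short baseline apart, nearer (taller) objects shift by more pixels than the ground plane — can be illustrated with a minimal, self-contained sketch. The field system recovers this shift densely with OpenCV's Farneback optical flow; here, purely as an illustration of shift recovery, a 1-D circular cross-correlation on a synthetic signal stands in for the flow computation, and all numeric values are invented:

```python
import numpy as np

# Toy illustration of kinematic (single-camera, in-motion) stereo.
# A synthetic 1-D intensity profile plays the role of one image row;
# the second "frame" is the same row shifted by the apparent motion
# of an object between exposures.  (The deployed system uses OpenCV's
# dense Farneback optical flow instead of this cross-correlation.)

rng = np.random.default_rng(0)
frame1 = rng.random(200)
true_shift = 7                       # pixels the object moves between frames
frame2 = np.roll(frame1, true_shift)

# Recover the shift as the lag maximising the circular cross-correlation,
# computed efficiently in the Fourier domain.
corr = np.fft.ifft(np.fft.fft(frame2) * np.conj(np.fft.fft(frame1))).real
recovered = int(np.argmax(corr))

assert recovered == true_shift       # taller objects would yield larger shifts
```

In the real system this per-pixel shift, computed densely over the frame, forms the height disparity map described below.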
Tests were carried out to verify the reliability of the algorithm using objects of known height, imaged twice with 2.1 cm of camera displacement between images (chosen from the 90 fps camera frame rate and the target 5 km/h system speed). A height disparity map produced from these images showed significant error in the height estimation. The error was attributed to the small number of features, and thus the low spatial frequency of the image: given that neighbouring pixels had very similar values, the algorithm was not able to accurately calculate the pixel shift.

The error in the disparity values can be reduced by applying a Laplacian filter to the image and summing the filtered and non-filtered images. This improves the texture of the image, preserving the high spatial-frequency components, and the error in disparity values across the image is reduced. Approximately 1 cm in height corresponds to 3 units in the height disparity map, showing a good estimation of the object height by the algorithm. However, some inconsistencies still exist at object edges, which can be mitigated by taking the disparity value in the middle of any given object. Using the equation below, the real distance can be calculated, allowing the system to provide accurate position information for individual plants across an entire frame. The distance between two frames is calculated using a speed-estimation algorithm and the time between images, and the stereo relation is:

x - x' = bf / Z

where x and x' are the image positions of the point of interest in frames one and two respectively, so that x - x' is the disparity; b is the distance between frames; f is the focal length of the camera; and Z is the distance to the object, from which its height follows.
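A short worked example of the final step, assuming the standard pinhole stereo relation disparity = b·f/Z. The 2.1 cm baseline is taken from the text; the focal length in pixels and the camera mounting height are hypothetical example values:

```python
# Worked example: converting a recovered pixel disparity into a metric
# distance via the stereo relation  x - x' = b * f / Z.
# b (2.1 cm) is from the text; f_px and camera_height are assumed values.

b = 0.021            # camera displacement between frames (m)
f_px = 1400.0        # focal length in pixels (hypothetical)

def depth_from_disparity(d_px):
    """Distance Z (m) to a point whose image shifted d_px pixels."""
    return b * f_px / d_px

def plant_height(d_px, camera_height=1.0):
    """Height above the ground plane, given the camera mounting height (m)."""
    return camera_height - depth_from_disparity(d_px)

# Twice the disparity means half the distance, i.e. a taller plant.
assert depth_from_disparity(42.0) == depth_from_disparity(21.0) / 2
assert abs(plant_height(58.8) - 0.5) < 1e-9
```

Taking the disparity from the middle of each detected object, as described above, then yields a per-plant height and a parallax-corrected field position.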
Type Of Material Technology assay or reagent 
Year Produced 2020 
Provided To Others? Yes  
Impact Precision weeding can significantly reduce or even eliminate the use of herbicides in farming. To achieve high-precision, individual targeting of weeds, high-speed, low-cost plant identification is essential. Our system using the red, green, and near-infrared reflectance, combined with a size differentiation method, is used to identify crops and weeds in lettuce fields. Illumination is provided by LED arrays at 525, 650, and 850 nm, and images are captured in a single-shot using a modified RGB camera. A kinematic stereo method is utilised to compensate for parallax error in images and provide accurate location data of plants. The system was verified in field trials across three lettuce fields at varying growth stages from 0.5 to 10 km/h. In-field results showed weed and crop identification rates of 56% and 69%, respectively. Post-trial processing resulted in average weed and crop identifications of 81% and 88%, respectively. 
URL https://www.mdpi.com/1424-8220/20/2/455/htm
 
Description Wheatscan 2 
Organisation ADAS
Country United Kingdom 
Sector Private 
PI Contribution Compiled the bid and created the partnership for the Wheatscan 2 bid into Agri-Tech Catalyst Round 5 - Industrial Award
Collaborator Contribution Equal contribution by Sainsbury's, ADAS, Precision Decisions & Manchester
Impact Bid into Agri-Tech Catalyst Round 5
Start Year 2015
 
Title Multispectral Diagnostic Device 
Description Embodiments of the present invention provide an apparatus for determining spectral information of a three-dimensional object, comprising a cavity (110) for location in relation to the object; an imaging light source (120) located in relation to the cavity, wherein the imaging source is controllable to selectively emit light in a plurality of wavelength ranges; a structured light source (130) for emitting structured illumination toward the object, wherein the structured light source comprises a plurality of illumination devices arranged around the cavity; one or more imaging devices (140) for generating image data relating to at least a portion of the object; a control unit, wherein the control unit (1100) is arranged to control the structured light source to emit the structured illumination and to control the imaging light source to emit light in a selected one or more of the plurality of wavelength ranges; a data storage unit (1120) arranged to store image data corresponding to the structured illumination and each of the selected one or more of the plurality of wavelength ranges; and processing means (1110) arranged to determine depth information relating to at least a portion of the object in dependence on the image data corresponding to the structured illumination stored in the data storage means.
IP Reference WO2019122891 
Protection Patent application published
Year Protection Granted 2019
Licensed Yes
Impact Creation of a new company on the basis of the patent
 
Company Name FOTENIX LIMITED 
Description The Fotenix company was co-founded by the PI (Prof Bruce Grieve) and one of his ex-PhD students (Dr Charles Veys). The business has subsequently attracted ~£225K of Innovate-UK innovation grants (titles listed below) to develop and deploy 3D multispectral imaging instrumentation for tractor and robotic mounting, notably for ripening characteristics in soft fruits and for biotic-stress detection and identification directly on crop leaves. Innovate-UK Grants: (1) Co-ordinated technology development to provide an optimised and integrated system of leading vertical farming technologies, Project number: 25959, Competition: Productive and sustainable crop and ruminant agricultural systems; (2) The First Fleet. The world's first fleet of multi-modal soft fruit robots, Project number: 26836, Competition: Productive and sustainable crop and ruminant agricultural systems; (3) Co-ordinated technology development to provide an optimised and integrated system of leading vertical farming technologies, Project number: 105141, Competition: Productive and sustainable crop and ruminant agricultural systems
Year Established 2018 
Impact The business has just completed early-stage sales of the instrumentation for laboratory use in plant phenotyping laboratories in IBERS (Aberystwyth) and P3 (Sheffield). It is now in the process of attracting Series-A venture funding to grow the business beyond the current 1 full-time and 3 part-time employees. First field trials of the robot-mounted technology will be undertaken in Spring 2020, with production-ready systems available by the end of the same year.
Website http://www.fotenix.tech
 
Description Phenom-UK Launch Conference 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Industry/Business
Results and Impact The 3D multispectral system research forms part of the original BBSRC Phenom-UK project proposal, of which the PI (Prof Bruce Grieve) was a co-author. The multispectral research and technology, as well as its subsequent commercial variant in the form of products from the Fotenix Ltd spin-out, have secured a slot to be presented at the launch event on 11th February 2020.
Year(s) Of Engagement Activity 2020
URL https://www.phenomuk.net/event/phenomuk-annual-meeting-at-national-physical-laboratory-teddington-fe...