An affordable stereoscopic camera array system for capturing real-time 3D responses to vegetation dense environments

Lead Research Organisation: University of Edinburgh
Department Name: Sch of Biological Sciences

Abstract

To feed our growing global population, the productivity of staple crops will require an increase in yields of ca. 60% by 2050. Progress towards this significant challenge is currently hampered by the insufficient capacity of the plant science and Agri-Tech communities to analyse existing plant genetic resources for their interaction with the environment. Plant phenotyping is an emerging science that links genomics with plant ecophysiology and agronomy. The collection of architectural traits of a plant throughout its life cycle (the phenotype) is the result of dynamic interactions between the genetic background (the genotype) and the physical world in which the plant develops (the environment). These interactions determine plant performance and productivity, measured as accumulated biomass, commercial yield and resource use efficiency. A critical parameter of a plant's phenotype is its shape in three dimensions - the plant architecture - which reflects the adaptation of a plant to environmental conditions such as light quantity and quality, and temperature. We will bring together plant scientists with image capture and machine vision experts to develop an affordable and user-friendly system for making progress on one of the most important issues in biology: our capacity to enhance the productivity of plants. We will combine a robust photometric stereo approach with software built on optimised learning algorithms that will produce standardised outputs, readily integrated across scales with other biological data sets.

Technical Summary

There is an urgent need for new technologies that allow us to monitor and predict the impact of abiotic and biotic stresses on seasonal plant growth. Such tools could provide quantitative data on crop yield traits and plant fitness markers in natural habitats. The current project is a collaboration between plant researchers, modellers, engineers and image analysis specialists that aims to take a new approach by producing i) an affordable and robust photometric stereo platform for 3D capture of rosette growth in Arabidopsis that can be extended to other crops, such as related Brassica species (e.g. Brassica rapa), and ii) standardised open-source software based on optimised machine learning algorithms, fronted by a biologist-friendly user interface that simplifies the extraction of traits of interest. To test the system we will target an important plant adaptive response that can negatively impact plant biomass and crop yield - the shade avoidance response (SAR). Initial computational work will focus on processing image data to gather 3D information for plant reconstruction through intelligent integration over the surface normal field. Subsequent image/model analysis will centre on the optimisation of leaf segmentation algorithms using features commonly used in machine vision and new approaches rooted in unsupervised feature learning. Following segmentation, we will quantify the plant phenotypic traits required for evaluating the key features of SAR, including leaf emergence rates, positioning and orientation, as well as alterations in leaf length, width, shape and expansion. The novel algorithms resulting from this research will be integrated into the existing UoE PhenoTiki package (http://phenotiki.com) so that we reach a wide base of users. We aim to combine 3D phenotype data with mathematical modelling to predict growth outcomes under differing growth scenarios and unseasonal weather, both in controlled environments and outdoors in field or natural habitats.
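
As background to the 3D capture step, the sketch below illustrates the classical Lambertian photometric stereo formulation that underlies this approach: per-pixel surface normals (and albedo) are recovered by least squares from images taken under several known light directions, and the resulting normal field can then be integrated into a height map for plant reconstruction. This is a minimal illustrative example in Python/NumPy, written under the assumption of calibrated light directions; the function and variable names are ours and are not taken from the PS-Plant or PhenoTiki codebases.

    import numpy as np

    def estimate_normals(images, light_dirs):
        """Classical (Lambertian) photometric stereo.

        images:     (k, h, w) array of k greyscale images, one per light source.
        light_dirs: (k, 3) array of unit vectors pointing towards each light source.
        Returns per-pixel unit surface normals (h, w, 3) and albedo (h, w).
        """
        k, h, w = images.shape
        intensities = images.reshape(k, -1)                            # stack pixels: (k, h*w)
        # Solve intensities = light_dirs @ G, where each column of G is albedo * normal.
        G, *_ = np.linalg.lstsq(light_dirs, intensities, rcond=None)   # (3, h*w)
        albedo = np.linalg.norm(G, axis=0)
        normals = np.where(albedo > 1e-8, G / np.maximum(albedo, 1e-8), 0.0)
        return normals.T.reshape(h, w, 3), albedo.reshape(h, w)

The recovered normal field is the input to the surface integration step mentioned above; in practice, robust integration and careful calibration, rather than this naive least-squares solve, are what make the platform usable on real plants.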

Planned Impact

Who will benefit from this research?
1. Academics and researchers in all fields of plant research and machine learning.
2. The UK and international science base.
3. The Agri-Tech industry, including biotechnologists, plant breeders seeking to increase plant productivity, and metabolic engineers.
4. The agricultural community and advisors.
5. The University of Edinburgh.
6. The postdoctoral researcher (PDRA) and research associate (RA).
7. The public.

How will they benefit?
1. The research will have a major impact on our understanding of light signalling and shade avoidance, and will enhance the UK's international standing in plant science. No other equipment anywhere else in the world can currently perform these measurements, nor integrate them with the wealth of molecular data and integrated modelling expertise available at UoE.
2. The technology is designed to enable labs in universities, research institutes and the Agri-Tech industry to pursue phenotyping approaches that have previously been largely inaccessible to them. Following publication, researchers will be able to access an established open-source online resource (PhenoTiki), with an existing user community, to download user-friendly software, and will have the option of reproducing the phenotyping system in their own labs using our published designs, which where possible will be made available for 3D printing.
3. Our system will provide a means of standardisation across the phenotyping field and will inform research strategies towards enhanced food security, providing a platform for direct translation of yield improvements to food crop plants. The public interest in food security and sustainable crop production, the broad biological scope of the project, and its predictive, systems approach highlight further areas of impact.
4. The PDRA and RA will receive broad training in plant integrative biology, hardware design and machine vision approaches, full access to professional skills and wider training courses, and the opportunity to work with two different research institutes.

What will be done to ensure they benefit from this research?
1. Publish results in high-impact journals in a timely fashion, with open access.
2. Present research results at UK and international meetings and institutions.
3. Exploit existing contacts with other UK and international academics with relevant research interests as soon as any exploitable results/materials are generated.
4. Make informal contacts with industrialists as soon as exploitable results/materials are generated; recognise and protect intellectual property to ensure wise and fruitful exploitation.
5. Provide the PDRA and RA with the training and mentoring available at UoE and CMV, including regular reviews to monitor progress and a career development plan. We will also encourage participation in all aspects of the dissemination of research results, and in understanding the wider implications and applications of the research.
6. Use results as part of our regular engagement with non-academic audiences, e.g. local interest groups, schools, local and national shows, science showcases, media.
 
Description Towards WP1 (Project management): The project is complete; McCormick is co-ordinating with Co-Is on the production of publications.

Towards WP2 (Rig design & construction) (100% complete): The PS-Plant rigs were completed on schedule. Rig 1 was improved with the addition of an NIR filter that substantially improved image capture. Due to the successful operation of the rigs, an additional rig was built in Sept 2017 to accelerate data generation.

Towards WP3 (Data capture) (100% complete): Data capture is complete for the validation experiments and for the "matrix" experiment, which was designed to examine the performance of the rigs in capturing Arabidopsis growth under different environmental conditions.

Towards WP4 (Computer Vision algorithm design) (100% complete): Bristol and UoE have completed labelling image files to train an rNN using 3D image data (the training set is hosted on Edinburgh DataShare at https://datashare.is.ed.ac.uk/handle/10283/3200). An rNN has been trained, and leaf segmentation and leaf tracking software have been completed and used successfully to track leaf movements in the "matrix" data set.
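
To make the leaf tracking step concrete, the following is a simplified sketch of one common way to link per-leaf segmentation masks across consecutive time points: greedy matching by mask overlap (intersection-over-union). It is illustrative only and is not the tracking algorithm implemented in the project software; the names iou, match_leaves and min_iou are ours.

    import numpy as np

    def iou(mask_a, mask_b):
        """Intersection-over-union of two boolean leaf masks."""
        inter = np.logical_and(mask_a, mask_b).sum()
        union = np.logical_or(mask_a, mask_b).sum()
        return inter / union if union else 0.0

    def match_leaves(prev_masks, curr_masks, min_iou=0.3):
        """Greedily link leaves in the current frame to leaves in the previous frame.

        prev_masks, curr_masks: lists of boolean arrays, one mask per segmented leaf.
        Returns {current_leaf_index: previous_leaf_index}; unmatched current leaves
        can be treated as newly emerged.
        """
        scores = [(iou(p, c), i, j)
                  for i, p in enumerate(prev_masks)
                  for j, c in enumerate(curr_masks)]
        matches, used_prev, used_curr = {}, set(), set()
        for score, i, j in sorted(scores, reverse=True):
            if score < min_iou:
                break
            if i not in used_prev and j not in used_curr:
                matches[j] = i
                used_prev.add(i)
                used_curr.add(j)
        return matches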

Towards WP5 (Analysis of 3D dynamic growth) (100% complete): Area and rosette shape analysis of the "matrix" experiment is complete. Work on evaluating individual leaf movement is complete.
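
For illustration, the sketch below shows how basic rosette shape traits of the kind analysed here (projected area, convex hull area and compactness/solidity) can be computed from a segmented top-down rosette mask using Python and scikit-image. These are assumed, commonly used trait definitions rather than the exact trait set or code used in PS-Plant, and mm_per_px stands in for a camera calibration that would be measured on the rig.

    import numpy as np
    from skimage.morphology import convex_hull_image

    def rosette_traits(mask, mm_per_px=1.0):
        """Basic rosette shape traits from a boolean top-down rosette mask."""
        mask = np.asarray(mask, dtype=bool)
        area_px = mask.sum()
        hull_px = convex_hull_image(mask).sum()
        return {
            "projected_area_mm2": area_px * mm_per_px ** 2,
            "convex_hull_area_mm2": hull_px * mm_per_px ** 2,
            # Compactness (solidity): rosette area relative to its convex hull area.
            "compactness": area_px / hull_px if hull_px else 0.0,
        }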

Towards WP6 (Software integration) (100% complete): All PS-Plant software is complete, with user-friendly GUIs where appropriate. The software and Python source code will be made available online to coincide with the publication of PS-Plant.

Towards WP7 (Dissemination): The project work has been disseminated at two conferences (IAMPS and CVPPP-ICCV). A manuscript has been submitted to the journal GigaScience that describes the PS-Plant system, demonstrates the novel biological data outputs from the "matrix" experiment, and provides a clear protocol for users to build a PS-Plant system and use the software for trait extraction.

IAA extension update (Oct 2018 - present): We applied to the University of Edinburgh's BBSRC Impact Acceleration Account to expand this work. This award has helped with the completion of the GigaScience manuscript (now in the second round of review), the further development and testing of the PS-Plant system with a variety of other species (e.g. soybean, radish, Brassica) to explore expanding the technology to other plants, and market research to identify opportunities for developing plant phenotyping expertise in Edinburgh and to identify potential academic and commercial phenotyping stakeholders in the UK/EU. We have been working with the Edinburgh Innovations team to explore the latter.

2019: Project work was successfully published in GigaScience "A photometric stereo-based 3D imaging system using computer vision and deep learning for tracking plant growth" (https://doi.org/10.1093/gigascience/giz056).
Exploitation Route The technology shows promise based on the data generated in Arabidopsis.
AMc successfully applied for IAA funding at UoE after the completion of the project, and we are now exploring i) the potential of integrating PS-Plant with gas exchange systems to simultaneously track leaf movement and photosynthesis, and ii) the use of PS-Plant with a variety of other species of agricultural importance. We are investigating the marketability of PS-Plant through interactions with Edinburgh Innovations.
Sectors Agriculture, Food and Drink; Digital/Communication/Information Technologies (including Software); Education

URL http://mccormick.bio.ed.ac.uk/
 
Description 2017: Publication of project-related work in a high-impact journal (Plant Methods) resulted in new open-source software developed for public access (https://datashare.is.ed.ac.uk/handle/10283/2922).
2017-2018: Attendance at workshops at ILRI-BecA (Kenya) and the IITA (Zambia) has supported knowledge exchange on plant phenotyping with agronomists and breeders in Africa, who could utilise the low-cost tools developed in this project.
2017-ongoing: We successfully applied for funding from Phyconet to further develop a virtual reality outreach tool to teach the public about the importance and shortcomings of Rubisco, the primary carboxylating enzyme in photosynthetic organisms. We have used the system at the 2017 Midlothian Science Festival, the 2018 Festival of Learning and the 2018 RSB Biology Week.
2017-2018: Work on this project led to a funded PhD student joining the lab in 2018 (EastBio scholarship) and to co-supervision of a PhD student with Sotirios Tsaftaris (School of Engineering).
2019: Successful publication of the work in GigaScience (https://doi.org/10.1093/gigascience/giz056), including release of hardware blueprints, software and training datasets for the PS-Plant system (see http://mccormick.bio.ed.ac.uk/).
2019: Successful application for a PhenomeUK Pilot Project Grant with Bristol CMV to develop a more advanced version of the system (PS-Plant+) in collaboration with the National Plant Phenomics Centre at Aberystwyth University (£25k).
First Year Of Impact 2017
Sector Agriculture, Food and Drink; Digital/Communication/Information Technologies (including Software)
Impact Types Cultural, Societal, Economic

 
Description GCRF One Planet Workshop at ILRI-BecA (Kenya).
Geographic Reach Africa 
Policy Influence Type Participation in a guidance/advisory committee
 
Description Interaction with USA: Moving Field Phenomics from Theory to Practice (Organised by UK Science & Innovation Network, British Consulate-General Los Angeles)
Geographic Reach North America 
Policy Influence Type Influenced training of practitioners or researchers
 
Description Practical Synthetic Biology Workshop (South Africa) - BBSRC call
Geographic Reach Africa 
Policy Influence Type Participation in a guidance/advisory committee
 
Description BBSRC IAA Grant
Amount £9,090 (GBP)
Funding ID BBSRC IAA PIII-013 
Organisation Biotechnology and Biological Sciences Research Council (BBSRC) 
Sector Public
Country United Kingdom
Start 10/2018 
End 02/2019
 
Description Edinburgh Global Research and Partnership Award
Amount £1,950 (GBP)
Organisation University of Edinburgh 
Sector Academic/University
Country United Kingdom
Start 03/2018 
End 03/2018
 
Title Photometric stereo dataset with annotated leaf masks.
Description Dataset derived from PS-Plant for training image-based neural networks 
Type Of Material Biological samples 
Year Produced 2018 
Provided To Others? No  
Impact Not yet known 
URL https://datashare.is.ed.ac.uk/handle/10283/3200
 
Title iDIEL Plant software and image capture system design. 
Description Software and source code for a "Do-It-Yourself" phenotyping system: measuring growth and morphology throughout the diel cycle in rosette-shaped plants. Published in Plant Methods 13:95.
Type Of Material Biological samples 
Year Produced 2017 
Provided To Others? Yes  
Impact The outcomes of this research initiated several discussions with potential GCRF collaborators in Africa for technology transfer. 
URL http://mccormick.bio.ed.ac.uk/software
 
Title Supporting data for "A photometric stereo-based 3D imaging system using computer vision and deep learning for tracking plant growth" 
Description Tracking and predicting the growth performance of plants in different environments is critical for future crop development, which is under dual pressure from population expansion and global climate change. Automated approaches for image capture and analysis have allowed for substantial increases in the throughput of quantitative growth trait measurements compared to manual assessments. Recent work has focused on adopting computer vision and machine learning approaches to improve the accuracy of automated plant phenotyping. Here we present PS-Plant, a low-cost and portable 3D plant phenotyping platform based on an imaging technique novel to plant phenotyping called photometric stereo (PS). We calibrated PS-Plant to track the model plant Arabidopsis thaliana throughout the day-night (diel) cycle and investigated growth architecture under a variety of conditions to illustrate the dramatic effect of the environment on plant phenotype. We developed bespoke computer vision algorithms and assessed available deep neural network architectures to automate the segmentation of rosettes and individual leaves, and extract basic and more advanced traits from PS-derived data, including the tracking of 3D plant growth and diel leaf hyponastic movement. Furthermore, we have produced the first PS data set, which includes 221 manually annotated Arabidopsis rosettes that were used for training and data analysis (1768 images in total). PS-Plant is a powerful new phenotyping tool for plant research that provides robust data at high temporal and spatial resolutions. The system is well-suited for small and large-scale research and will help to accelerate bridging of the phenotype-to-genotype gap. 
Type Of Material Database/Collection of data 
Year Produced 2019 
Provided To Others? Yes  
 
Description Stereoscopic camera array system - BRL, Centre for Machine Vision - UWE Bristol 
Organisation University of the West of England
Department Psychology
Country United Kingdom 
Sector Academic/University 
PI Contribution We are collaborating with the Centre for Machine Vision (CMV) at the Bristol Robotics Laboratory (BRL; Smith Lab) on the construction and software development of a stereoscopic camera array system for capturing real-time 3D images of plants.
Collaborator Contribution The CMV has designed and built the rigs, and delivered them to Edinburgh for testing. We collaborate closely with the CMV for building software to extract growth traits.
Impact The project commenced in Oct 2016. Thus far the rigs have been assembled and data is currently being generated and analysed. This is a multidisciplinary collaboration between computer scientists (CMV), biologists and engineers (Ed).
Start Year 2016
 
Description Stereoscopic camera array system - Edinburgh, Eng 
Organisation University of Edinburgh
Department School of Engineering
Country United Kingdom 
Sector Academic/University 
PI Contribution Currently collaborating on the construction and software development of a stereoscopic camera array system for capturing real-time 3D images of plants with UoE School of Engineering (Tsaftaris group).
Collaborator Contribution The Tsaftaris group brings specific expertise in machine learning and algorithm development for Arabidopsis plants. McCormick co-supervises a PhD student in the Tsaftaris group. We have shared access of data, protocols and biological materials between labs.
Impact The project commenced in Oct 2016. Thus far the rigs have been assembled and data is currently being generated and analysed. This is a multidisciplinary collaboration between computer scientists (CMV), biologists and engineers (Ed).
Start Year 2016
 
Description Stereoscopic camera array system - Edinburgh, SBS 
Organisation University of Edinburgh
Country United Kingdom 
Sector Academic/University 
PI Contribution Currently collaborating on the construction and software development of a stereoscopic camera array system for capturing real-time 3D images of plants with the UoE School of Biological Sciences (Halliday group).
Collaborator Contribution The Halliday group brings specific expertise in shade avoidance Arabidopsis mutants (Phy mutants) - the primary targets for data generation in this project. McCormick co-supervises a PhD student in the Halliday group. We have shared access of data, protocols and biological materials between labs.
Impact The project commenced in Oct 2016. Thus far the rigs have been assembled and data is currently being generated and analysed. This is a multidisciplinary collaboration between computer scientists (CMV), biologists and engineers (Ed).
Start Year 2015
 
Title Plant Magic software for leaf rosette area analysis 
Description Plant Magic is a software tool for analysing images of plants and calculating the leaf/rosette area under both night and day conditions. An associated hardware tool is used for night imaging. 
Type Of Technology New/Improved Technique/Technology 
Year Produced 2016 
Impact The software was developed by a Masters student (Andrei Dobrescu). Andrei has since been accepted as a PhD student co-supervised by McCormick and Sotirios Tsaftaris (UoE, Eng) to continue work on plant image analysis. The software is currently used by several labs at UoE and will shortly be released on the open Plant Image Analysis website (http://www.plant-image-analysis.org/). An additional hardware tool (a low-cost far-red LED array with a Raspberry Pi camera) was developed to capture images throughout the diel cycle. The method for building this setup is being prepared for publication.
 
Description Festival of Learning - virtual reality Rubisco 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach Regional
Primary Audience Undergraduate students
Results and Impact We applied our outreach tool developed in 2017 to reach a wider audience through the UoE Festival of Learning week. The virtual reality game teaches about the importance and shortcomings of Rubisco, the primary carboxylating enzyme in photosynthetic organisms. Gamification of Rubisco catalysis allowed engagement with the public on the importance of Rubisco and to discuss relatively complex biochemistry.
Year(s) Of Engagement Activity 2017,2018
URL http://www.festivalofcreativelearning.ed.ac.uk/event/photosynthesis-virtual-reality
 
Description GCRF One Planet Workshop at ILRI-BecA (Kenya). 
Form Of Engagement Activity A formal working group, expert panel or dialogue
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact Attended the GCRF One Planet Workshop at ILRI-BecA in Kenya. This was an international meeting attended by research and business leaders from over ten countries to discuss opportunities for high-impact research collaboration.
Year(s) Of Engagement Activity 2017
 
Description International Institute of Tropical Agriculture (IITA) workshop (Zambia) 
Form Of Engagement Activity A formal working group, expert panel or dialogue
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Professional Practitioners
Results and Impact Soybean is a principal food crop in sub-Saharan African countries such as Zambia, Malawi and Mozambique. Current soybean varieties consistently underperform, which continues to impact negatively on sustainable food security. We discussed with the IITA the potential for GCRF-associated collaboration to adapt the digital imaging tool developed in BB/N02334X/1 to provide a step change in precision breeding capacity in the Zambian soybean breeding community.
Year(s) Of Engagement Activity 2018
 
Description KTN Emerging Imaging Technologies in Agri-Food Workshop in Birmingham 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Industry/Business
Results and Impact Wenhao Zhang presented our work on using machine vision and machine learning in agri-tech at the KTN Emerging Imaging Technologies in Agri-Food Workshop in Birmingham.
Year(s) Of Engagement Activity 2018
URL https://ktn-uk.co.uk/news/emerging-imaging-technologies-in-agri-food
 
Description Midlothian Science Festival - virtual reality Rubisco 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach Regional
Primary Audience Public/other audiences
Results and Impact We developed a new, high-impact educational tool and participated in outreach for young school children (ages 4-12) and their parents during the Midlothian Science Festival in Scotland (http://midlothiansciencefestival.com/). The objective was to teach the public about the importance and shortcomings of Rubisco, the primary carboxylating enzyme in photosynthetic organisms. Gamification of Rubisco catalysis allowed engagement with the public on the importance of Rubisco and discussion of relatively complex biochemistry, even with four-year-olds!
Year(s) Of Engagement Activity 2016
URL http://mccormick.bio.ed.ac.uk/amccormi/sites/sbsweb2.bio.ed.ac.uk.amccormi/files/MSF%202016%20Collag...
 
Description N8 Precision Ags and Robotics Doctoral Training Seminar 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Postgraduate students
Results and Impact A keynote talk was delivered by Wenhao Zhang at the N8 doctoral training seminar in Manchester on agri-robotics and automation.
Year(s) Of Engagement Activity 2018
URL https://www.n8research.org.uk/
 
Description Participation in an activity, workshop or similar - RSB Science Week - virtual reality Rubisco 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach Regional
Primary Audience Schools
Results and Impact We applied for funding from the RSB to host a co-ordinated exhibition at the Midlothian Science Festival during RSB Biology Week in 2018, using our VR outreach tool developed in 2017. The virtual reality game teaches about the importance and shortcomings of Rubisco, the primary carboxylating enzyme in photosynthetic organisms. Gamification of Rubisco catalysis allowed engagement with the public on the importance of Rubisco and discussion of relatively complex biochemistry.
Year(s) Of Engagement Activity 2018
URL https://www.rsb.org.uk/get-involved/biologyweek
 
Description Virtual presentation to UK-RAS Strategic Task group in Agri-Robotics 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Professional Practitioners
Results and Impact Melvyn Smith gave a virtual presentation on 29th September to the UK-RAS Strategic Task Group in Agri-Robotics that included our work on automated weed detection, cattle condition monitoring, and pig face recognition and expression detection.
Year(s) Of Engagement Activity 2020
 
Description White paper - Agricultural Robotics: The Future of Robotic Agriculture 
Form Of Engagement Activity A formal working group, expert panel or dialogue
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Policymakers/politicians
Results and Impact Contributed to the UK-RAS Network white paper "Agricultural Robotics: The Future of Robotic Agriculture" (UK-RAS Network White Papers, ISSN 2398-4414).
Year(s) Of Engagement Activity 2018
URL https://arxiv.org/abs/1806.06762