Artificial intelligence to provide fast, accurate early disease warning

A Western Australian project uses field data gathered by drone to train machine learning-based models to detect crop stress.
Photo: Mohammed Bennamoun

Western Australian researchers are exploring the use of remote sensing data and deep learning systems to provide earlier detection of stress and disease in crops.

In a GRDC-invested project, a multidisciplinary research team at the University of Western Australia (UWA) aims to provide grain growers with tools to assess the impact of frost, salt stress and diseases such as Fusarium head blight so they can treat them faster and more profitably. The project builds on existing collaborations in machine learning, genomics, pathology and plant breeding.

The technology uses multispectral and hyperspectral field data – including thermal, ultraviolet and infrared light – which has been gathered mainly by hyperspectral cameras mounted on drones. Images of crops undergoing a known degree of stress are used to train machine learning-based models to detect this stress in future images gathered on-farm.
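
For readers wanting a concrete sense of how such training works, the sketch below shows one generic way to fit a classifier to labelled spectra and then apply it to newly gathered imagery. The file names, band count and per-pixel labelling are illustrative assumptions, not the project's actual DeepCrop pipeline.

```python
# Minimal sketch only (not the project's DeepCrop code): train a classifier on
# labelled spectra from a hyperspectral cube and predict stress for new pixels.
# The data layout (bands-last cube, per-pixel labels) is an assumption.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Hypothetical inputs: a (height, width, bands) reflectance cube and a
# (height, width) label mask where 1 = stressed, 0 = healthy.
cube = np.load("field_cube.npy")        # e.g. shape (512, 512, 224)
labels = np.load("stress_labels.npy")   # e.g. shape (512, 512)

X = cube.reshape(-1, cube.shape[-1])    # one spectrum per pixel
y = labels.ravel()

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))

# Predict a stress map for a newly captured cube with the same band layout.
new_cube = np.load("new_field_cube.npy")
stress_map = model.predict(new_cube.reshape(-1, new_cube.shape[-1]))
stress_map = stress_map.reshape(new_cube.shape[:2])
```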

“Detecting stress early, before the visible symptoms appear, in a rapid and non-destructive way is possible because of the difference in the spectral response of healthy and stressed plants,” UWA Professor Mohammed Bennamoun explains.

“The value of this technology is high as existing methods still mostly rely on manually examining crops for visible indicators of stress or disease.

“For large commercial crops, such an approach is simply not practical. It also assumes that there are clear visible symptoms, which often only occur at middle or late stages of stress or infection.”

Professor Bennamoun notes that identifying plant stress and disease at a much earlier stage would enable better crop and system management, with significant financial and environmental benefits.

It would require replacing the existing, largely manual, crop inspection process with automated and objective measurement tools.

The team has collected a number of field datasets on plant stress from project collaborators. For example, a frost dataset was prepared by the Agricultural Research Federation, which provides a framework for sharing and integrating agricultural data between growers, agronomists and researchers.

The team developed a number of deep learning approaches within a framework it named DeepCrop, and reported state-of-the-art performance on, for example, a wheat Fusarium head blight (biotic stress) dataset and four wheat salt stress (abiotic stress) datasets.

In simple terms, Professor Bennamoun says the spectral data is fed into a “black box”, which analyses it and presents the outcomes in an understandable form. Deep learning systems also mimic the workings of the human brain in that they learn from the data.

The deep learning system learns the optimum way to perform a task, whether it is early detection of disease or of stress.

Professor Bennamoun says the team benefited from access to national infrastructure networks to assist with its work, including graphics processing units, which accelerate the processing of images.

Explainable AI

A significant part of the project was to produce outcomes which could be readily understood and assessed by growers – what the researchers call “explainable AI”.

“Our goal was to provide some explanation of what was going on inside the ‘black box’, which comes from an area of theoretical work which looks at how the computer is learning,” Professor Bennamoun says.

“If we are able to provide some of that knowledge to the grower, they will then know the extent to which the deep learning model can be relied on. If, for example, we can show the spectral band where the system detected the stress and reached its decision, other experts will be able to corroborate the explanations.

“The idea is that the user will understand the decisions that led to a certain outcome so they can choose their response with some certainty.”

As an example, he says a grower could go to the predictive model – which has already been trained to detect, say, frost – and feed their images into the system. It will produce its output or decision. If it shows frost or disease damage the grower can decide how to treat it.

Deep learning

In recent years, deep neural networks have received considerable interest because of their ability to provide highly accurate detection and prediction outputs. Deep learning is a sub-area of machine learning and AI that mimics the workings of the human brain (based on artificial neural networks) in processing data for use in detecting objects, recognising speech, translating languages and making decisions.
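
As a generic illustration only (the layer sizes and band count below are assumptions, not the UWA architecture), a small network that maps a single pixel's spectrum to a healthy-or-stressed prediction can be written in a few lines of PyTorch:

```python
# Generic illustration (assumed layer sizes, not the UWA/DeepCrop model):
# a small feed-forward network mapping a 224-band spectrum to two classes.
import torch
import torch.nn as nn

class SpectrumNet(nn.Module):
    def __init__(self, n_bands: int = 224, n_classes: int = 2):
        super().__init__()
        self.layers = nn.Sequential(
            nn.Linear(n_bands, 128), nn.ReLU(),
            nn.Linear(128, 64), nn.ReLU(),
            nn.Linear(64, n_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.layers(x)

net = SpectrumNet()
spectrum = torch.rand(1, 224)     # one dummy reflectance spectrum
logits = net(spectrum)
print(logits.softmax(dim=-1))     # class probabilities: healthy vs stressed
```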

The UWA team has exploited explainable AI to determine which spectral bands are the most relevant for identifying stressed plants. “With explainable AI, we can identify which wavelengths have a significant impact on the model’s decision.”
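
One common way of surfacing that information, shown below as a hedged sketch rather than the team's published method, is gradient-based saliency: measuring how strongly the model's "stressed" score responds to small changes in each spectral band.

```python
# Hedged sketch of band-level saliency (a generic explainable-AI technique,
# not necessarily the UWA team's method): rank spectral bands by how strongly
# they influence a trained network's "stressed" score.
import torch
import torch.nn as nn

n_bands = 224                                 # assumed number of camera bands
model = nn.Sequential(                        # stand-in for a trained model
    nn.Linear(n_bands, 64), nn.ReLU(), nn.Linear(64, 2))
model.eval()

spectrum = torch.rand(1, n_bands, requires_grad=True)  # one dummy spectrum
score = model(spectrum)[0, 1]                          # the "stressed" class score
score.backward()

band_importance = spectrum.grad.abs().squeeze()        # one value per band
top_bands = band_importance.topk(5).indices
print("most influential band indices:", top_bands.tolist())
```

Bands with large gradient magnitudes are the wavelengths the model leant on most, which an agronomist or pathologist can then compare against known spectral signatures of the stress in question.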

The team comprises experts in machine/deep learning, agronomy, applied genomics and bioinformatics. It includes Professor Bennamoun (computer vision and machine/deep learning), Professor Dave Edwards (crop genomics), Professor Farid Boussaid (smart sensors and machine learning), Dr Philipp Bayer (biological sciences), postdoctoral research fellows Dr Lian Xu and Dr Lin Wu, as well as PhD student Wijayanti Nurul Khotimah.

More information: Professor Mohammed Bennamoun, mohammed.bennamoun@uwa.edu.au
