Decisions around nitrogen application are a major determinant of productivity and profitability. As such, they are an important target for innovation.
Among the strategies developed to aid decision-making is the use of precision agriculture (PA) to deliver on the ‘Four Rs’: putting the right amount of the right product in the right place at the right time. However, this approach requires processing and analysing data from different sources and in complex ways.
Several digital nitrogen management tools have been developed over the years, but they have relied on simplifying assumptions that tend to lessen grower confidence in these aids.
A GRDC investment, the Future Farm initiative, was established to compare, assess and improve on available nitrogen management technology. It was built around three major themes:
- sensing systems and data acquisition;
- a method for nitrogen rate prescriptions that combines sensor-derived data with other sources of information, such as historic production results, seasonal weather and financial or market signals; and
- autonomous operational capability, including a robotic platform, to action the output of the sensing and decision-making system.
With the first two phases (which involved a review and field-based research) now complete, outputs are being commercialised in the third phase.
This work was initially led by Dr Rob Bramley and is now led by Dr Roger Lawes (both from CSIRO) in partnership with PCT AgCloud (a PA software and analytics services company) and with scientific support from the University of Sydney.
Simplification challenges
Dr Bramley explains that the conventional approach to nitrogen decisions draws on mechanistic agronomic knowledge of crop production, such as charts based on nutrient balances or generalised fertiliser response curves.
In practice, however, implementation often reduces to a simplistic ‘rule of thumb’ translation of that agronomic knowledge.
More advanced decision support systems (DSS) based on crop models (such as Yield Prophet™ and its parent, APSIM) can be very ‘data hungry’. They can also rely on inputs that are difficult and expensive to measure, or that vary spatially; for example, soil water availability and soil nitrogen status.
Consequently, a DSS is often run with a ‘best guess’ set of input parameters – for example, using soil properties from a ‘nearby’ soil profile that might actually be some distance from the paddock of interest. This uncertainty has resulted in low grower confidence, with one survey finding that only 26 per cent of growers use a DSS to make nitrogen fertiliser decisions.
The advent of remote and proximal sensing technology created opportunities to improve DSS performance, but sensors need to be calibrated. While some of these technologies – such as yield monitors and soil pH sensors – are straightforward to calibrate, others are not. As a result, none of the traditional DSS, such as Yield Prophet, takes sensor data as input.
Difficulties around calibration become especially acute when sensors make surrogate measurements that require analytics – including machine learning approaches – to derive the input data.
For example, the commonly used normalised difference vegetation index (NDVI) – which can be obtained through remote and proximal multispectral crop sensing – is a surrogate measure of photosynthetically active biomass, which relates closely to the size and health of the crop canopy.
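For readers unfamiliar with the index, NDVI is calculated from the red and near-infrared (NIR) reflectance recorded by such sensors. The short sketch below shows the standard calculation; the reflectance values are illustrative only and are not Future Farm data.

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalised difference vegetation index: (NIR - Red) / (NIR + Red).

    Values fall between -1 and 1; a dense, healthy canopy typically
    reads above about 0.6, while bare soil sits nearer 0.1 to 0.3.
    """
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red)

# Illustrative reflectance values for three pixels: two well-covered, one sparse
print(ndvi(np.array([0.45, 0.50, 0.20]), np.array([0.08, 0.10, 0.15])))
```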
“When it comes to surrogate measures, it is more accurate to describe these as ‘predictions of attributes’ rather than as tools that can be calibrated,” Dr Bramley says.
“Despite what you might hear, NDVI is not a measure of plant nitrogen status, although under some circumstances it might be correlated with it, and so can be used to predict it. Such predictions, however, are subject to error and site or seasonal variation.”
The complexity introduced by variability highlights a final issue. Traditional nitrogen recommendations tend to simplify complex agronomic interactions that change over time. As such, most common approaches have not been designed for the accuracy expected for precision nutrient management using variable-rate application.
Future Farm
An alternative approach becomes possible by moving away from sensor-based prediction of single crop attributes towards a multivariate approach that recognises the many variables interacting in a paddock.
Dr Lawes explains that a multivariate approach means using data from many sensors to drive a model-based strategy.
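As an illustration of what that can look like in practice, several co-located data layers can be combined into a single model that predicts the yield response to nitrogen at each site. The sketch below is not the Future Farm model itself; the choice of sensor layers, the synthetic data and the random forest are all assumptions made for the example.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(42)

# Synthetic stand-in for paddock observations: each row is a site
# with several co-located sensor and environmental layers.
n = 500
X = np.column_stack([
    rng.uniform(0.2, 0.9, n),   # NDVI from multispectral crop sensing
    rng.uniform(50, 250, n),    # soil apparent EC (mS/m), a texture proxy
    rng.uniform(1.0, 5.0, n),   # historic yield (t/ha) from yield maps
    rng.uniform(100, 400, n),   # growing-season rainfall (mm)
    rng.uniform(0, 120, n),     # applied nitrogen (kg N/ha)
])
# Toy yield response: rises with N but saturates, scaled by site condition
y = (3 * X[:, 0] + X[:, 3] / 200) * (1 - np.exp(-X[:, 4] / 60)) \
    + rng.normal(0, 0.2, n)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

# Predict one site's yield response across candidate nitrogen rates
site = [0.6, 150.0, 3.2, 250.0]  # NDVI, EC, historic yield, rainfall
for rate in range(0, 121, 20):
    pred = model.predict([site + [float(rate)]])[0]
    print(f"N = {rate:3d} kg/ha -> predicted yield {pred:.2f} t/ha")
```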
Importantly, it also requires on-farm experimentation to help target decisions to the site of implementation.
“In this case, ‘nitrogen strips’ are designed to test the yield and protein response to applied nitrogen,” Dr Lawes says. “This is an essential component of optimising decision-making because every farm and field is different.”
This approach was undertaken within the Future Farm initiative. Starting in 2018, nine large-scale trials were run across South Australia, Western Australia, Victoria and New South Wales, generating more than 1500 observations of crop response to nitrogen application.
The nitrogen rate that maximises partial profit – the economic optimum nitrogen rate (EONR) – was then determined at three different management scales: site, management class and whole paddock.
The EONR served as the benchmark nitrogen application rate against which other methods were compared. Several simpler methods using indices such as NDVI were also included for comparison.
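The economics behind the EONR can be shown with a simple worked example. Assuming a quadratic yield response fitted to strip-trial observations (the coefficients and prices below are hypothetical, not Future Farm figures), the EONR is the rate at which the marginal revenue from extra grain equals the marginal cost of the nitrogen that produced it:

```python
# Hypothetical quadratic yield response fitted to strip-trial data:
# yield (t/ha) = a + b*N + c*N**2, with N in kg N/ha
a, b, c = 2.0, 0.030, -0.00012

grain_price = 350.0  # $/t, illustrative
n_cost = 1.5         # $/kg N, illustrative

def partial_profit(n_rate):
    """Grain revenue minus nitrogen cost (other costs held constant)."""
    grain_yield = a + b * n_rate + c * n_rate ** 2
    return grain_price * grain_yield - n_cost * n_rate

# Setting d(profit)/dN = 0:
#   grain_price * (b + 2*c*N) = n_cost  =>  N* = (n_cost/grain_price - b) / (2*c)
eonr = (n_cost / grain_price - b) / (2 * c)
print(f"EONR = {eonr:.0f} kg N/ha, partial profit ${partial_profit(eonr):.0f}/ha")

# The yield-maximising rate (-b / 2c) is higher, but less profitable
n_ymax = -b / (2 * c)
print(f"Yield-max rate = {n_ymax:.0f} kg N/ha, "
      f"partial profit ${partial_profit(n_ymax):.0f}/ha")
```

Because the fitted coefficients differ from place to place, the same calculation yields different optimum rates at the site, management-class and whole-paddock scales.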
The analysis found that partial profit increases as prediction accuracy increases, but with diminishing returns: each further gain in accuracy delivers a smaller gain in profit.
Data-driven versus grower expertise
The growers collaborating in Future Farm were recognised as being skilled at optimising fertiliser use. They had very little yield gap and were confident in their nitrogen decision-making.
The best-performing Future Farm method was the data-driven model with a full dataset (‘DD data abundance’) applied at the site-specific scale.
At this scale, the data-driven model delivered a five per cent increase in partial profit over the collaborating growers’ own management. Dr Lawes says a five per cent improvement can reasonably translate into a $50 per hectare increase. Over a 2000-hectare cropping program, that amounts to a $100,000 annual gain.
The analysis also found that using this method at the whole-paddock scale produced only a one per cent improvement over grower practice.
The results showed relatively little value in a single-sensor approach, or in a data-driven approach with limited input data.
“Overall, it appeared that the only way to improve the accuracy and profitability of fertiliser decisions is by increasing the spatial resolution from whole-paddock scale to management zones or, better still, continuous variable rate,” Dr Lawes says.
A detailed description of the field-based work in phase two can be found in a publication from the Future Farm team, available here.
Commercialisation and a new way forward
The researchers are now confident that the key to achieving the required spatial resolution is automated on-farm experimentation, such as the strip trials used in this project.
“There is no impediment to strip trials being scaled out across the country and seamlessly implemented by growers every season,” Dr Lawes says. “There is also scope to build the required datasets at field or farm scale, among groups of neighbours in a region.”
The training component of Future Farm’s data-driven decision method is being implemented by PCT AgCloud.
Managing director Andrew Smart says the company was launched as a precision agriculture consulting firm in 2001. Today, it also includes product development in collaboration with research organisations. It has developed connections and platforms with a range of commercial partners, which Mr Smart says gives new products the best chance of being commercialised quickly and cost-effectively.
He adds that the goal is to overcome the limitations and error potential associated with conventional mechanistic approaches to deriving fertiliser recommendations, such as:
- oversimplification of reality in models;
- an associated lack of valuable input data;
- spatial variability; and
- inability to predict the weather.
“To do this well, we need to consider as many potentially useful data sources as possible, both on-farm and off-farm, and underpin our process with on-farm experimentation,” Mr Smart says.
“The current commercialisation project is seeking to package these attributes in a usable form that growers can have confidence in.”
The commercialisation process has three core activities:
- rewriting the Future Farm analytical code in a format that enables easy integration into the PCT platform;
- running a large number of field trials with PCT’s commercial clients to build a database of trial results; and
- analysing results from phase three trials.
“The objective is to identify a method that performs at all management scales and points to a future where farm businesses that collect, maintain and even share production response and resource data will push closer towards achieving season and site-specific optimums,” Dr Lawes says.
The lessons learnt from Future Farm about the strategic benefits of on-farm experimentation are being applied in other GRDC projects: modelling the impacts of residual herbicides, developing prescriptions for variable-rate herbicide application based on soil layers, and advancing other PA innovations in agronomy and crop protection.
More information: Roger Lawes, roger.lawes@csiro.au