Better water quality to protect the Great Barrier Reef

06/02/2025

6 minutes

oceans and technology

In Australia, watercourses flowing into the Great Barrier Reef are closely monitored. The sensors used to measure their water quality are high-performance instruments, but technical anomalies and environmental hazards can bias the data they collect. Reducing these errors to improve results and their interpretation is crucial to the preservation of this unique ecosystem.

By Laurie Henry

Cover photo: The Great Barrier Reef © CC0 Public Domain

The Great Barrier Reef is a vital ecosystem: it provides a natural habitat for marine species, acts as a filter between fresh and coastal marine waters, and helps stabilize coastlines. But this vast area is also highly sensitive and vulnerable to land-based pollution arriving via the rivers along its coastline. Agriculture, urbanization and extreme climatic events discharge sediments, pesticides and excess nutrients into these waterways, threatening marine biodiversity.

To monitor these pollutants, in situ sensors measure water quality in real time, but can be subject to a number of measurement uncertainties. That’s why a team led by scientists at Queensland University of Technology (QUT) has developed an advanced statistical method for identifying and correcting these errors, and proposed a model capable of distinguishing genuine environmental changes from measurement faults.

High-performance but sensitive sensors

Digital sensors installed in rivers and streams play a key role in monitoring water quality, particularly through their ability to detect variations in sediment, nutrients or pollutants that disrupt aquatic ecosystems. These devices often operate continuously, transmitting data in real time to scientists and water managers.

Opus TriOS optical water quality sensor and installation on Broadwater Creek in the Abergowrie Day Use Area/State Forest in the Herbert River (site 1,160,115) © Water Resources Research (2024)

However, despite their sophistication, these sensors are susceptible to a number of technical problems that affect measurement reliability. External elements such as rocks, organic debris or aquatic organisms can obstruct their optical lenses and disrupt readings. Other factors, such as biofouling from the accumulation of algae or bacteria, calibration errors, or electronic malfunctions linked to battery problems, complicate data interpretation and can lead to inappropriate environmental management decisions.

A method based on spatio-temporal analysis

To improve the reliability of the data collected by the sensors, the researchers developed an advanced statistical model capable of automatically detecting anomalies in the measurements. The model analyzes the spatial and temporal correlations of the readings: it compares variations in water quality at a given point with those observed at other locations in the hydrographic network and at other times.
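The screening idea can be sketched in a few lines. The snippet below is an illustrative simplification, not the authors' model: it flags a reading as anomalous when it deviates strongly from a neighborhood built from the same sensor's recent readings and the other sensors' simultaneous readings (the function name and threshold are hypothetical choices).

```python
import numpy as np

def flag_anomalies(readings, threshold=3.0):
    """Flag readings that deviate strongly from their spatio-temporal
    neighborhood. `readings` has shape (n_sensors, n_times).
    Illustrative simplification of a spatio-temporal anomaly check."""
    n_sensors, n_times = readings.shape
    flags = np.zeros_like(readings, dtype=bool)
    for s in range(n_sensors):
        for t in range(n_times):
            # Temporal neighbors: the same sensor's nearby readings,
            # excluding the point being tested.
            before = readings[s, max(0, t - 3):t]
            after = readings[s, t + 1:t + 4]
            # Spatial neighbors: the other sensors at the same time step.
            others = np.delete(readings[:, t], s)
            neighborhood = np.concatenate([before, after, others])
            mu, sigma = neighborhood.mean(), neighborhood.std()
            # Flag points far outside the local distribution.
            if sigma > 0 and abs(readings[s, t] - mu) > threshold * sigma:
                flags[s, t] = True
    return flags
```

A genuine environmental event, such as a flood, raises readings at several sensors at once, so the spatial neighborhood moves with the point and it is not flagged; an isolated sensor fault stands out against both its neighbors in space and its own recent history.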

To do this, the researchers used statistical models to identify and correct sensor errors. Their method distinguishes genuine variations in water quality from technical anomalies, such as measurements that are suddenly too high or gradual data drift. To validate their approach, they carried out an in-depth study on the Herbert River, a Queensland river flowing into the Great Barrier Reef. They used a network of 10 Opus TriOS sensors, measuring water turbidity (amount of suspended sediment) and water level, at 30-minute intervals. Data collected between June 2021 and March 2022 were transmitted via the 4G-CATM1 mobile network to a centralized database.
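As a rough illustration of the distinction the method draws, the sketch below separates a sudden spike from a gradual drift using a robust deviation test and a linear trend fit. The function name and thresholds are hypothetical, and the study's actual model is a full spatio-temporal statistical framework, not this heuristic.

```python
import numpy as np

def classify_anomaly(series, spike_k=5.0, drift_slope=0.05):
    """Classify a turbidity window as 'spike', 'drift', or 'normal'.
    Illustrative sketch only; thresholds are arbitrary examples."""
    series = np.asarray(series, dtype=float)
    median = np.median(series)
    # Median absolute deviation, floored to avoid division issues.
    mad = np.median(np.abs(series - median)) or 1e-9
    # Sudden spike: any point far from the robust center of the window.
    if np.max(np.abs(series - median)) > spike_k * mad:
        return "spike"
    # Gradual drift: a sustained linear trend across the window.
    slope = np.polyfit(np.arange(len(series)), series, 1)[0]
    if abs(slope) > drift_slope:
        return "drift"
    return "normal"
```

The point of separating the two is operational: a spike is usually a transient obstruction or electronic glitch that can be discarded, while drift suggests biofouling or calibration loss that requires maintenance.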

A sub-catchment of the Lower Herbert River in Far North Queensland, Australia. The locations correspond to the ten sensors in the study. © Edgar Santos‐Fernandez et al., 2024

Comparing their results with manual surveys carried out by experts from the Queensland Department of Environment and Science, they demonstrated that their model significantly improved anomaly detection, correctly identifying over 93% of turbidity spike anomalies, compared with around 81% for traditional approaches. The new method also produced fewer false alarms, avoiding cases where sensor errors are mistakenly reported as natural events such as flash floods.
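Comparison figures like these come from checking a method's binary anomaly labels against expert-validated ones. A minimal helper for computing the two rates might look like this (an illustrative sketch, not the authors' code):

```python
def detection_metrics(truth, predicted):
    """Recall (share of true anomalies detected) and false-alarm rate
    (share of normal points wrongly flagged) for binary labels."""
    tp = sum(t and p for t, p in zip(truth, predicted))          # hits
    fn = sum(t and not p for t, p in zip(truth, predicted))      # misses
    fp = sum((not t) and p for t, p in zip(truth, predicted))    # false alarms
    tn = sum((not t) and (not p) for t, p in zip(truth, predicted))
    recall = tp / (tp + fn) if tp + fn else 0.0
    false_alarm = fp / (fp + tn) if fp + tn else 0.0
    return recall, false_alarm
```

The trade-off between the two rates is what matters in practice: a detector can trivially reach 100% recall by flagging everything, so the reduction in false alarms is as important as the higher detection rate.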

By integrating these algorithms into a real-time monitoring system, the researchers were able to reduce the uncertainties and biases of conventional models, while reducing the need for human intervention to correct the data.

A concrete application for water management

Improved reliability of monitoring data enables more effective management of watercourses through better identification of pollution sources, whether agricultural run-off, industrial discharges or urban effluents. In concrete terms, in watersheds where intensive farming contributes to soil erosion, more accurate detection of turbidity peaks enables authorities to be alerted quickly and agricultural practices to be adapted accordingly, such as the installation of vegetated buffer zones or changes to spreading schedules. Similarly, in urban environments, this technology can help monitor the effects of stormwater on river quality and identify areas where water management infrastructures need to be reinforced.

The application of this approach is not limited to the simple detection of anomalies. By integrating these corrected data with hydrological and climatic models, researchers can refine their forecasts of medium- and long-term water quality trends. As a result, water managers can better anticipate critical periods, such as heavy rainfall events that encourage soil leaching, and take appropriate preventive measures.

Time series of total suspended solids (TSS) data and anomalies. Abnormal data points are colored according to eight types. NA refers to non-abnormal data. © Edgar Santos‐Fernandez et al., 2024

Moreover, by reducing the need for human intervention to correct data, this method optimizes the resources of environmental agencies and improves the speed of decision-making. “Water quality challenges are increasingly pressing on a global scale. Our approach provides an essential tool for environmental monitoring and decision-making based on reliable data,” emphasizes Distinguished Professor Kerrie Mengersen, Director of QUT’s Data Science Centre.

A model for other environments

The effectiveness of the model developed by the researchers is not limited to the rivers feeding the Great Barrier Reef. Many regions of the world face the same challenges in terms of the reliability of water quality sensors, whether they be urban rivers subject to industrial and domestic discharges, large agricultural basins where pollutants are transported by runoff, or arid zones where water resource management is critical. In urban environments, this model could improve the detection of accidental contamination due to oil spills or wastewater leaks. In areas sensitive to climate change, it would enable more detailed analysis of changes in salinity, turbidity and dissolved oxygen levels, essential parameters for anticipating the effects of droughts and storms on aquatic ecosystems.

Integrating this method into international environmental monitoring networks would represent a significant step forward for the sustainable management of water resources. By coupling this statistical framework with other artificial intelligence tools, the researchers plan to further automate data analysis and optimize intervention strategies. In particular, this approach could be applied to the monitoring of the world’s major rivers, such as the Amazon or the Mississippi, where thousands of sensors record billions of data points every day. By standardizing the identification of anomalies and making measurements more reliable, this model offers a solution that can be adapted to the specific needs of each region. As pressure on water resources intensifies worldwide, its widespread use could become a major lever for improving water management and preserving the most vulnerable aquatic ecosystems.


Source : Edgar Santos‐Fernandez et al, “Unsupervised Anomaly Detection in Spatio‐Temporal Stream Network Sensor Data”, Water Resources Research (2024).

