Water Sector Embracing Big Data
The water sector has collected reams of data for decades, but it’s only within the last few years that utilities, agencies, consultants and vendors have begun to use that data to improve everything from managing maintenance to predicting water flow.
The move to leverage digital information in the sector over the last two to three years is “drastic,” says Luis Casado, senior vice president of water for Gannett Fleming and one of several people who spoke passionately about the possibilities of water data at Water Environment Federation’s annual WEFTEC conference Oct. 1-3 in New Orleans.
Firms like Gannett Fleming, Arcadis, Brown and Caldwell, and Jacobs are taking previously underutilized information from supervisory control and data acquisition, or SCADA, systems, and pairing it with historic datasets and additional sensor data to create customized digital dashboards and applications for water agencies and related entities.
“It’s not a single piece of software, it’s an approach of how you look at data and how you merge that information and use it effectively in day-to-day operation,” said Kevin Stively, smart utility leader for Brown and Caldwell, in a presentation at the event. He said historical information can be layered on real-time information to help a younger workforce make the operational decisions that older workers relied on their “gut” to make.
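The "layering" Stively describes can be sketched in a few lines: compare a live sensor reading against the band its historical values define, so an operator sees at a glance whether a number is out of family. This is an illustrative sketch, not the firm's actual system; the function name, thresholds, and mean-plus-k-sigma band are all assumptions.

```python
# Minimal sketch of layering historical data under a real-time reading:
# flag a live value against the historical band for the same signal.
# The two-sigma band and return labels are illustrative choices.
from statistics import mean, stdev

def flag_reading(live_value: float, history: list[float], k: float = 2.0) -> str:
    """Compare a real-time reading to its historical distribution.

    Returns 'high', 'low', or 'normal' using a mean +/- k*stdev band."""
    mu, sigma = mean(history), stdev(history)
    if live_value > mu + k * sigma:
        return "high"
    if live_value < mu - k * sigma:
        return "low"
    return "normal"
```

In practice the history would be sliced by hour of day or season before computing the band, so a reading is judged against comparable conditions rather than the whole record.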
The Jordan Valley Water Conservancy District in Utah, a wholesale water provider, asked Brown and Caldwell for a way to compile its control data and make it more accessible. Using Microsoft Power BI, the company created an interactive site that pulled together information such as water usage, copper and lead test results, customer complaints, infrastructure age and planned capital expenditures. The system lets operators control the water system from anywhere and from any device.
Multiple types of data can enable predictive analytics to alert operators when condition-based maintenance is needed, monitor water loss and manage problems with water quality in real time. For example, using predictive analytics, a utility can monitor dissolved oxygen in water systems and automatically make adjustments based on weather, water flows and temperature, Stively says.
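A rule of the kind Stively describes, adjusting aeration when dissolved oxygen drifts from a temperature-aware target, might look like the sketch below. The saturation fit, target fraction, and flow-widened deadband are illustrative assumptions, not any utility's actual control logic.

```python
# Hypothetical sketch of a condition-based dissolved-oxygen (DO) rule:
# pick a control action by comparing the measured DO to a target that
# moves with water temperature, with a deadband that widens at high flow.
# All coefficients and thresholds here are illustrative.

def do_saturation(temp_c: float) -> float:
    """Approximate DO saturation (mg/L) in fresh water vs. temperature,
    using a common empirical cubic fit (approximate)."""
    return 14.652 - 0.41022 * temp_c + 0.007991 * temp_c**2 - 0.000077774 * temp_c**3

def aeration_adjustment(do_reading: float, temp_c: float, flow_mgd: float,
                        target_fraction: float = 0.6) -> str:
    """Return a control action based on measured DO vs. a temperature-aware target."""
    target = target_fraction * do_saturation(temp_c)
    # Higher flows blunt aeration effectiveness; widen the deadband slightly.
    deadband = 0.3 + 0.05 * max(flow_mgd - 10.0, 0.0)
    if do_reading < target - deadband:
        return "increase_aeration"
    if do_reading > target + deadband:
        return "decrease_aeration"
    return "hold"
```

A production system would fold in the weather forecast and upstream flow predictions that the article mentions, rather than reacting to the current reading alone.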
Pilot studies show that utilities could save as much as 12% in operational costs by creating a “smart” water utility, according to Stively’s presentation.
Arcadis, for example, used already available data to create a green infrastructure screening tool to find favorable sites for facilities, down to the block level. Using the tool, Columbus, Ohio, has cut its projected costs for green infrastructure to $29 million, from $41 million.
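Block-level screening of the sort the article attributes to the Arcadis tool amounts to scoring candidate sites on a few weighted criteria and ranking them. The criteria, weights, and caps below are invented for illustration; a real screen would draw on local GIS, soil, and drainage data.

```python
# Hypothetical block-level screening sketch: rank candidate city blocks
# for green-infrastructure suitability. Higher soil permeability, flatter
# slope, and more open area all score better. Weights are illustrative.

def screen_blocks(blocks: list[dict], top_n: int = 3) -> list[str]:
    """Return the IDs of the top_n most favorable blocks.

    Each block dict carries: 'id', 'permeability' (in/hr),
    'slope_pct', and 'open_area_sqft'."""
    def score(b: dict) -> float:
        perm = min(b["permeability"] / 2.0, 1.0)       # cap credit at 2 in/hr
        slope = max(1.0 - b["slope_pct"] / 10.0, 0.0)  # penalize steep blocks
        area = min(b["open_area_sqft"] / 5000.0, 1.0)  # cap at 5,000 sq ft
        return 0.4 * perm + 0.3 * slope + 0.3 * area
    ranked = sorted(blocks, key=score, reverse=True)
    return [b["id"] for b in ranked[:top_n]]
```

Running the screen over every block in a city, then pursuing only the highest-scoring sites, is how a tool like this can pull projected program costs down.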
Data from several similar assets can yield tweaks in designs or operations to make systems more efficient, says Fernando Pasquel, national director for stormwater and watershed management at Arcadis.
DC Water is seeking to take the concept to the extreme and is working toward creating a digital twin of the Anacostia watershed. The digital twin could be built from sources such as sensors and satellite imagery and could inform decisions about water quality and investments in the systems. According to experts at the conference, using a digital twin in such a way is five to 10 years in the future because of gaps in information and data processing capabilities.
A technological leap, with real-time data, is necessary for utilities to manage risk, says Casado. As recent tropical storms have demonstrated, relying only on historical data is "useless," he says.
Digital maps and models also allow utilities to share information with the public so it can understand what is at risk. The public might not care about the amount of phosphorus being removed from stormwater, but if utilities can show on an interactive map how the quality of water can affect beach closures, customers begin to understand the challenges and possibilities, says Mark VanAuken, stormwater practice leader at Arcadis.
But despite the speed at which sensors are being deployed and data is being gathered, “We are still in the very beginning stages of big data,” says Sabu Paul, senior project manager at Atkins, who presented an overview on big data analytics for watershed management.
Dwayne Young, chief of water data for EPA, said that for all of the water data to be compiled and accessible across boundaries, utilities and agencies need to develop standards, adopt a common language for hydrography and find a way to make all of the data searchable.
The end goal, he says, is to connect all water information together, and to tell different parts of the “water story.”