Bretton Davis, Senior Digital Consultant, Arup, explains how utilities can truly unlock their data potential.

Throughout the water cycle, our industry has a robust history of collecting operational asset data. Where it matters, we are very good at sensing and monitoring our catchments, systems and assets. What is less clear is how to maximise the value of that data. This is an industry-wide challenge: how do we unlock its full potential for the benefit of our customers, the environment and our assets?

Consider digital twin as a methodology, not a technology

When you hear the term ‘digital twin’, your mind might typically visualise an AI- and cloud-enabled hi-tech future. While this isn’t untrue, and these technologies are great catalysts for the adoption of this approach, your mind has already skipped over what digital twinning means and arrived immediately at what it thinks is the solution.

Digital twinning is about designing processes where a ‘physical’ asset works alongside a ‘digital’ asset as a twin. It’s the transfer of timely data from the physical to the digital, and the return of actionable insight from the digital to the physical, that makes the twin. The technology providing the ‘digital asset’ is specific to the problem you need to solve. AI is optional. The fastest way to unlock your data’s potential is by understanding which processes will benefit from twinning, and why.
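To make that loop concrete, here is a minimal sketch of a twin cycle in Python. Everything in it is illustrative: the sensor reading, the forecasting model and the threshold are stand-ins, not a specific product or project.

```python
import random
import time

HIGH_LEVEL_M = 2.5  # illustrative action threshold, not a real site value

def read_sensor_level() -> float:
    """Stand-in for telemetry from the physical asset (e.g. a level gauge)."""
    return random.uniform(1.0, 3.0)

def forecast_level(current_level: float) -> float:
    """Stand-in for the digital asset: any model fit for the problem,
    from a simple persistence forecast to a machine-learning model."""
    return current_level * 1.1  # naive trend assumption for illustration

def notify_operator(message: str) -> None:
    """Stand-in for returning actionable insight to the physical side."""
    print(message)

while True:
    level = read_sensor_level()        # physical -> digital
    predicted = forecast_level(level)  # the 'digital asset' at work
    if predicted > HIGH_LEVEL_M:       # digital -> physical
        notify_operator(f"Forecast level {predicted:.2f} m exceeds "
                        f"{HIGH_LEVEL_M} m: prepare the site.")
    time.sleep(15 * 60)                # 'timely' is problem-specific
```

The point is the loop, not the technology: forecast_level could equally be a hydraulic model, a statistical method or AI, whichever fits the problem.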

Start with the users. What do they need to enable their outcomes?

Unless you are starting with a fully automated system, somewhere in your twin cycle sits an asset manager and/or operator; the twin cycle will not drive itself. This key factor for success is often undervalued.

Arup recently supported an organisation in the North East of England which needed to forecast water levels to create resilient operations on its sites. Knowing that forecasting the water levels does not create resilience by itself, we took a user-based approach. Who are the people taking action and creating resilience, and what do they need from the forecast? How should it be communicated? At what intervals? And by what means? If we do not focus on the user, we risk just generating more noise in the data pool.
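As a hedged sketch of what this looks like in practice (the threshold, interval and wording below are hypothetical, not the project's actual requirements), the same forecast can be reshaped into the brief, actionable message a user needs at the moment they act:

```python
import pandas as pd

# Illustrative hourly water-level forecast in metres.
forecast = pd.Series(
    [1.8, 2.1, 2.4, 2.7, 2.6, 2.3],
    index=pd.date_range("2024-01-01 06:00", periods=6, freq="h"),
)

ACTION_LEVEL_M = 2.5  # hypothetical level at which the operator must act

# An operator checking in once per shift needs a summary, not the raw
# series: when is the action level first crossed, and how high does it go?
exceedances = forecast[forecast > ACTION_LEVEL_M]
if exceedances.empty:
    print("No action needed this shift: forecast stays below the action level.")
else:
    print(f"Action: level forecast to exceed {ACTION_LEVEL_M} m "
          f"at {exceedances.index[0]:%H:%M}, peaking at {forecast.max():.1f} m.")
```

The same forecast could just as well be rendered as a daily email, an SMS alert or a dashboard tile; the interval and the means of delivery should come from the users, not the technology.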

Look at the bigger picture

Expand your system boundary: the key piece of data may be out in the catchment and not on your asset. Is there another data provider that could offer valuable insight for your system?

While supporting a site that abstracts river water for treatment, we needed to understand the nutrient loads arriving from the environment. AI was used to predict nitrate spikes, so the site could be operationally ready and optimise performance in between.

Some things, like rainfall and water level, were obviously connected to the nitrate spikes experienced. But what else needed to be considered to get the AI to reliably predict this behaviour? Was it sewer overflow discharge? On testing, the data science found that link to be weak. A strong correlation with the catchment soil moisture deficit was observed instead. This made sense: the nitrate was running off the local agricultural fields, which only discharged it when specific precipitation and soil conditions aligned.
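As a hedged illustration of that screening step (the column names, lags and synthetic data below are illustrative; the project's actual analysis will have been more involved), candidate drivers can be ranked by their lagged correlation with the nitrate signal:

```python
import numpy as np
import pandas as pd

# Synthetic stand-in for the joined catchment dataset; in practice this
# would be hourly telemetry plus third-party catchment data.
rng = np.random.default_rng(0)
n = 24 * 365
rain = rng.gamma(0.5, 2.0, n)
smd = 50 - 0.2 * pd.Series(rain).rolling(72, min_periods=1).sum().to_numpy()
# Nitrate only spikes when rain falls on an already-wet catchment (low SMD).
nitrate = np.where((rain > 2) & (smd < 30), 0.8 * rain, 0.5) + rng.normal(0, 0.1, n)

df = pd.DataFrame({
    "rainfall_mm": rain,
    "soil_moisture_deficit_mm": smd,
    "sewer_overflow_m3": rng.gamma(0.3, 1.0, n),  # independent here, hence the weak link
    "nitrate_mg_l": nitrate,
})

candidates = ["rainfall_mm", "soil_moisture_deficit_mm", "sewer_overflow_m3"]

# Runoff takes time to reach the intake, so test several lags,
# not just the instantaneous relationship.
for lag_hours in (0, 6, 12, 24):
    corr = df[candidates].shift(lag_hours).corrwith(df["nitrate_mg_l"])
    print(f"lag {lag_hours:>2} h:", corr.round(2).to_dict())
```

A weak correlation (like the sewer overflow column here) tells you where not to spend modelling effort; a strong one (soil moisture deficit) points to the physics worth building into the twin.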

Bringing it all together on your asset base – Discover – Test – Implement – Repeat

Taking on this challenge can be a big task. A digital twin methodology built around an incremental approach is recommended:

  • Discover – Explore your problem space. Where can we create value quickly?
  • Test – Test your assumptions. Is the solution feasible? Does it fit our operational processes?
  • Implement – Get your solution deployed, start generating benefit.
  • Repeat – Are the outcomes as expected? Where can we create value next?

Whatever your operational challenge, adopting an incremental approach can help. It allows you to focus on generating value where it matters most, do it quickly and in an agile way, and build from there.