We are witnessing a dramatic increase in extreme weather events threatening the electric grid. Studies show that extreme weather events are increasing in frequency, and 83% of utility companies expect high-impact extreme weather to affect future grid stability. The potential consequences are devastating: the electrification of transport, growing industrial dependency, and smart household appliances are transforming the electric grid into a single point of failure across business, home, and transport.
We have recently seen Hurricane Laura take out 219 transmission lines and 292 substations in the US, while Winter Storm Uri cost an estimated $130 billion and left 4 million people without power in Texas. In such a dynamic operational landscape, power grid operators need a consolidated, real-time overview of the vulnerabilities and hazards in their networks, and the ability to assess and respond quickly when disaster strikes.
Yet central oversight of power grids is being reduced by the growing fragmentation of renewable electricity generation. Initiatives such as the UN 2030 Agenda for Sustainable Development and the EU 2030 Climate and Energy Framework are helping to accelerate ‘distributed generation’. Utilities are getting behind this effort by sponsoring community solar projects, with examples such as Sacramento Municipal Utility District’s SolarShares and the Bright Tucson Community Solar Program. ‘Interconnection’, which allows small-scale renewable energy projects to connect to the electric grid, will mean that national grids increasingly draw on these local, community-run power sources. We are similarly decentralizing energy storage through technologies such as vehicle-to-grid, which transform homes and cars into batteries. Experts predict that renewably powered grids will need to draw on many local power sources to remain agile and resilient against hazards.
A little-known consequence of the decentralization of energy is the fragmentation of the network data traditionally used to monitor and manage electric grids. Utilities are increasingly deploying Internet of Things (IoT) networks, with built-in processing, connectivity, and sensing capabilities across their grids, to create a digital twin of the physical network. Yet this abundance of data sources is now fragmented across a rapidly expanding set of power sources. Data on everything from as-builts and repairs to damage and degradation was already difficult to capture, and our new network realities make it more challenging still. With generation and storage capacity splintered among far more people and places, ‘single points of failure’ are increasingly difficult to identify and mitigate.
For example, Texas prides itself on a diverse and decentralized energy grid, including abundant solar power and America’s largest wind energy production capacity. Yet Winter Storm Uri exposed a ‘single point of failure’ across interconnected power sources, freezing generation and distribution systems from wind turbines to gas pipelines. The move away from centralized generating capacity towards local power sources means that what happens in one place increasingly matters everywhere else.
One possible cause was the failure to learn lessons from the last deep freeze a decade earlier, which had demonstrated that the grid was vulnerable to cold weather. Utilities need to collectively absorb and apply such lessons across all parts of the network. However, operators struggle to institutionalize and act on vital insights because network assets are often still represented in non-interactive ways, or the data is siloed, making it inaccurate or impossible to collate rapidly. We have worked with utility companies where as many as 50% of as-builts are incorrect, and records of damage or degradation are similarly out of date. Some power companies still rely on paper-based network maps, Excel spreadsheets, or standalone applications. These are relics of centralized fossil fuel grids and cannot easily incorporate live data from myriad local sources.
Japanese utilities have extensive experience of contending with extreme weather events, from earthquakes to typhoons. One power giant, TEPCO, has responded by creating a geospatial digital twin of its network that is user-friendly, open, and accessible from any mobile device or web browser, so it can be rapidly updated by workers in the field. This mobile geospatial strategy mirrors the decentralized nature of the modern grid, creating a comprehensive and current overview of utility grid damage and hazards. It can be instantly populated with details of new builds, repairs, or upgrades, as well as damage, degradation, or risks. When Typhoon Faxai damaged the network, the system allowed operators to rapidly view and update critical network information, blackout locations, and damage in any location.
This helps target resources efficiently and effectively, and it means that insights from damage to one asset can inform strategies to protect others. In Texas, this kind of integrated, up-to-date network data could have helped apply the lessons of previous winters to model the impact of future freezes and implement strategies such as maintaining reserve margins, improving insulation, or heat-tracing pipes. Ultimately, geospatial network data can be integrated with other datasets, such as local hazard or weather trend data, to create proactive, ‘predictive’ grids that continually anticipate and avert hazards before they arise.
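To make the idea of a ‘predictive’ grid concrete, here is a minimal sketch of how geospatial asset data might be joined with a weather forecast to flag at-risk equipment before a freeze. This is not TEPCO’s or IQGeo’s actual system; the asset names, coordinates, forecast cells, and threshold below are all hypothetical, chosen only to illustrate the technique.

```python
# Toy sketch: join a geospatial asset register with a temperature
# forecast and flag non-winterized assets in cells forecast to freeze.
# All names, coordinates, and thresholds are hypothetical.

from dataclasses import dataclass

@dataclass
class Asset:
    asset_id: str
    lat: float
    lon: float
    insulated: bool  # has this asset been winterized?

# Hypothetical forecast: minimum temperature (deg C) per one-degree
# grid cell, keyed by (rounded lat, rounded lon).
forecast = {
    (30, -97): -12.0,  # severe freeze
    (32, -96): 2.0,    # mild
}

def at_risk(assets, forecast, freeze_threshold=-5.0):
    """Return IDs of assets that sit in a cell forecast at or below
    the freeze threshold and lack winterization."""
    flagged = []
    for a in assets:
        cell = (round(a.lat), round(a.lon))
        temp = forecast.get(cell)
        if temp is not None and temp <= freeze_threshold and not a.insulated:
            flagged.append(a.asset_id)
    return flagged

assets = [
    Asset("wind-turbine-17", 30.1, -97.2, insulated=False),
    Asset("gas-valve-03", 30.2, -96.9, insulated=True),
    Asset("substation-08", 31.9, -96.1, insulated=False),
]

print(at_risk(assets, forecast))  # flags only wind-turbine-17
```

A production system would of course use proper geospatial indexing and live forecast feeds rather than rounded grid cells, but the shape of the logic is the same: a current, integrated asset register is what makes the join possible at all.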
The fragmentation of power generation means that utilities now face a more distributed, diverse threat landscape. They must respond by creating an equally distributed and many-rooted geospatial digital twin of their networks.
Jay Cadman is SVP Enterprise at IQGeo.