Photo courtesy of Southern California Edison.
Although utilities have digitized their analog records, the legacy grid data is not always consistent and manual processes are still required. Using machine learning to analyze available data can verify its accuracy and simplify the update process.

Tracking Decades-Old Power Poles with Machine Learning

Nov. 1, 2023
Utilities must organize and make sense of a tremendous tranche of legacy power grid data, but AI may offer answers.

Southern California Edison’s service area contains over 1.4 million power poles, some of which are decades old. Monitoring these assets and their conditions has been challenging. Although utilities have digitized their analog records, the legacy grid data is not always consistent and manual processes are still required. Using machine learning to analyze available data can verify its accuracy and simplify the update process.

As society transitions to a clean energy future, customers depend on electric utilities to keep the lights on and enable electric transportation, expand electric technologies, integrate energy from rooftop solar and batteries and more. This requires an increasingly digital automated grid. High-quality data is foundational to the digitization of the energy system, which is a crucial step to get to carbon neutrality. Utilities rely on millions of data points, some of which originate from equipment installed decades ago, to make risk-informed asset management decisions.

SCE’s electric asset data remediation tool leverages data science and machine learning to improve the accuracy and consistency of asset data for a more dynamic grid — drastically reducing the time it takes to validate data corrections from hours to minutes.

Remediation Tools

For decades, electric utilities kept records in paper-based systems. With the evolution of technology, most records are now digital, including data related to the physical location of poles and transformers. While digital systems often make work easier, converting legacy grid asset data has not always been seamless.

A digital approach is still limited by the dependency on a manual process, such as data entry, which can lead to errors. It also includes other manual processes, like cataloguing unstructured data sets (e.g., photos), confirming location accuracy of known assets, correcting recurring differences between field conditions and inventory records and detecting assets that require maintenance or repair.

Outdated or inaccurate data within the digital systems can lead to inefficiencies in grid planning and operations needed to ensure employee and public safety as well as system reliability. Manually managing digital asset data is also costly. In 2021, SCE performed electrical asset mapping corrections 22,000 times in high fire risk areas using field evidence.

Based on those learnings, SCE estimates that it would cost roughly $16 million and 300,000 worker hours to complete a service area-wide review and update of its more than 1.5 million overhead structures (approximately 300,000 of which require updates at an estimate of one hour of work per structure). A substantial part of the cost is associated with the time it takes to assess the accuracy of the location data found in photo evidence captured over multiple field visits.

For example, as part of this project, SCE found varying degrees of location accuracy among its overhead structures: 50% of structures were within 10 meters of the location listed in legacy records, 30% were 10 to 30 meters away and 20% were more than 30 meters away.
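Measuring those accuracy bands comes down to computing the great-circle distance between each asset's recorded and observed coordinates and bucketing the results. The sketch below is illustrative only (the function names and the 10 m / 30 m thresholds mirror the figures cited above; they are not SCE's actual tooling):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two WGS-84 points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def bucket_offsets(offsets_m):
    """Group location errors into the <10 m, 10-30 m, >30 m bands cited above."""
    bands = {"<10 m": 0, "10-30 m": 0, ">30 m": 0}
    for d in offsets_m:
        if d < 10:
            bands["<10 m"] += 1
        elif d <= 30:
            bands["10-30 m"] += 1
        else:
            bands[">30 m"] += 1
    return bands
```

Running `bucket_offsets` over the per-asset distances between legacy and field-verified coordinates reproduces the kind of distribution reported here.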

Inaccurate asset locations can pose downstream safety risks because of increased time searching for an asset in the field or missing inspection and maintenance activities entirely. It can also lead to significant system challenges when inaccurate location data extends repair outages, including wire down, or increases the possibility of ignition from utility equipment in high-fire weather.

To manage digital asset data more efficiently, SCE looked to integrate high-quality data already being collected through other programs, e.g., high-resolution photos being taken during aerial inspection of assets in high fire risk areas.

In addition to the safety benefits, SCE estimates that incorporating the data more efficiently through an automated process would yield roughly $8 million in savings, including approximately 170,000 worker hours, if applied to the utility's entire service area.

The broader business challenge of assessing location data accuracy cannot be addressed without solving the technology challenge of automating the extraction of geolocation insights from high-resolution photos to detect mapping inaccuracies. To find a solution, our internal working groups went out to the industry to survey the pervasiveness of the challenge and to see if other utilities had solutions for mapped location accuracy.

Of those that were surveyed, none had developed novel automated solutions. Considering this research, the technical working group partnered with SCE’s Information Technology team to devise a solution, bringing together various groups with expertise in artificial intelligence (AI), geospatial analytics, image recognition, machine learning (ML), optical character recognition techniques and statistical methods.

The Solution

The cross-functional team successfully developed a user-friendly, innovative in-house application that automates the analysis of millions of high-resolution digital photos to find asset locations with corresponding confidence levels and employ that information to improve location accuracy. The team began with collecting images for wildfire risk and other field inspection purposes previously stored across separate repositories.

The challenge was not only to organize millions of images (petabytes of data) for easy access and fast retrieval, but also to optimize the analysis of unstructured data using geospatial analytics and AI/ML techniques that take advantage of cloud-based computing power. This included fast search and retrieval of millions of images based on different criteria, such as asset identifier, geospatial location and asset condition.
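The search-and-retrieval requirement can be pictured as a metadata index over the photo repository. This is a toy in-memory sketch, not SCE's system; the record fields and class names are assumptions, and a production deployment would sit on a cloud search or geospatial database service:

```python
from dataclasses import dataclass

@dataclass
class PhotoRecord:
    photo_id: str
    asset_id: str              # hypothetical pole/structure identifier
    lat: float                 # geotag from the capture device
    lon: float
    condition_tags: frozenset  # e.g. {"rust", "woodpecker_damage"}

class PhotoIndex:
    """Supports the three search criteria named above: by asset,
    by geospatial bounding box, and by asset condition."""

    def __init__(self):
        self.by_asset = {}   # asset_id -> list of PhotoRecord
        self.records = []

    def add(self, rec):
        self.records.append(rec)
        self.by_asset.setdefault(rec.asset_id, []).append(rec)

    def find_by_asset(self, asset_id):
        return self.by_asset.get(asset_id, [])

    def find_in_box(self, lat_min, lat_max, lon_min, lon_max):
        return [r for r in self.records
                if lat_min <= r.lat <= lat_max and lon_min <= r.lon <= lon_max]

    def find_by_condition(self, tag):
        return [r for r in self.records if tag in r.condition_tags]
```

At petabyte scale the linear scans here would be replaced by indexed queries, but the interface captures the retrieval patterns the article describes.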

The team developed a Data-as-a-Service (DaaS) visualization tool called the GRViewer to improve how subject matter experts from across the company accessed and evaluated images. With a DaaS approach, the team accessed petabytes of unstructured data on different cloud platforms.

DaaS enables visualization, analytics and AI/ML model execution with easy consumption, fast performance and high reliability. Simply put, DaaS has the same effect on big data that monetary currency had on bartering goods and services: it allows for easy data interoperability across different software applications.

With this powerful tool, the team built and trained advanced ML models to recognize key objects or features in photos, enabling a range of innovative functions, such as:
  • Accurately associating photos with their corresponding real-world coordinates to derive a structure’s true GPS location.
  • Effectively analyzing many photos captured at different times, from different devices and programs, using geospatial analysis.
  • Analyzing multiple factors, including the number of photos, their capture times, the statistical significance of the data distribution (mean, standard deviation, outliers) and validation among multiple sources.
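The statistical fusion step in the last bullet can be sketched as combining per-photo geotags into a single estimate with a confidence score. This is a minimal illustration under assumed inputs (a list of EXIF-style latitude/longitude pairs); the outlier rule and confidence formula are hypothetical, not SCE's scoring:

```python
import statistics

def estimate_location(points, outlier_sigma=2.0):
    """Fuse per-photo geotags into one location estimate plus a confidence score.

    points: list of (lat, lon) pairs read from photo metadata (assumed input).
    Returns ((lat, lon), confidence in [0, 1]).
    """
    if len(points) < 2:
        # a single photo gives a location but no statistical support
        return (points[0] if points else None), 0.0
    lats = [p[0] for p in points]
    lons = [p[1] for p in points]
    mlat, mlon = statistics.mean(lats), statistics.mean(lons)
    slat, slon = statistics.stdev(lats), statistics.stdev(lons)
    # drop geotags more than outlier_sigma standard deviations from the mean
    kept = [(la, lo) for la, lo in points
            if (slat == 0 or abs(la - mlat) <= outlier_sigma * slat)
            and (slon == 0 or abs(lo - mlon) <= outlier_sigma * slon)]
    est = (statistics.mean(p[0] for p in kept),
           statistics.mean(p[1] for p in kept))
    # confidence grows with agreement (fraction kept) and sample count
    confidence = (len(kept) / len(points)) * min(1.0, len(kept) / 10)
    return est, confidence
```

A real pipeline would weight photos by capture program and device accuracy, but the shape of the computation (aggregate, reject outliers, score agreement) matches the factors the article lists.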

The data remediation tool substantially reduces the need for manual processes to evaluate millions of photos and supplies reliable data that is as easy for users to curate as any search engine.

Human effort is needed for only a small percentage of assets the tool cannot validate with confidence. Even then, the remediation tool’s visualization feature streamlines the effort. As an example, any asset with a nadir image (the vantage point looking directly down on the top of a structural asset) has a high confidence latitude/longitude coordinate value.

For assets without a clear nadir image, the data remediation tool algorithm looks for other clues in collected photos for the next best available location data, such as a picture with a pole tag. The algorithm analyzes multiple images of a specific asset to corroborate the clues and arrive at an associated confidence factor.
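The tiered evidence logic described here, where a nadir image gives high confidence, a pole-tag match plus corroborating photos gives moderate confidence, and anything weaker is flagged for review, can be summarized as a small decision function. The tiers and thresholds below are illustrative assumptions, not SCE's actual rules:

```python
def location_confidence(has_nadir, n_geotagged_photos, has_pole_tag_match):
    """Assign a tiered confidence to a derived asset location.

    - a nadir (straight-down) image pins the structure -> "high"
    - a matched pole tag corroborated by several geotagged photos -> "medium"
    - otherwise -> "low", routed to manual review
    """
    if has_nadir:
        return "high"
    if has_pole_tag_match and n_geotagged_photos >= 3:
        return "medium"
    return "low"
```

Only the "low" bucket would reach a human reviewer, which is consistent with the article's claim that manual effort is needed for a small percentage of assets.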

A given asset may have hundreds of images collected through various inspections programs over the years. The computer model takes many of these photos and other factors into consideration when generating a projected location with an associated confidence score.

The data remediation tool helps expedite the process and analyzes assets across the entire service area to reconcile inaccuracies. This is a significant breakthrough, where traditional and manual efforts have proven so monumental — taking hundreds of thousands of scarce worker hours over multiple years to canvass the entire service area.

Improved User Experience

The capabilities of the data remediation tool introduce efficiency and improve the human touch points to manage and maintain a modern grid with safety, reliability, resiliency and cost savings in mind. The company’s mapping team can have the added confidence of making asset mapping corrections without having to spend research time validating location information across different datasets and systems.

When more research is necessary, the team can efficiently confirm its findings by accessing this central repository of photographic evidence, enabling a prompt and informed conclusion. On a massive scale, the data remediation tool solves a problem similar to the one today's television viewers face navigating video content across dozens of content providers on a Roku or Apple TV. Just as these digital media players' algorithms sift through user data to extrapolate and suggest shows the user might like, the remediation tool sifts through utility photo data to predict equipment location.

Improved location accuracy within 10 meters of the actual structure reduces the treasure hunt inspectors face with “ghost pole” scenarios where they cannot find the equipment due for inspection in the vicinity of the listed coordinates. Utility personnel no longer need to find a needle in a haystack as they hike through varying and potentially unsafe terrain searching for an asset. Missed or delayed inspections will be significantly reduced.

With emerging outages, servicers dispatched to the vicinity of the outage will have increased confidence that they are at the correct point of failed equipment without having to search or trace a pole line. This translates to reduced outage durations for customers. The remediation tool is also scalable and forward looking because it is built on the DaaS model, which means a retooled and continuously improved data remediation tool can seamlessly access the mountains of data (e.g., millions of high-resolution images) that continue to be collected daily. The tool can also be adapted for future use cases that might find value in this data lake.

Looking Ahead

SCE’s data remediation tool allows the company to confirm and correct large volumes of data at a fraction of the time and cost of manual processes. This frees up valuable resources to perform other high-priority work and, most importantly, enables the utility to use accurate data to mitigate safety and reliability risks faster. Further development of machine learning on image recognition and LiDAR (point cloud) data will expand SCE’s capabilities to inventory assets and detect visual health indicators, such as woodpecker damage to poles and rust on transformers, for a broader set of electrical assets.

As the team continues to expand and test this approach, it has promising asset data management applications for the entire industry. Having excellent data quality improves efficacy in grid planning and operations. Whether adapting to climate change and the perils of more extreme weather or creating the pathway to a more electrified economy, complex data analysis requires a solution like the electric asset data remediation tool.

In April, the Edison Electric Institute (EEI) named SCE one of five U.S. and three international electric companies as finalists for the 2023 Edison Award, which is presented annually to electric companies for their distinguished leadership, innovation and contribution to the advancement of the electric power industry. The data remediation tool was submitted as a case study for the 95th annual industry award.

Our team’s ability to leverage new technologies like machine learning and AI in new ways is putting us in a better position to ensure that the grid is reliable, resilient and ready for a clean energy future.

Noe Bargas, P.E., is a principal manager of Asset Data & Information Strategy at Southern California Edison. He is in his 17th year at the California utility. Bargas graduated from Cal Poly Pomona with a bachelor's degree in civil engineering.
