There have been a number of studies, reports, and surveys lately about making the power grid more resilient, and most have one thing in common: they push for power grid modernization, and they recommend doing it by deploying more digital technology. Our power delivery system is one of the most asset-intensive industries on the planet, and this approach will add even more assets to it. That isn’t bad, but digitalized assets come with more big-data. That isn’t bad either, although it can be an issue.
Unfortunately, today’s power delivery system produces gigantic amounts of big-data daily, and there appears to be no end in sight. Some authorities are saying all of this big-data has moved to a new category. They are calling it by a variety of names like next-gen big-data, bigger-data, industry 4.0 data, advanced big-data, and others. Let’s use the bigger-data term since it seems less confusing than all the others. After all, bigger-data is just super-sized big-data.
Bigger-data is so much larger in size that it needs more sophisticated tools. It can’t be managed efficiently by out-of-date big-data methods. Bigger-data requires next-gen management tools with capabilities beyond those developed for the previous generation of big-data. These systems use techniques like cloud computing, artificial intelligence (AI), machine learning, and advanced data analytics to produce usable intelligence. It’s almost like sorcery, and it makes us forget the solid data science behind it.
Gigabytes, Terabytes, Zettabytes
Is the increase in data really that big? Well, according to EarthWeb, around 79 zettabytes of data were generated in 2021, and they predict 180 zettabytes will be generated in 2025. To put it in a more familiar context, think about buying data storage. The big box stores sell external hard drives with terabyte capacities as off-the-shelf items. If you wanted to store a zettabyte of data, you would need a billion of those terabyte hard drives.
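The arithmetic behind that comparison is straightforward. A few lines of Python (illustrative only, using the decimal SI units that drive makers use) verify it:

```python
# Back-of-the-envelope check: how many 1-TB drives hold a zettabyte?
ZETTABYTE = 10**21  # bytes (decimal SI units, as storage vendors count)
TERABYTE = 10**12   # bytes

drives_per_zb = ZETTABYTE // TERABYTE
print(f"{drives_per_zb:,} one-terabyte drives per zettabyte")  # 1,000,000,000

# EarthWeb's annual figures, expressed the same way
for year, zb in [(2021, 79), (2025, 180)]:
    print(f"{year}: {zb * drives_per_zb:,} terabyte drives")
```

Scaled that way, 2021’s data alone would fill 79 billion terabyte drives.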
That’s bigger-data, and the experts expect it will continue to grow as our connectivity increases, brought about by the IT/OT convergence. The research firm MarketsandMarkets published a report last year stating, “The big-data industry is being driven by sharp increases in data volume.” They project big-data market spending will grow from US$162.6 billion in 2021 to US$273.4 billion in 2026.
Backtracking a bit, IT/OT convergence combines operational technology (OT) and information technology (IT) under one platform or system. It has found a home with asset-intensive businesses like the power delivery industry. IT/OT, with the wizardry of the IIoT (Industrial Internet of Things), merges the physical world with the virtual world through a comprehensive digital model. This digital model is a detailed representation of the enterprise.
The digital model is an essential element for the deployment of some amazing digital applications like digital twins. Digital models kicked asset management systems into their next generation just in time for bigger-data. Modern asset management systems were complicated platforms to begin with, but once performance was added between the words asset and management, nothing has been the same. Manufacturers such as ABB, Bentley, GE Digital, Hitachi Energy, IBM, Siemens Energy, SAP, Schneider Electric, and others have been developing a broad range of asset performance management (APM) platforms, each emphasizing a different mix of capabilities.
Smart Grid Analytics
These APM systems use AI and machine learning to sift through all the historical and real-time data. But it doesn’t stop there: APMs can review previous loading data, maintenance documents, and weather records to develop assessments of an asset’s health. The APM combines all this information to determine the present condition of each asset in the enterprise. In other words, the APM system is tracking the health of an organization’s physical assets, and this is where it gets interesting.
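To make the idea concrete, here is a deliberately simplified sketch, not any vendor’s actual algorithm, of how several condition indicators drawn from those data sources might be rolled into a single health index. All names, weights, and values are hypothetical:

```python
# Toy illustration of an asset health index: a weighted average of
# normalized 0-1 condition indicators (1.0 = perfectly healthy).

def health_index(indicators, weights):
    """Weighted average of condition indicators, normalized by total weight."""
    total_w = sum(weights.values())
    return sum(indicators[k] * w for k, w in weights.items()) / total_w

# Hypothetical weights for the data sources the article lists
weights = {"loading_history": 0.3,
           "maintenance_findings": 0.4,
           "weather_exposure": 0.3}

transformer = {"loading_history": 0.8,      # ran within rating most of its life
               "maintenance_findings": 0.5, # some flagged inspection items
               "weather_exposure": 0.9}     # mild service environment

print(round(health_index(transformer, weights), 2))  # 0.71
```

A real APM platform folds in far more signals and learned models, but the principle, blending many imperfect indicators into one comparable score per asset, is the same.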
Recently, Charging Ahead had a chance to speak with a couple of colleagues: Gary Rackliffe, VP of Market Development and Innovation, and Bart Gaskey, Senior Vice President of Strategic Marketing and Business Development. They are Hitachi Energy’s experts on key emerging innovations in the utility landscape, among them APM and its developments. The discussion brought out some interesting viewpoints about APM systems and how they fit into today’s power grid.
Gaskey began the discussion saying, “For many years, utilities have been deploying sensors and monitors on their systems and have been capturing enormous amounts of data. APM systems take advantage of smart grid analytics to make sense of the data gathered. They also clean the data and organize it so it can be used with existing data and the data coming from all parts of the organization.”
Gaskey continued, “Hitachi Energy’s software experts have developed analytics software that has accelerated the pace of predictive forecasting through predictive modeling that uses online and offline data. The analytics have gone so far as to be prescriptive, proposing corrective actions as opposed to merely identifying changes from normal conditions. To be able to understand what’s going on with the network, you have to be able to understand the data, and it has to be consistent throughout the company.”
At this point, Rackliffe explained, “Unintentionally, organizations have constructed digital data silos throughout their companies because no one considered that data might be exchanged across the enterprise the way it is today. Too often, departments acted on their own when it came to data and how it was used in their sector. Every system was designed specifically with its own needs in mind, but APM systems are not limited that way. They are designed with the entire organization in mind. They have access to both historical data and real-time data coming from assets in the field. Data is even coming from video and photographs taken during drone and helicopter flybys.”
Rackliffe continued, “These APM systems are being taught to identify the forces that impact the equipment’s health and its condition. Using supervised AI learning, the APM systems synchronize the data exchanged from all the data available to the enterprise. This is possible by taking advantage of AI’s machine learning capabilities. For known problems, the algorithm is trained to identify the signature of the problem; then it can quickly find that signature whenever it appears in the big data it is sifting through. Unsupervised AI learning enables the algorithm to recognize data anomalies in asset performance data. It’s looking for something different that stands out from normal or healthy conditions. Detecting a data anomaly can indicate that there is a problem, but you don’t know what the cause is or how performance will be impacted.”
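The two learning modes Rackliffe describes can be sketched with plain statistics in place of a production machine-learning stack. This is a toy illustration only; the signature, readings, and thresholds are all hypothetical:

```python
# Minimal sketch of supervised signature matching vs. unsupervised
# anomaly detection on a stream of sensor readings.

from statistics import mean, stdev

# "Supervised": a known failure signature a trained model has learned,
# here just a hypothetical three-point trend in some gas concentration.
KNOWN_SIGNATURE = [0.9, 1.4, 2.1]

def matches_signature(window, signature, tol=0.2):
    """True if a window of readings tracks a known failure signature."""
    return len(window) == len(signature) and all(
        abs(w - s) <= tol for w, s in zip(window, signature)
    )

# "Unsupervised": flag anything that stands out from normal behavior,
# using a simple z-score test against the series' own mean and spread.
def anomalies(readings, threshold=2.0):
    """Indices of readings more than `threshold` std-devs from the mean."""
    mu, sigma = mean(readings), stdev(readings)
    return [i for i, r in enumerate(readings)
            if sigma and abs(r - mu) / sigma > threshold]

history = [1.0, 1.1, 0.9, 1.0, 1.2, 0.95, 1.05, 6.0]  # last point is odd
print(anomalies(history))  # [7] -- flagged, but the cause is unknown
print(matches_signature([0.95, 1.35, 2.0], KNOWN_SIGNATURE))  # True
```

The supervised path names the problem it finds; the unsupervised path, exactly as Rackliffe notes, can only say something is abnormal, leaving cause and impact to be investigated.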
Gaskey pointed out, “The transmission grid is being pushed harder every day, and utilities need every advantage they can find. These APM systems are what we call ‘shovel ready’ (i.e., they are available technology, not prototypes). They are being used by utilities to look at the health of their assets and the operational status of those assets, and they are proving valuable.”
Gaskey explained, “Consider rolling a truck to a remote facility when an alarm comes in. First the operator checks the real-time data to determine what is going on. Is it an intruder, an animal, an equipment failure, etc.? If it’s determined to be an equipment failure, the trouble crew knows what material they need to take with them, and it’s put on the truck. Before anyone hits the road, personnel know exactly what they are going to find. They know what is needed to correct the problem – no wasted windshield time.”
Rackliffe and Gaskey summarized, “Section 40107 of the historic U.S. Infrastructure Investment and Jobs Act (IIJA) encourages innovations through a variety of technologies. IIJA is making grant funding available for utilities to modernize energy systems. This program offers the utility a chance to apply some of these technologies, like APM systems, with minimum risk. It’s like a kick-starter for modernization. One of the major categories in Section 40107 is data analytics enabling grid functions, which means the utility would be able to apply some powerful technologies to solve specific problems they are dealing with in the operation of their system.”
The art and science of managing assets has come a long way with innovative APM systems. They are getting a reputation for improving the bottom line of the companies using them. According to GE, “APM is a proven approach to reducing unplanned downtime, decreasing maintenance costs, and reducing EH&S (environmental, health, and safety) risks.”
APM platforms have moved beyond older asset management systems that simply gathered and filtered data. These cutting-edge systems are real-time, end-to-end platforms applying predictive and prescriptive analytics across the entire enterprise database. They are capable of identifying problems, calculating the risks, and recommending the least-risk solution for fixing or replacing the asset.
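That risk-weighing step can be illustrated with a deliberately simple expected-loss calculation. This is not any vendor’s actual method; the assets, probabilities, costs, and action names are all hypothetical:

```python
# Illustrative sketch: rank assets by expected loss (probability of
# failure times consequence cost) and pick the least-total-risk action.

assets = [
    # hypothetical fleet: name, failure probability, consequence cost ($k)
    {"name": "XFMR-12", "p_fail": 0.35, "consequence": 1200},
    {"name": "BRKR-07", "p_fail": 0.40, "consequence": 120},
    {"name": "XFMR-03", "p_fail": 0.02, "consequence": 1500},
]

# Residual failure probability multiplier and cost ($k) of each action
ACTIONS = {"run-to-failure": 1.00, "refurbish": 0.30, "replace": 0.05}
COSTS = {"run-to-failure": 0, "refurbish": 150, "replace": 600}

def risk(asset):
    """Expected loss if nothing is done."""
    return asset["p_fail"] * asset["consequence"]

def best_action(asset):
    """Action minimizing residual expected loss plus intervention cost."""
    def total(action):
        residual = asset["p_fail"] * ACTIONS[action] * asset["consequence"]
        return residual + COSTS[action]
    return min(ACTIONS, key=total)

# Highest-risk assets print first, each with its recommended action
for a in sorted(assets, key=risk, reverse=True):
    print(a["name"], round(risk(a), 1), "->", best_action(a))
```

Even this toy version captures the prescriptive idea: it doesn’t just flag the riskiest asset, it compares the cost of each intervention against the risk it removes before recommending one.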
It’s interesting that some utilities are taking a wait-and-see attitude, while others are committing the enterprise to APM. Both approaches are risky and uncomfortable. What’s in your future?