
Smart Grid’s Big Data And Granularity

Jan. 11, 2024
As the digitalization of the power grid grows, data optimization gets more complicated.

As 2023 ended, it was time to reflect. In the technological world, the staggering amount of big-data being produced by digital technologies made headlines. Analysts estimated that worldwide production of new data exceeded 120 zettabytes (1 zettabyte equals a trillion gigabytes) in 2023, and they project that more than 181 zettabytes will be created in 2025. Looking back over the last 15 years, annual data production has climbed steadily from about 2 zettabytes to that 2025 estimate.

Each year’s figure is astonishing, but the cumulative amount of data generated is overwhelming, and it leaves one big question: How can businesses manage all of the zettabytes already in storage plus those that continue to be produced? The simple answer is big-data analytics. Fortune Business Insights reports, “The data market was valued at USD 271.83 billion in 2022. It’s projected to grow from USD 307.52 billion in 2023 to USD 745.15 billion by 2030.”

Big-data analytic technologies are built around advanced software and artificial intelligence (AI) combined with various forms of data manipulation developed to examine these databases. They organize the raw data and analyze it, looking for insights hidden in patterns, trends, correlations, and other relationships. The goal is to combine organized historical data with present data so that better decisions can be made without guessing.

Data Impasse

The use of these sophisticated analytic tools for data management has improved the reliability and resilience of the smart grid technologies embedded in the power delivery system, but there are issues. Big-data needs to be compatible and warehoused so that it is available throughout the enterprise. Energy management systems need information from across that enterprise: data from the advanced metering infrastructure, from intelligent electronic devices (IEDs) in the field, or perhaps from an integrated asset management system.

How are these big-data problems addressed? By using granularity rather than methods that generalize or summarize the data. Granularity, along with advanced technologies like cloud-based computing and AI, is taming the explosive growth of big-data. But what is granularity? Think of data granularity as a way of selecting the degree of detail present in the data, tailoring it into convenient intervals like weeks, days, hours, minutes, or seconds.
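To make the idea concrete, here is a minimal sketch of the same raw feed summarized at three granularities. The meter readings are simulated and the column names are purely illustrative, not tied to any utility system.

```python
import numpy as np
import pandas as pd

# One day of simulated 1-minute feeder load readings (values are illustrative).
idx = pd.date_range("2023-06-01", periods=24 * 60, freq="min")
raw = pd.DataFrame(
    {"kw": np.random.default_rng(0).normal(500, 25, len(idx))}, index=idx
)

# The same data at three levels of detail: full resolution, hourly, and daily.
per_minute = raw                        # 1,440 rows of full detail
per_hour = raw.resample("1h").mean()    # 24 rows for operational trending
per_day = raw.resample("1D").mean()     # 1 row for long-range planning

print(len(per_minute), len(per_hour), len(per_day))  # 1440 24 1
```

Choosing the coarsest granularity that still answers the question at hand is what keeps the zettabytes manageable.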

Sometimes Smaller is Better

Another issue with data is the fact that it’s mostly unstructured. Unfortunately, unstructured data doesn’t fit into pre-defined data models. Data produced by the IEDs, sensors, and automated systems so common in the grid’s intelligent technologies is unstructured, and the formats are not always compatible with each other. In today’s digital world, industries can’t afford to be unable to work with all types of data.
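As a simple illustration of what making incompatible records usable can involve, the sketch below maps two hypothetical raw formats into one common shape. The device names and fields are invented for the example and do not represent any particular vendor's IED output.

```python
import json
from datetime import datetime, timezone

def normalize(record: str) -> dict:
    """Map two illustrative raw formats to a common {timestamp, device, value} shape."""
    if record.startswith("{"):  # JSON-style relay event
        raw = json.loads(record)
        return {"timestamp": raw["ts"], "device": raw["id"], "value": raw["mag"]}
    # CSV-style sensor line: device,epoch_seconds,reading
    device, epoch, reading = record.split(",")
    ts = datetime.fromtimestamp(int(epoch), tz=timezone.utc).isoformat()
    return {"timestamp": ts, "device": device, "value": float(reading)}

feed = [
    '{"ts": "2023-06-01T00:00:05+00:00", "id": "relay-12", "mag": 1.02}',
    "sensor-7,1685577610,0.97",
]
print([normalize(r) for r in feed])
```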

This need to work with every kind of data helps explain why these markets are evolving to be more granular, which is driving the demand for enhanced analysis tools. The ability to process all data, in any form, from any part of the enterprise is seen as an important driver in many industries, and the approach is finding acceptance within the power delivery system too. As grid modernization advances, utilities need to access all of their data, data-sets, and databases no matter the format or grouping.

Understanding the Data

Energy demands change rapidly, and the network must respond quickly. Seeing the need for more granular market data and the enhanced analysis tools to handle it, Hitachi Energy and Google Cloud have signed an agreement to co-create multiple sophisticated software products. This seemed a good point to talk with the experts working with this cutting-edge technology and tap into their understanding. “Charging Ahead” contacted Michael Hinton, Hitachi Energy’s Global Head of Energy Portfolio Management, and Bret Toplyn, Hitachi Energy’s Director of Product Management, to discuss how leveraging granular data will improve the energy transition.

Hinton started off the discussion saying, “Power is a unique commodity in that it must be used as quickly as it is generated. Supply and demand have to be kept in balance, but when supply and demand are out of phase it produces significant price changes. It’s a volatile market with huge price swings even though it has moved to sub-hourly time intervals. The time granularity has gotten very small, down to 5-minute increments, and it has gotten more complicated than using only time.”
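Hinton's point about sub-hourly granularity can be illustrated with a short sketch. The prices below are simulated, not actual market data; the example simply shows how a brief price spike that dominates a 5-minute view is diluted once the same day is averaged into hourly values.

```python
import numpy as np
import pandas as pd

# One day of simulated 5-minute prices with a short scarcity spike.
idx = pd.date_range("2023-07-15", periods=288, freq="5min")
rng = np.random.default_rng(1)
prices = pd.Series(35 + rng.normal(0, 5, len(idx)), index=idx)
prices.iloc[200:204] = 900.0  # 20 minutes of extreme prices

hourly = prices.resample("1h").mean()
print(f"5-minute max: ${prices.max():.0f}/MWh")  # captures the spike
print(f"hourly max:   ${hourly.max():.0f}/MWh")  # spike diluted by averaging
```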

He continued, “Sub-hourly pricing wasn’t enough. They also needed the costs associated with the transmission and distribution networks used to move the electricity from where it is produced to where it will be consumed. It was a case of the more information the marketplace had, the more they found they needed. Hitachi Energy and Google Cloud joined together to collaborate on a tool set designed to mine that information.”

Hinton explained, “Hitachi Energy’s Velocity Suite has about 23 years of geospatial and asset data covering substation nodes and the networks between those nodes in North America. It also has pricing components associated with these data-sets, and when it’s combined, it’s an enormous amount of data. Google’s cloud-based technologies add the ability to manage large amounts of data quickly and efficiently. Google Cloud’s advanced data analytics and generative AI provide a fast, comprehensive analysis of the data. Its query functions are designed to quickly mine data and bring the information back fast for the customer to work with.”
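For readers curious what mining such a dataset might look like in practice, here is a hedged sketch using Google Cloud’s BigQuery Python client. The project, dataset, table, and column names are placeholders invented for illustration, not the actual Velocity Suite schema, and running it assumes Google Cloud credentials are already configured.

```python
from google.cloud import bigquery

client = bigquery.Client()  # assumes default Google Cloud credentials

sql = """
    SELECT node_id, AVG(price_usd_mwh) AS avg_price
    FROM `example_project.power_prices.nodal_5min`  -- hypothetical table
    WHERE settlement_time BETWEEN '2023-07-01' AND '2023-07-31'
    GROUP BY node_id
    ORDER BY avg_price DESC
    LIMIT 20
"""

# The cloud service does the heavy lifting; only the ranked summary comes back.
for row in client.query(sql).result():
    print(row.node_id, round(row.avg_price, 2))
```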

Toplyn pointed out, “Granularity is exponentially impacting the problem because it can reference pricing, time, and location databases for a more complete view of the energy information. The pricing component consists of three segments: the cost of energy, a congestion factor, and the actual power losses associated with the pathway. This data not only presents pricing information but also points out problem areas affecting prices, and it can be utilized for planning and operating purposes. When historical 5-minute power increments are used with geospatial data like substation nodes, the points of congestion become readily apparent.”

Toplyn continued, “They can be avoided or corrected. It’s an efficient way to identify zones for investment or areas where something unseen may have happened, such as a downed power line. This approach can recognize larger issues too, such as too much renewable generation in regions with insufficient transmission. It really increases the understanding of where more investment is necessary or how operating procedures need to be revised.”
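The kind of congestion screening Toplyn describes can be sketched in a few lines. The decomposition below follows the three segments he lists (energy, congestion, and losses), but the nodes, prices, and the constrained location are simulated for illustration only.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
nodes = ["SUB_A", "SUB_B", "SUB_C"]
idx = pd.date_range("2023-07-15", periods=288, freq="5min")  # one day, 5-minute steps

records = []
for node in nodes:
    energy = 35 + rng.normal(0, 3, len(idx))
    # SUB_B carries a persistent (simulated) congestion adder.
    congestion = rng.normal(0, 1, len(idx)) + (15 if node == "SUB_B" else 0)
    losses = rng.normal(1, 0.2, len(idx))
    records.append(pd.DataFrame({
        "node": node,
        "time": idx,
        "price": energy + congestion + losses,  # energy + congestion + losses
        "congestion": congestion,
    }))
prices = pd.concat(records)

# Ranking nodes by average congestion cost makes the constrained node stand out.
print(prices.groupby("node")["congestion"].mean().sort_values(ascending=False))
```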

Hinton explained, “High prices may be caused by congestion, or low prices may be an indication of too much generation. There may even be extenuating circumstances affecting the system that haven’t been evident. Velocity Suite provides more detail about what is actually taking place in a specific service territory. A large geographical area can be broken down into nodes for better visualization of its operation. Critical information is available about what is happening on the system and how to react to it.”
Hinton continued, “Velocity Suite Power Prices in the Google Cloud can extract data, giving the customer a faster understanding of their assets from a planning and operational point of view. It’s the first cloud-based data management tool available from the Hitachi Energy and Google Cloud partnership, but it will not be the last. There is a growing need in the marketplace for software solutions like this in the power delivery industry.”

Better Decisions with Better Technology

As we have discussed, digital technologies are producing a great many complex big-data data-sets each year. Managing that data efficiently is one of the most critical tasks facing industry. Utilities have more data available than ever before, and managing it effectively is necessary to get the full value out of it: the insights that help make informed decisions.

Modern data management is easier, quicker, and more cost-effective with big-data analytics, cloud-based computing, and advanced software. Still, we need to be more adept at collecting, enhancing, and applying data to make better decisions, but that isn’t meant to be negative. We have come a long way in a short time when it comes to data administration. A few years ago, companies could never have managed the enormous amounts of data now in play within the enterprise.

Next year we’ll probably be looking back at how antiquated today’s state-of-the-art big-data analytics seem compared to what will then be commonplace. The pace of technological advancement has to be mind-blowing when we see how fast we are producing zettabytes of big-data each year.

Imagine the quantum leaps that will come when digital twin modeling technology is combined with the digital expertise discussed above and integrated into automated energy management systems. What if it is included in advanced asset management platforms? Think of the infrastructure awareness that would produce, and it would be in real time: streamlined operations, reduced reliability risks, and lower required maintenance. It sounds a little like science fiction, but digital technology never advances sequentially, so there are no breadcrumbs to follow. Advancement is always exponential. That’s why it’s exciting!
  

 

 

About the Author

Gene Wolf

Gene Wolf has been designing and building substations and other high-technology facilities for over 32 years. He received his BSEE from Wichita State University and his MSEE from New Mexico State University. He is a registered professional engineer in the states of California and New Mexico. He started his career as a substation engineer for Kansas Gas and Electric, recently retired as the Principal Engineer of Stations for Public Service Company of New Mexico, and founded Lone Wolf Engineering, LLC, an engineering consulting company.

Gene is widely recognized as a technical leader in the electric power industry. Gene is a fellow of the IEEE. He is the former Chairman of the IEEE PES T&D Committee. He has held the position of the Chairman of the HVDC & FACTS Subcommittee and membership in many T&D working groups. Gene is also active in renewable energy. He sponsored the formation of the “Integration of Renewable Energy into the Transmission & Distribution Grids” subcommittee and the “Intelligent Grid Transmission and Distribution” subcommittee within the Transmission and Distribution committee.
