Under the U.S. Government’s FAST Act of 2015, the U.S. Secretary of Energy, Rick Perry, has the authority to issue orders for emergency measures to protect or restore the reliability of critical electric infrastructure. Drawing on these powers, the Secretary and his team have decided that the U.S. government must better assess critical energy interdependencies.
The U.S. Department of Energy is now focused on developing a suite of reliability and resilience measures. Working with national government stakeholders in the U.S., Canada, and Mexico, the Secretary decided to develop a North American modeling capability, with an emphasis on monitoring the long-term resiliency of the electrical grid. One key aim is to identify infrastructure investment opportunities that will improve resiliency and mitigate risks associated with energy systems interdependencies.
Conventional software tools for the operation and planning of the electric power system were developed partly in response to a major blackout that occurred in the United States in the 1960s. The goal was to take economic advantage of resource diversity and to optimize the configuration of generator assets to deliver electricity to consumers at the lowest cost. The key players at that time assumed that catastrophic events, whose probability was naturally low (“tails of the curve”), were not significant, and that correlation between events was limited. These early tools were not designed to look across energy infrastructures.
However, with emerging threat characteristics (well-organized and carefully planned attacks) and increased energy interdependencies (such as natural gas-electricity linkages), standard contingency approaches (N-1, N-2) may no longer be sufficient to achieve reliability.
The critical next step at the DOE is further refinement of these capabilities in collaboration with industry, with the work focused initially on the bulk power system. A new class of faster software tools is needed to fully assess emerging risks to reliability (including N-1-1 contingency analysis and energy interdependencies) and to develop robust reliability and resilience measures across the energy sector.
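To make the contingency terminology concrete: an N-1 analysis checks that the system survives the loss of any single element, N-2 the loss of any pair, and so on. The sketch below is a deliberately simplified illustration, not a production tool: it screens a small hypothetical network (the line list and bus topology are invented for this example) for outage combinations that island part of the grid, using graph connectivity as a stand-in for a full power-flow feasibility check.

```python
from itertools import combinations

# Hypothetical 5-bus test network: each line connects two buses.
LINES = [(1, 2), (1, 3), (2, 3), (2, 4), (3, 4), (4, 5), (3, 5)]
BUSES = {b for line in LINES for b in line}

def is_connected(lines, buses):
    """Depth-first connectivity check on the surviving network."""
    adj = {b: set() for b in buses}
    for a, b in lines:
        adj[a].add(b)
        adj[b].add(a)
    seen, stack = set(), [next(iter(buses))]
    while stack:
        node = stack.pop()
        if node not in seen:
            seen.add(node)
            stack.extend(adj[node] - seen)
    return seen == buses

def screen_contingencies(k):
    """Return every set of k simultaneous line outages that islands the grid."""
    violations = []
    for outage in combinations(LINES, k):
        surviving = [l for l in LINES if l not in outage]
        if not is_connected(surviving, BUSES):
            violations.append(outage)
    return violations

print("N-1 islanding outages:", screen_contingencies(1))
print("N-2 islanding outages:", screen_contingencies(2))
```

In this toy network no single outage islands a bus, but two N-2 pairs do, which is exactly the kind of gap that motivates moving beyond N-1 screening. The combinatorial growth of the outage sets is also why the faster tools discussed above are needed at realistic network scale.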
This type of technology advancement is being enabled by the availability of faster computation, new and efficient algorithms, and scalable data analytics and modeling approaches, leveraging the DOE’s investments in early-stage university research and its network of U.S. National Laboratories (including Lawrence Livermore, Los Alamos, and Oak Ridge).
Here are the three key areas of focus:
1. Megawatt-Scale Grid Storage
OE will pursue the advancement of megawatt-scale storage capable of providing reactive and real power control for bulk and distribution power systems. The U.S. Department of Energy is moving forward with a program to investigate and integrate the latest technologies, taking a strategic approach to rapidly advancing megawatt-scale storage that adds resiliency and control capabilities.
2. Revolutionize Sensing Technology Utilization
Measuring and monitoring vital parameters throughout the electric power network is necessary to assess the health of the grid in real time, predict its behavior, and respond to events effectively. Lack of visibility and accurate device-level or facility-level information makes it difficult to operate the electricity system efficiently and has contributed to large-scale power disruptions and outages. Additionally, next-generation sensors will improve energy management systems, asset management, and cybersecurity.
This technical area focuses on tools and strategies to determine the type, number, and placement of sensors to improve system visibility from individual devices to feeders, distribution systems, and regional transmission networks. This effort includes advanced methods to determine system states not directly accessible by measurement, and estimation methods for broad grid visibility. Some of the most innovative startups are developing machine-learning and correlation-modeling tools in support of the electric sector.
3. Operational Strategy for Cyber and Physical Threats
Is it possible to develop a near-term, actionable operational strategy to mitigate and eliminate potential cyber and physical threats? Any serious effort here must include the development of physical threat mitigation strategies through technology, design modifications, and operational considerations.
For the electric industry, the value proposition for big data analytics is just now emerging. Two questions are important to ask here: What is the problem being addressed? And what is the scale and urgency? Utilities and other market operators are facing an “explosion” of data coming from a variety of sources: field measurements (smart meters, synchrophasors, smart sensors), weather measurements (ground stations, radar, satellite, specialized systems such as the National Lightning Detection Network), asset monitoring (embedded sensors for condition-based monitoring), and other important sources for outage management (animal migration, vegetation, fire detection, water and gas management). Such data contains invaluable information to improve operations, planning, asset and outage management. But, of course, manual processing is not an option, due to time delays and sheer scale.
To process big data (measured in terabytes, petabytes, and exabytes), the electric industry needs data analytics that extract the knowledge required to make actionable decisions. New techniques are needed for data ingestion, cleansing, and curation, as well as for data management, security, and privacy. The industry has traditionally relied on data analytics to extract information, but additional analytics are needed to extract actionable knowledge. Doing so at increased data scale and under uncertainty, with visualization requirements reaching beyond what is traditionally encountered in data processing, presents many challenges.
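To illustrate what the ingestion-and-cleansing stage involves, here is a small sketch under assumed conditions: a hypothetical synchrophasor-style feed containing a duplicate record, a dropout, and an implausible spike. The sensor names, plausibility range, and median-imputation rule are all illustrative choices, not an industry standard pipeline.

```python
from dataclasses import dataclass
from statistics import median

@dataclass(frozen=True)
class Reading:
    sensor_id: str
    timestamp: int    # epoch seconds
    value: float      # e.g. frequency in Hz; None models a dropout

# Hypothetical raw feed: a duplicate, a dropout, and an out-of-range spike.
raw = [
    Reading("pmu-1", 100, 60.01),
    Reading("pmu-1", 100, 60.01),   # exact duplicate
    Reading("pmu-1", 101, None),    # dropout
    Reading("pmu-1", 102, 59.99),
    Reading("pmu-1", 103, 412.0),   # implausible spike
    Reading("pmu-1", 104, 60.00),
]

def cleanse(readings, lo=59.0, hi=61.0):
    """Dedupe, reject out-of-range values, and impute gaps with the median."""
    deduped = sorted(set(readings), key=lambda r: r.timestamp)
    good = [r.value for r in deduped
            if r.value is not None and lo <= r.value <= hi]
    fill = median(good)
    return [r if (r.value is not None and lo <= r.value <= hi)
            else Reading(r.sensor_id, r.timestamp, fill)
            for r in deduped]

for r in cleanse(raw):
    print(r)
```

A real deployment would do this in a streaming framework and at far greater scale, but the point stands: cleansing and curation are a prerequisite before any downstream analytics can produce trustworthy, actionable knowledge.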
The major factors driving change here are still in flux. The actual use of big data in the utility industry is in its infancy. The data analytics for information extraction that the industry has used for many years are often not suitable for large-scale, diverse datasets that must be integrated and then processed quickly and efficiently, especially to allow timely decision-making.
The key drivers of change are several perceived opportunities:
- The benefits of Big Data analytics in operations, asset and outage management, and planning are evident due to new data sources not typically available in the past.
- The ability to handle old and new problems more efficiently by using data-based models in addition to physical models.
- The ability to offer much better visibility into power grid operating conditions, based on the improved fidelity and variety of Big Data.
- New cloud-based business services associated with data storage, processing, access management and utility applications.
There is a growing list of questions that will need to be addressed by any serious project in this domain:
- What is the state of the art in big data analytics products and R&D developments suitable for power industry applications?
- What are the barriers to faster development and deployment of the solutions that utilize Big Data analytics?
- What are the priorities in terms of the utility applications that can benefit from deployments of Big Data analytics in data-intensive domains?
- What are the financial incentives to invest in Big Data analytics, and what are the expected returns on investment?