National Grid customer care

Effective Communication

Nov. 4, 2013
During major weather events, National Grid takes customer care to a new level.

Since the beginning of the utility business, power systems have provided customers with light and power. These systems are built and designed to be safe, reliable and reasonably priced. But, like most things, they also need to evolve with the times. That presents considerable challenges for the industry.

For example, how does the industry ensure equipment can withstand severe storms and weather patterns? And how do utilities ensure customers are being communicated with effectively and continuously? In this age of 24-hour news, smartphones and instant access to information everywhere, the bar is set quite high.

It is not easy, and it is an ongoing journey. But this is where National Grid stands today in keeping all stakeholders informed.

Getting It Quick, Getting It Right

It is no mystery that when power is off, customers want it back as soon as possible. They also want to be kept informed as service is restored. Their expectations for frequent, detailed information and for reliability have never been higher. And, in an instant-gratification society, customers cannot and will not wait for the innovations of the future to come to market.

One piece of this puzzle is to evolve from the power grid that Edison, Tesla and Westinghouse conceived to smart, resilient networks with robust communications capabilities. The problem is that most existing networks still deliver power one way and do not support constant communication.

Historically, restoration efforts have used whatever technology was available at the time. Paper-based systems were the norm until the 1990s. During the past few decades, utilities have invested heavily in outage management systems with computerized connected network models. Using advanced metering infrastructure, customer calls and system topology, utilities have more information available to speed restoration.
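
To make that idea concrete, the sketch below shows, in simplified Python, the kind of inference an outage management system performs: tracing customer trouble calls up a radial network model to the most likely open protective device. The topology and device names are illustrative only, not National Grid's actual system.

```python
# Minimal sketch of the outage-inference idea behind an outage management
# system (OMS): trace trouble calls up a radial network model to find the
# most likely open protective device. Device names are illustrative only.

# Upstream ("fed from") relationships in a small radial model.
UPSTREAM = {
    "meter_101": "xfmr_A", "meter_102": "xfmr_A",
    "meter_201": "xfmr_B", "meter_202": "xfmr_B",
    "xfmr_A": "fuse_1", "xfmr_B": "fuse_1",
    "fuse_1": "feeder_12F3", "feeder_12F3": "substation_12",
}

def path_to_source(node: str) -> list:
    """Return the chain of devices from a node up to the source."""
    path = []
    while node in UPSTREAM:
        node = UPSTREAM[node]
        path.append(node)
    return path

def probable_open_device(calling_meters: list) -> str:
    """Pick the lowest upstream device shared by every trouble call."""
    paths = [path_to_source(m) for m in calling_meters]
    shared = set(paths[0]).intersection(*paths[1:])
    # The first (closest) shared device on any path is the best candidate.
    return next(d for d in paths[0] if d in shared)

# Calls from meters on two different transformers point at the shared fuse.
print(probable_open_device(["meter_101", "meter_201"]))  # -> fuse_1
```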

While these systems have brought many benefits, they are still tailored mostly to the internal needs of managing an event response. In addition, most organizations have invested in systems that manage other aspects of their business, for example, work management, GPS-enabled vehicle location and weather monitoring. However, these systems are rarely integrated because of cost, timing and other factors. As a result, providing information efficiently can be a real challenge. The challenge becomes even bigger in large events and emergencies, when an entire organization shifts from day-to-day work to storm assignments.

Data quality is another important factor. While most organizations have useful data, it may not always be accurate or complete. The key here is culture. Data itself must be seen as an asset, a key to driving the operational and asset decisions made day to day and in times of crisis.

Over the last few years, “big data” and “data fusion” have been at the forefront of the technical landscape. While these terms may seem like the flavor of the month, the concept is critical. They focus on bringing different data streams together, like pieces of a puzzle, to build the bigger picture. In the past, information was compartmentalized: operations, engineering and so on. In the real world, all of these areas are related. Therefore, National Grid started by identifying opportunities to better share information internally using existing technology. National Grid quickly realized that would not be enough; making real change would require a new approach.

Areas impacted by outages are shown using a color ramp from green to red. Information such as estimated time of restoration is a click away.

New Tools, New Response

The bottom line is that to meet customers’ expectations, the utility also needs to provide its stakeholders with the best available information when, where and how they need it. And because this information comes from so many sources, National Grid needed a tool to provide that data in a consolidated way. The solution, a single source of key information, needed to reduce the time spent chasing data down, leave more time for better-informed decisions and support consistent communications to customers.

The events that led National Grid to develop the tools that would change communications with customers all started in June 2011. Several closely timed events resulted in more than 3 million customer interruption minutes. First, a series of tornadoes touched down in western Massachusetts, followed by Hurricane Irene, Tropical Depression Lee and a Halloween Nor’easter (referred to as “Snowtober”). These events descended on beleaguered customers within a span of 60 days, causing widespread damage.

Restoration was difficult, but these events also led to storm-response innovations. In the lead-up to Hurricane Irene, people on the newly formed Asset Data Analytics team began to review both in-house and external data. They realized that fusing the data together would make it easier to manage the response and to provide better, more complete information to customers during future storm events.

Outages are shown with color coding based on the impact on the circuit.

The team leveraged geographic technology that was already available to bring the data together as quickly and cost-effectively as possible. Data was quickly gathered, including external radar imagery with the hurricane track and basic data from the outage management system. This data was overlaid on a map to provide information about customer outages by town and by feeder.
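
As an illustration of that overlay step, the following hedged sketch uses the open-source geopandas library, standing in for the Esri tools actually used, to join outage points to town boundaries and total the customers interrupted in each town, the figure behind a green-to-red impact map. The file names and the customers_out field are assumptions.

```python
# Hedged sketch: geopandas stands in here for the Esri tools actually used.
# Join outage points from the outage management system to town boundaries
# and total the customers interrupted per town; file names are assumptions.
import geopandas as gpd

towns = gpd.read_file("towns.shp")                # town polygons with a town_name field
outages = gpd.read_file("oms_outages.geojson")    # outage points with a customers_out field

# Tag each outage with the town it falls in, then aggregate the impact.
joined = gpd.sjoin(outages, towns[["town_name", "geometry"]], predicate="within")
by_town = joined.groupby("town_name")["customers_out"].sum().reset_index()

# Merge totals back onto the polygons for green-to-red choropleth symbology.
town_impacts = towns.merge(by_town, on="town_name", how="left").fillna({"customers_out": 0})
print(town_impacts[["town_name", "customers_out"]].sort_values("customers_out", ascending=False).head())
```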

The tool was then given to the Emergency Operations Center and a handful of municipal liaisons. Feedback was overwhelmingly positive, and it led to a preview of the tool sets to National Grid’s board of directors and further development funding.

Things did not stop there. The team members continued to work on additional functionality. By the time Snowtober arrived, they were able to add internal and external crew locations, emergency restoration times, schools, hospitals, gas stations, company locations and other key information points. The reach of the tool grew to executive leadership and more municipal liaisons, and the feedback was extraordinary. No one could believe what was available, and they were hungry for more. Thoughts, concepts and ideas were coming into the team at a record pace. Parts of the utility once thought of as separate were now collaborating seamlessly and sharing information.

National Grid personnel use the blended view of information to understand and make decisions on related information such as areas impacted by outages, crew locations and weather (radar).

Technology Additions

Fortunately, National Grid customers were given a break from major storms for more than a year. In that time, the team worked to develop the platform even further, leveraging what had already been developed so even more information could flow through the system. Working with partners like ESRI, Amazon and the internal information systems group, the team developed a system that uses state-of-the-art technology in a cloud environment. This brings scalability and redundancy at a much lower cost than had been possible in the past. The team also incorporated technology from ESRI that enables data to be used quickly across multiple hardware platforms, without the need for extensive specialized programming. The tool is available on PCs, tablets and phones, for iOS, Android and Windows operating systems. The result was rapid development at very low cost.
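
For illustration only, the snippet below shows the publish-once, consume-anywhere pattern using Esri's current ArcGIS API for Python, which postdates the system described here. The portal URL, credentials, item properties and zipped shapefile are placeholders, not National Grid's actual configuration.

```python
# Illustrative only: Esri's ArcGIS API for Python postdates the system
# described here, but it shows the publish-once, consume-anywhere pattern.
# Portal URL, credentials, tags and the zipped shapefile are placeholders.
from arcgis.gis import GIS

gis = GIS("https://www.arcgis.com", "portal_user", "portal_password")

# Add the fused outage summary as a portal item, then publish it as a hosted
# feature layer that web maps on PCs, tablets and phones can all consume.
item = gis.content.add(
    {"title": "Storm outage summary", "type": "Shapefile", "tags": "outage,storm"},
    data="town_impacts.zip",
)
layer_item = item.publish()      # creates a hosted feature layer
layer_item.share(org=True)       # visible across the organization
print(layer_item.url)
```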

When Superstorm Sandy arrived in late October 2012, the team was ready to roll with what was now called the Situational Analysis Tool, which provided fully fused internal and external information (a minimal layer-catalog sketch follows the list below):

  • Outage restoration times
  • Customers impacted
  • National Grid and external crew GPS locations
  • Community liaison locations and contact information
  • Company and logistics facility locations
  • Weather data including wind speeds and flood zones
  • Gas assets
  • Critical customer locations (hospitals and schools).
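
One way to picture that fusion, purely as a hedged sketch, is a catalog that loads each feed into a common coordinate system so a single map view can draw from all of them. The layer names and file names below are assumptions, not the tool's actual sources.

```python
# A minimal, hedged sketch of keeping the fused feeds as one catalog of map
# layers in a common coordinate system; every file name below is assumed.
import geopandas as gpd

LAYER_SOURCES = {
    "outages": "oms_outages.geojson",          # restoration times, customers impacted
    "crews": "crew_gps.geojson",               # company and external crew locations
    "liaisons": "community_liaisons.geojson",
    "facilities": "company_logistics_sites.geojson",
    "weather": "wind_and_flood_zones.geojson",
    "gas_assets": "gas_assets.geojson",
    "critical_customers": "hospitals_schools.geojson",
}

def load_catalog(sources: dict) -> dict:
    """Load every feed and reproject it so one map view can draw them all."""
    return {name: gpd.read_file(path).to_crs(epsg=4326) for name, path in sources.items()}

catalog = load_catalog(LAYER_SOURCES)
print({name: len(layer) for name, layer in catalog.items()})
```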

During the previous three months, the team also took advantage of the platform-agnostic nature of some mobile development tools to prototype and test an iOS/Android-based damage-assessment tool in the field. Based on positive feedback, the tool was piloted in New England during the storm. The tool was now in the hands of personnel from different parts of the utility: leadership, operations and customer communications.
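
As a rough idea of what such a field tool captures, the sketch below defines a minimal damage report record and serializes it for upload. Every field name is an assumption, not the actual National Grid schema.

```python
# Hedged sketch of the kind of record a field damage-assessment app might
# capture and upload; all field names are assumptions, not the real schema.
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
from typing import Optional

@dataclass
class DamageReport:
    assessor_id: str
    latitude: float
    longitude: float
    damage_type: str               # e.g. "broken pole", "wire down", "tree on line"
    photo_filename: Optional[str]  # photo taken in the field, if any
    reported_at: str               # ISO 8601 timestamp

report = DamageReport(
    assessor_id="crew-ne-042",
    latitude=42.27,
    longitude=-71.80,
    damage_type="broken pole",
    photo_filename="IMG_0031.jpg",
    reported_at=datetime.now(timezone.utc).isoformat(),
)

# Serialize for upload to the central situational-awareness database.
print(json.dumps(asdict(report)))
```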

As the storm unfolded, everyone was able to work from the same information and the same tool and, so to speak, stay on the same page. This allowed consistent and effective communications, both inside and outside the utility. Most notable was the feedback from municipal and regulatory contacts, who appreciated how fast community liaisons could give them the information needed to make their decisions. The easy-to-use damage-assessment tool was successful, too. It enabled fast damage communications from field personnel, previously a point of significant delay. The tool was also used to relay photos of significant damage from the field to others for review.

Staging area for National Grid and mutual aid crews for Hurricane Sandy response.

A New Horizon

National Grid had turned a corner. What had been a strained relationship with customers and their advocates was now one in which all stakeholders’ needs were met. It was the right path.

In February 2013, yet another major storm, winter storm Nemo, hit National Grid’s service territory, bringing 40 inches (1 m) of snow and heavy winds. During this event, the damage-assessment program was expanded even further, the tool was made available to an even wider audience, and more data feeds were added. One of these was Twitter. What customers were saying about the storm response could be viewed by approximate location, based on the location settings of their Twitter accounts. Information also could be sent out through these social media feeds, including pictures to help the utility communicate actual damage to customers. In this age of social media, the possibilities have expanded.
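
A hedged sketch of that idea follows: it groups already-collected, geotagged tweets by rough location so storm-response chatter can be viewed town by town. The records and bounding box are placeholders; no actual Twitter API integration is shown.

```python
# Hedged sketch: group already-collected, geotagged tweets by rough location
# so storm-response chatter can be viewed town by town. The records and the
# bounding box are placeholders; no actual Twitter API integration is shown.
SERVICE_TERRITORY = {"min_lat": 40.9, "max_lat": 45.1, "min_lon": -79.8, "max_lon": -69.8}

tweets = [
    {"text": "Still no power on Elm St", "lat": 42.26, "lon": -71.80, "town": "Worcester"},
    {"text": "Crews on our street, thank you!", "lat": 43.05, "lon": -76.15, "town": "Syracuse"},
    {"text": "Out of the service area", "lat": 39.95, "lon": -75.17, "town": "Philadelphia"},
]

def in_territory(tweet: dict) -> bool:
    """Keep only tweets whose coordinates fall inside the service territory."""
    b = SERVICE_TERRITORY
    return b["min_lat"] <= tweet["lat"] <= b["max_lat"] and b["min_lon"] <= tweet["lon"] <= b["max_lon"]

by_town = {}
for tweet in filter(in_territory, tweets):
    by_town.setdefault(tweet["town"], []).append(tweet["text"])

print(by_town)   # {'Worcester': [...], 'Syracuse': [...]}
```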

Looking Ahead

National Grid is excited about the benefits it is reaping from new technologies. There are plans to use these and other new technologies in the future for other needs, including wires-down response and management, bringing back repair data electronically and further information-sharing possibilities.

It should be noted that, while the focus here has been on the major events just described, there are many other applications for the technology and processes in day-to-day utility operations, including vegetation management and work closure.

The major lesson learned from this endeavor is that customers and their expectations are changing. The best results come when everyone is working from the same sources and communicating the same messages. This does not need to be a costly effort. Technology and costs have aligned to make it easy to jump in and see benefits quickly with a small investment. Go for it!

Cheri Warren ([email protected]) serves as vice president of asset management at National Grid. She and her team develop 15-year capital plans (about US$1 billion invested annually) and maintenance policies and strategies, define asset information governance, develop complex geospatial models for predicting performance, conduct RD&D, and set the future direction for assets through smart grid/utility-of-the-future work. She has BSEE and MSEE degrees and is on the IEEE board of directors.

Preston Large ([email protected]) is the manager of asset data and analytics. He oversees processes related to the use of electric and gas asset information for National Grid in the U.S. His team works to develop processes and tools to better manage information quality and availability while bringing innovative approaches to raise awareness of how data impacts decisions and actions at all levels of the organization.

Companies mentioned:

Amazon | www.amazon.com

ESRI | www.esri.com

National Grid | www.nationalgridus.com
