Photo by Igor Akimov, Dreamstime.com

Advancements in Distribution Grid Monitoring Enable Improved Voltage Control for Utilities

Aug. 4, 2020
The proliferation of larger-scale distributed energy resources (DER) along feeders is rendering traditional models and regulation techniques incapable of maintaining delivered voltages within ANSI C84.1 guidelines.

The power distribution grid is undergoing unprecedented levels of change. The traditional one-way model of voltage regulation assumed that voltages dropped predictably along feeders from substation to customers. The growth of larger-scale DER along those feeders, however, is rendering those models and regulation techniques incapable of maintaining delivered voltages within ANSI C84.1 guidelines.

This is spurring new approaches to grid measurement, monitoring and control that deliver real-time data, enabling distribution management applications to better manage voltages and maintain high power quality.

The traditional power delivery model pushes electricity from a centralized generation plant through distribution feeders to the point of consumption. Power is consumed along the line, with utilities using tap changers, voltage regulators and capacitor banks to keep delivered voltage within the ANSI guideline range of +/-5% all the way to the end of the line. Historically, the key concern was ensuring voltages did not drift outside this band.
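To make the guideline concrete, the short sketch below checks a measured service voltage against the +/-5% band on a 120-V base, which corresponds to the familiar 114-V to 126-V Range A service limits. The function name and nominal base are illustrative choices, not taken from any particular utility system.

```python
# A minimal sketch of the ANSI C84.1 +/-5% service-voltage check.
# The 120 V base and function name are illustrative assumptions.

NOMINAL_V = 120.0  # nominal service voltage on a 120 V base

def within_ansi_range_a(measured_v: float, nominal_v: float = NOMINAL_V) -> bool:
    """Return True if the measured voltage is within +/-5% of nominal."""
    low, high = 0.95 * nominal_v, 1.05 * nominal_v
    return low <= measured_v <= high

# 114 V and 126 V are the Range A limits on a 120 V base.
print(within_ansi_range_a(118.2))  # True: within band
print(within_ansi_range_a(127.5))  # False: overvoltage
```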

Enter DER: electricity-producing resources or controllable loads connected to a local distribution system. DER can include solar panels, wind turbines, battery storage, generators and electric vehicles.

These points of power generation inject electricity along the distribution feeder, which can push voltage levels outside ANSI guidelines. In other words, increasing integration of renewables means variable load and generation fluctuations that work against the constant-voltage-profile model, as the rough calculation below illustrates.
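A first-order approximation shows why injection moves voltage: the local rise is roughly (P·R + Q·X)/V, where P and Q are the injected real and reactive power and R and X the upstream feeder impedance. The sketch below works one made-up example; all values are illustrative assumptions, not data from the article.

```python
# A rough sketch of the common first-order voltage-rise approximation
# dV ~= (P*R + Q*X) / V at a DER injection point. All numbers below
# are invented for illustration.

def voltage_rise(p_w: float, q_var: float, r_ohm: float, x_ohm: float,
                 v_nom: float) -> float:
    """Approximate per-phase voltage rise (V) at a DER injection point."""
    return (p_w * r_ohm + q_var * x_ohm) / v_nom

# Example: a hypothetical 500 kW PV injection at unity power factor on a
# feeder with R = 0.5 ohm, X = 0.4 ohm upstream, 7.2 kV line-to-neutral.
dv = voltage_rise(500e3, 0.0, 0.5, 0.4, 7200.0)
print(f"Approximate rise: {dv:.0f} V ({100 * dv / 7200:.2f}% of nominal)")
```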

In addition, solar and wind DER are by nature intermittent. Managing that unpredictability without measurement, monitoring and control is even more difficult and may result in oscillatory voltages. Voltage rises at injection points may also drive reverse power flow through the system.

As a result, utilities require more advanced power monitoring and control systems that can measure voltage precisely and quickly enough for their distribution management systems (DMS) to respond and regulate feeder voltages. In short, DER integration demands real-time data to drive these control strategies.

The concern is that unpredictable voltage delivery will disrupt service to household, commercial and industrial customers all along the feeder, and could damage motors and equipment along the line as well. The challenge of controlling unpredictable, variable and potentially bidirectional power flow starts with measurement. The only way to manage this kind of variability is to take accurate measurements along distribution feeders and communicate the data to control systems fast enough to modulate the voltage and keep it under control. In essence, in real time.

Voltage monitoring and control has traditionally been the domain of the DMS. These systems have evolved over the years, with advanced DMS now employing Volt/VAR optimization (VVO), in which capacitor banks, voltage regulators and solid-state devices are switched on and off to maintain acceptable levels of power factor and voltage; a simplified sketch of this switching logic appears below. More recently, distributed energy resource management systems (DERMS) have emerged in response to the increasing amount of renewables-based DER. These are complex platforms for monitoring and controlling distributed sources of energy.
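The sketch below illustrates the general idea behind VVO switching with a deliberately simplified, hypothetical control rule for a single capacitor bank. Real VVO engines solve a coordinated optimization across many devices; the thresholds and names here are invented for illustration.

```python
# A highly simplified, hypothetical sketch of Volt/VAR switching logic:
# one capacitor bank switched on measured voltage and power factor.
# Thresholds are illustrative, not from any DMS product.

def vvo_step(voltage_pu: float, power_factor: float, cap_on: bool) -> bool:
    """Return the desired capacitor bank state for one control step."""
    if voltage_pu > 1.05 and cap_on:
        return False  # overvoltage: shed the bank's reactive support
    if voltage_pu < 0.95 and not cap_on:
        return True   # undervoltage: switch the bank in to boost voltage
    if power_factor < 0.95 and not cap_on and voltage_pu < 1.04:
        return True   # lagging power factor: add VAR support
    return cap_on     # otherwise hold the current state

# Example: a feeder running slightly low and lagging switches the bank in.
print(vvo_step(voltage_pu=0.94, power_factor=0.92, cap_on=False))  # True
```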

A DERMS requires accurate, real-time measurement of voltages, loads, reactive power, fault data and even weather data. A key consideration has been how to design and install these monitoring systems cost-effectively. This has called into question the traditional approach of grid monitoring with conventional magnetic current transformers (CTs) and potential transformers (PTs): installing CTs and PTs is expensive and time-consuming, and the feeder must be powered down for the work.

An alternative, lower-cost approach is to employ low-voltage (0-10 V ac) sensor technology for all voltage and current measurements. These sensors are safe, accurate for all required measurements and can be installed without taking an outage. Their outputs scale back to primary quantities by a simple ratio, as sketched below.
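Converting a 0-10 V sensor output back to a primary quantity is a straightforward ratio calculation; the ratios below are invented for illustration, and actual scale factors come from a sensor's published specifications.

```python
# A minimal sketch of scaling 0-10 V sensor outputs to primary values.
# Both ratios are illustrative assumptions.

V_SENSOR_RATIO = 1000.0  # e.g. 10 V output represents 10 kV primary
I_SENSOR_RATIO = 60.0    # e.g. 10 V output represents 600 A primary

def primary_voltage(sensor_v: float) -> float:
    return sensor_v * V_SENSOR_RATIO

def primary_current(sensor_v: float) -> float:
    return sensor_v * I_SENSOR_RATIO

print(primary_voltage(7.2))  # 7200 V line-to-neutral
print(primary_current(2.5))  # 150 A load current
```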

Three raw voltages and currents can be wired to a distribution grid monitor (DGM), a pole-top measurement system, from which dozens of useful measurements can be derived, including voltages accurate to better than 0.5%, loads, power factor, and real and reactive power. An ANSI 51 overcurrent element enables reporting of fault pickup and peak fault currents. All these measurements are reported to the DMS and DERMS via DNP3 over radio.
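The sketch below suggests how such derived measurements might be computed from one cycle of sampled per-phase voltage and current, including a crude overcurrent pickup check in the spirit of an ANSI 51 element. The sampling parameters and pickup threshold are illustrative assumptions, not the DGM's actual implementation.

```python
# A sketch of measurements derivable from sampled waveforms: RMS values,
# real/reactive/apparent power, power factor, and a crude ANSI 51-style
# pickup check. Sample counts and thresholds are invented.

import numpy as np

def rms(samples: np.ndarray) -> float:
    return float(np.sqrt(np.mean(samples ** 2)))

def derived_measurements(v: np.ndarray, i: np.ndarray) -> dict:
    v_rms, i_rms = rms(v), rms(i)
    p = float(np.mean(v * i))                   # real power (W)
    s = v_rms * i_rms                           # apparent power (VA)
    q = float(np.sqrt(max(s**2 - p**2, 0.0)))   # reactive power (var)
    return {"v_rms": v_rms, "i_rms": i_rms, "p": p, "q": q,
            "pf": p / s if s else 1.0}

def overcurrent_pickup(i_rms: float, pickup_a: float = 600.0) -> bool:
    """Flag currents above a fixed pickup threshold (illustrative)."""
    return i_rms > pickup_a

# Example: one 60 Hz cycle, current lagging voltage by 30 degrees.
t = np.linspace(0, 1 / 60, 256, endpoint=False)
v = 7200 * np.sqrt(2) * np.sin(2 * np.pi * 60 * t)
i = 150 * np.sqrt(2) * np.sin(2 * np.pi * 60 * t - np.pi / 6)
m = derived_measurements(v, i)
print({k: round(val, 1) for k, val in m.items()})  # pf ~= 0.87
print(overcurrent_pickup(m["i_rms"]))              # False under normal load
```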

Trends indicate DER integration will increase each year, and with it the need to maintain voltages, power factor and frequency within desired limits. New grid measurement and monitoring technologies are essential to keeping these factors under control.
