By Rao Konidena

Most transmission providers and grid operators have experience using complex calculations to determine wind capacity credit, a process that has largely flown under renewable developers’ radar. Now, transmission organisations are adopting the same calculation for solar and storage.

Additionally, grid operators that did not use this method before are picking it up because of the amount of solar in their interconnection queues.

With the threat of blackouts and brownouts, and increasing penetrations of solar and storage in capacity forecasts, it is increasingly important for renewable developers to understand these calculations and voice their opinions on them.

There are two industry-accepted methods that grid operators are pursuing. Only the average method is favourable for renewable developers because the marginal method discounts renewable capacity contributions and works in favour of thermal resources.

Effective Load Carrying Capability (ELCC)

When wind penetration was 1,000 MW and forecasts indicated 10,000 MW within 10 years due to state Renewable Portfolio Standards (RPS), grid operators had to find a way to calculate how much capacity value they should assign to wind. This need arose when wind integration studies were being done at most transmission organisations. The wind industry quickly accepted a calculation that worked with existing Loss of Load Expectation (LOLE) models, because LOLE models were already being run to determine the planning reserve margin, which is where a capacity credit for wind was first needed. Until then, there was no need to calculate capacity credit; it was widely accepted that thermal resources would get 100% because they showed up to perform during peak demand hours. Thermal resources’ outage statistics under peak demand were captured under a different metric, the Equivalent Forced Outage Rate demand (EFORd).
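In concept, the calculation works like this: the ELCC of a resource is the amount of additional flat load the system can carry, once the resource is added, while LOLE stays at its original level (such as 0.1 days/year). The Python sketch below is a hypothetical, simplified illustration of that definition, not any grid operator’s production LOLE model; the Monte Carlo approach, the once-per-year outage draws, the 50 MW step size and all names are assumptions made for brevity.

```python
# Hypothetical sketch of the ELCC definition against an LOLE model.
# Not a production reliability model; outage draws and step sizes are simplified.
import numpy as np

rng = np.random.default_rng(0)

def lole_days_per_year(hourly_load_mw, thermal_caps_mw, efor, hourly_renewable_mw,
                       n_draws=1000):
    """Monte Carlo estimate of LOLE in days/year from 8,760 hourly values."""
    loss_days = 0
    for _ in range(n_draws):
        # Draw one availability state per thermal unit for the whole year
        # (real models draw far more granular outage states).
        available = thermal_caps_mw * (rng.random(len(thermal_caps_mw)) > efor)
        margin = available.sum() + hourly_renewable_mw - hourly_load_mw
        # A loss-of-load day is any day containing at least one deficit hour.
        daily_deficit = (margin < 0).reshape(365, 24).any(axis=1)
        loss_days += daily_deficit.sum()
    return loss_days / n_draws

def elcc_mw(hourly_load_mw, thermal_caps_mw, efor, hourly_renewable_mw,
            step_mw=50.0, max_mw=20000.0):
    """ELCC = the flat load addition the system can carry, with the resource
    added, while keeping LOLE at or below its pre-addition level."""
    base_lole = lole_days_per_year(hourly_load_mw, thermal_caps_mw, efor,
                                   np.zeros_like(hourly_load_mw, dtype=float))
    added_load = 0.0
    while added_load + step_mw <= max_mw:
        lole = lole_days_per_year(hourly_load_mw + added_load + step_mw,
                                  thermal_caps_mw, efor, hourly_renewable_mw)
        if lole > base_lole:
            break
        added_load += step_mw
    return added_load
```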

For wind, the ELCC calculation method depended heavily on a small number of historical peak load hours. Its main job was to determine the value of wind capacity based on whether the wind was available during the top eight peak demand hours. As the Midcontinent Independent System Operator (MISO) Wind and Solar Capacity Credit report states, “Tracking the top eight daily peak hours in a year is sufficient to capture the peak load times that contribute to the annual LOLE of 0.1 days/year”.
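A simplified reading of that shortcut (not the full MISO methodology, and with hypothetical names and data) is to take the wind fleet’s average output during the year’s top eight peak demand hours as a share of nameplate:

```python
# Hypothetical sketch of the peak-hour shortcut: capacity credit approximated as
# average wind output during the year's top eight peak demand hours.
import numpy as np

def peak_hour_capacity_credit(hourly_load_mw, hourly_wind_mw, nameplate_mw,
                              n_peak_hours=8):
    # Indices of the highest-demand hours of the year.
    peak_idx = np.argsort(hourly_load_mw)[-n_peak_hours:]
    # Average wind output coincident with those peaks, as a fraction of nameplate.
    return hourly_wind_mw[peak_idx].mean() / nameplate_mw

# Illustrative use with made-up 8,760-hour arrays:
# credit = peak_hour_capacity_credit(load, wind, nameplate_mw=10_000.0)
# capacity_contribution_mw = credit * 10_000.0   # e.g. 0.40 -> 4,000 MW
```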

The initial set of ELCC values for wind penetration at 10,000 MW was high, in the 40% range. Hence the capacity contribution for 10,000 MW of wind was determined to be 4,000 MW. As more wind was added to the grid (25,000 MW in total), this value fell to 15%: with that much wind already available to serve the top eight peak hours, reliability was already better (less chance of blackout or brownout conditions), so each additional megawatt of wind contributed less to it.

Average ELCC or Marginal ELCC?

Let us consider solar and storage in this ELCC world, because some states have announced goals of 100% carbon-free electricity (by 2040 in New York, by 2045 in California, and by 2050 in Wisconsin). Since it is common knowledge that the capacity credit falls as wind penetration increases, renewable developers want to avoid a similar situation with solar. Hence energy storage is added at the point of interconnection to prop up the capacity value of solar. These interconnection requests are called hybrid interconnections, mostly solar+storage, though they can also include wind plus storage and other combinations of renewables with storage.

Calculating ELCC for solar alone and for solar+storage clearly shows the capacity benefit of storage to the grid operator. Where capacity markets exist, grid operators must run these ELCC calculations to determine the capacity credit for all renewable resources. Since most states lack the engineers to run these complex calculations, they apply the grid operator’s capacity credit calculation to renewable resources in their Integrated Resource Plan (IRP) proceedings.
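A rough illustration of why the pairing helps (a hypothetical sketch, not a market rule or any operator’s dispatch model; the function, its parameters and the greedy peak-hour dispatch are all assumptions) is a battery that discharges during the highest-demand hours, lifting the output that ELCC-style calculations reward:

```python
# Hypothetical sketch: add a simple peak-targeted battery dispatch on top of a
# solar profile, then feed the combined profile into an ELCC-style calculation.
import numpy as np

def solar_plus_storage_profile(hourly_solar_mw, hourly_load_mw,
                               storage_power_mw, storage_energy_mwh,
                               n_peak_hours=8):
    combined = hourly_solar_mw.astype(float).copy()
    # Discharge at full power during the year's highest-demand hours until the
    # assumed energy budget runs out (charging and losses ignored for brevity).
    peak_idx = np.argsort(hourly_load_mw)[-n_peak_hours:]
    energy_left = float(storage_energy_mwh)
    for h in peak_idx:
        discharge = min(storage_power_mw, energy_left)
        combined[h] += discharge
        energy_left -= discharge
    return combined
```

Running the same ELCC or peak-hour calculation on this combined profile, instead of the solar-only profile, shows the uplift the comparison above refers to.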

With solar and storage in the mix, ELCC is now discussed in at least two ways: an average ELCC (MISO’s wind capacity credit) and a marginal ELCC (California IRP). An average ELCC averages the ELCC values calculated at each of the eight peak demand hours. A marginal ELCC, on the other hand, looks at the next MW the resource can provide toward meeting the one-day-in-10-years reliability standard, because that MW could come from either a thermal or a non-thermal resource. Both have their uses, but grid operators seem focused on adopting marginal ELCC to ensure reliability, because they are convinced that with more renewables on the grid, there is a need for more dispatchable capacity.
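One common way to operationalise that distinction is sketched below, reusing the hypothetical elcc_mw() helper from the earlier LOLE example; the fleet-average versus incremental-tranche framing is a simplification of the procedures discussed above, and the function and parameter names are assumptions.

```python
# Hypothetical sketch: average ELCC rates the whole installed fleet, while
# marginal ELCC rates only the next increment added on top of that fleet.
# Reuses the illustrative elcc_mw() helper defined in the earlier sketch.

def average_elcc_pct(load, caps, efor, fleet_profile_mw, fleet_nameplate_mw):
    # ELCC of the entire solar (or wind) fleet, divided by its nameplate.
    return 100.0 * elcc_mw(load, caps, efor, fleet_profile_mw) / fleet_nameplate_mw

def marginal_elcc_pct(load, caps, efor, fleet_profile_mw,
                      increment_profile_mw, increment_nameplate_mw):
    # ELCC of (fleet + increment) minus ELCC of the fleet alone, divided by the
    # increment's nameplate: the credit the next MW actually earns.
    with_increment = elcc_mw(load, caps, efor,
                             fleet_profile_mw + increment_profile_mw)
    fleet_only = elcc_mw(load, caps, efor, fleet_profile_mw)
    return 100.0 * (with_increment - fleet_only) / increment_nameplate_mw
```

Because each new tranche of a correlated resource adds less reliability value than the last, the marginal figure typically comes in below the average figure, which is why the average method is the more favourable one for renewable developers.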

Interestingly, renewable advocates are showing data that thermal resources such as natural gas units have issues even when assigned 100% capacity value, including frozen gas compressor stations and pipeline unavailability, on top of their regular operational and maintenance issues. Typically, utilities exclude such supply constraints under the “Outside Management Control” (OMC) cause codes in the NERC Generating Availability Data System (GADS) database, which results in better outage statistics (XEFORd) and therefore a higher unforced capacity value for those thermal resources. —Renewable Energy World