  1. Has Wind Energy Forecasting Solved the Challenge Posed by Intermittency? Evidence from the United Kingdom
  Kevin F. Forbes and Ernest M. Zampelli, School of Business and Economics, The Catholic University of America, Forbes@CUA.edu
  9th Annual Trans-Atlantic Infraday, Federal Energy Regulatory Commission, Washington, DC, 30 October 2015

  The Organization of this Talk
  1) Why is Forecasting Important?
  2) The Literature on Wind Energy Forecast Accuracy
  3) What is the level of forecast skill? Specifically, what does the Mean Squared Error Skill Score (MSESS) indicate about the solar and wind energy forecasts? How does this level of accuracy compare to the accuracy of the load forecasts?
  4) From the point of view of a system operator, how does wind energy compare with conventional forms of generation?
  5) What are the prospects for improving the accuracy of the wind, solar, and load forecasts?

  2. 1) Why is Forecasting Important?
  • The stability of the power grid is enhanced when forecasts are more accurate. This is important because blackouts have very high societal costs.
  • Some forms of balancing technologies, such as open-cycle gas turbines, have above-average emissions factors.
  • The market price of upward balancing power can be very high.

  [Figure: Errors in the Day-Ahead Load Forecast for New York City and the Differential between the Real-Time and Day-Ahead Prices in New York City, 6 August 2009 – 30 June 2013. Note: Excludes the period when operations were affected by Superstorm Sandy in late October 2012.]

  3. 2) The Literature on Forecast Accuracy
  Some researchers calculate a root-mean-squared error of the forecasts and then weight it by the capacity of the equipment used to produce the energy. The reported capacity-weighted root-mean-squared errors (CWRMSE) are usually less than 10 percent. Adherents of this approach include Lange et al. (2006, 2007), Cali et al. (2006), Krauss et al. (2006), Holttinen et al. (2006), Kariniotakis et al. (2006), and even NERC (2010, p. 9).
  In a publication entitled "Wind Power Myths Debunked," Milligan et al. (2009) draw on research from Germany to argue that it is a fiction that wind energy is difficult to forecast. In their words: "In other research conducted in Germany, typical wind forecast errors for a single wind project are 10% to 15% root-mean-squared error (RMSE) of installed wind capacity (emphasis added) but drop to 5% to 7% for all of Germany." (Milligan et al. 2009, p. 93)
  The UK's Royal Academy of Engineering (2014, p. 33) has noted that wind energy's capacity-weighted forecast error of about five percent is evidence that the wind energy forecasts are highly accurate. A report by the IPCC (2012, p. 623) on renewable energy indicates that wind energy is moderately predictable, as evidenced by a capacity-weighted RMS forecast error that is less than 10%. Solar energy forecasts are reported to be even more accurate.

  The Literature on Forecast Accuracy (Continued)
  NREL (2013) implicitly endorses capacity-weighted RMSEs for wind energy but makes use of energy-weighted RMSEs when discussing the accuracy of load forecasts. In contrast, Forbes et al. (2012) calculate a root-mean-squared forecast error for wind energy in nine electricity control areas. The RMSEs are normalized by the mean level of wind energy that is produced. The reported energy-weighted root-mean-squared errors (EWRMSE) are in excess of 20 percent. A brief sketch of the difference between the two normalizations is shown below.
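The following Python sketch is not from the presentation; the numbers are simulated and the capacity figure is an assumption. It simply illustrates why the two normalizations can tell very different stories: because wind plants typically produce well below installed capacity, the same RMSE looks far smaller when divided by capacity (CWRMSE) than when divided by mean production (EWRMSE).

```python
import numpy as np

# Hypothetical illustration of CWRMSE vs. EWRMSE with simulated data.
rng = np.random.default_rng(0)

installed_capacity_mw = 1000.0                        # assumed installed wind capacity
actual_mw = rng.uniform(0, 600, size=8760)            # simulated hourly wind output
forecast_mw = actual_mw + rng.normal(0, 60, 8760)     # simulated forecast with ~60 MW errors

rmse = np.sqrt(np.mean((forecast_mw - actual_mw) ** 2))

cwrmse = rmse / installed_capacity_mw    # normalized by installed capacity
ewrmse = rmse / actual_mw.mean()         # normalized by mean energy produced

print(f"RMSE: {rmse:.1f} MW")
print(f"Capacity-weighted RMSE: {cwrmse:.1%}")   # looks small
print(f"Energy-weighted RMSE:   {ewrmse:.1%}")   # considerably larger
```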

  4. 3) Using the Mean-Squared-Error Skill Score (MSESS) to Assess Forecast Accuracy
  • A useful alternative to both the energy-weighted and capacity-weighted RMSE is the mean-squared-error skill score (MSESS). With this metric, one can evaluate the skill of a forecast relative to a persistence forecast, a persistence forecast being a period-ahead forecast that assumes that the outcome in period t equals the outcome in period t-1. The MSESS with the persistence forecast as a reference is calculated as follows:

  MSESS = 1 - MSE_forecast / MSE_persistence

  where MSE_forecast is the mean squared error of the forecast that is being evaluated and MSE_persistence is the mean squared error of a persistence forecast. A perfect forecast would have a MSESS equal to one. A MSESS equal to zero indicates that the forecast skill is equal to that of a persistence forecast. A negative MSESS indicates that the forecast under evaluation is inferior to a persistence forecast. A minimal computational sketch of this formula appears after the list of control areas below.

  How accurate are the forecasts?
  • MSESS were computed for the following zones and/or control areas:
  • Bonneville Power Administration
  • CAISO: SP15 and NP15
  • MISO
  • PJM
  • 50Hertz in Germany
  • Amprion in Germany
  • Elia in Belgium
  • RTE in France
  • National Grid in Great Britain
  • Finland
  • Sweden
  • Norway
  • Eastern Denmark
  • Western Denmark
  • When possible, the MSESS are reported for Wind, Solar, and Load.
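A minimal Python sketch of the MSESS calculation, using made-up numbers rather than any of the data sets above. The persistence lag is one observation at whatever granularity the series has (hourly, half-hourly, quarter-hourly, etc.).

```python
import numpy as np

def msess(actual, forecast):
    """Mean-squared-error skill score with a persistence forecast as reference.

    The persistence forecast for period t is the outcome in period t-1,
    so the first observation is dropped from both error series.
    """
    actual = np.asarray(actual, dtype=float)
    forecast = np.asarray(forecast, dtype=float)

    mse_forecast = np.mean((forecast[1:] - actual[1:]) ** 2)
    mse_persistence = np.mean((actual[:-1] - actual[1:]) ** 2)
    return 1.0 - mse_forecast / mse_persistence

# Toy example: a forecast that tracks the outcome closely scores near 1;
# a forecast worse than persistence would score below 0.
actual = [100, 110, 105, 120, 130, 125]
good_forecast = [101, 108, 106, 119, 131, 124]
print(round(msess(actual, good_forecast), 3))  # close to 1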

  5. Mean Squared Error Skill Scores (MSESS) with a Persistence Forecast as Reference

  Control Area/Zone | Forecast Type | Sample Period | Observations | Granularity | MSESS
  50Hertz (Germany) | Day-Ahead Load | 1Jan2011 – 31Dec2013 | 104,590 | Quarter-Hour | -62.7486
  50Hertz (Germany) | Day-Ahead Wind | 1Jan2011 – 31Dec2013 | 104,590 | Quarter-Hour | -31.3501
  50Hertz (Germany) | Day-Ahead Solar | 1Jan2011 – 31Dec2013 | 54,545 | Quarter-Hour | -5.2683 [1]
  Amprion (Germany) | Day-Ahead Load | 1Jan2011 – 31Dec2013 | 103,326 | Quarter-Hour | -12.3308
  Amprion (Germany) | Day-Ahead Wind | 1Jan2011 – 31Dec2013 | 103,326 | Quarter-Hour | -14.5887
  Amprion (Germany) | Day-Ahead Solar | 1Jan2011 – 31Dec2013 | 55,498 | Quarter-Hour | -11.2069 [1]

  Mean Squared Error Skill Scores (MSESS) with a Persistence Forecast as Reference (Continued)

  Control Area/Zone | Forecast Type | Sample Period | Observations | Granularity | MSESS
  California ISO | Day-Ahead Load | 1Jan2013 – 31Dec2013 | 8,760 | Hourly | 0.6026
  NP15 | Day-Ahead Wind | 1Jan2013 – 31Dec2013 | 8,704 | Hourly | -6.1401
  NP15 | Hour-Ahead Wind | 1Jan2013 – 31Dec2013 | 8,704 | Hourly | -2.3605
  NP15 | Day-Ahead Solar | 1Jan2013 – 31Dec2013 | 8,666 | Hourly | -3.2002
  NP15 | Hour-Ahead Solar | 1Jan2013 – 31Dec2013 | 8,666 | Hourly | -2.4846
  SP15 | Day-Ahead Wind | 1Jan2013 – 31Dec2013 | 8,752 | Hourly | -4.8210
  SP15 | Hour-Ahead Wind | 1Jan2013 – 31Dec2013 | 8,752 | Hourly | -2.1894
  SP15 | Day-Ahead Solar | 1Jan2013 – 31Dec2013 | 8,752 | Hourly | 0.7050
  SP15 | Hour-Ahead Solar | 1Jan2013 – 31Dec2013 | 8,752 | Hourly | 0.7972

  6. Mean Squared Error Skill Scores (MSESS) with a Persistence Forecast as Reference (Continued)

  Control Area/Zone | Forecast Type | Sample Period | Observations | Granularity | MSESS
  Belgium | Day-Ahead Solar | 1Jan2013 – 31Dec2013 | 17,921 | Quarter-Hour | -12.262 [1]
  Belgium | Intra-Day Solar | 1Jan2013 – 31Dec2013 | 11,278 | Quarter-Hour | -9.793 [1]
  France | Day-Ahead Load | 1Jan2012 – 31Dec2013 | 35,088 | Half-Hourly | 0.3842
  France | Day-Ahead Wind | 1Jan2012 – 31Dec2013 | 17,349 | Hourly | -5.7375
  France | Same Day, Hour-1 Wind | 1Jan2012 – 31Dec2013 | 15,109 | Hourly | -5.2889
  Norway | Day-Ahead Load | 1Jan2011 – 31Dec2013 | 26,160 | Hourly | 0.1870
  Sweden | Day-Ahead Load | 1Jan2011 – 31Dec2013 | 26,160 | Hourly | 0.2008
  Finland | Day-Ahead Load | 1Jan2011 – 31Dec2013 | 26,159 | Hourly | 0.0486
  Eastern Denmark | Day-Ahead Load | 1Jan2011 – 31Dec2013 | 26,160 | Hourly | 0.3953
  Eastern Denmark | Day-Ahead Wind | 1Jan2011 – 31Dec2013 | 26,107 | Hourly | -2.7507
  Western Denmark | Day-Ahead Load | 1Jan2011 – 31Dec2013 | 26,160 | Hourly | 0.6560
  Western Denmark | Day-Ahead Wind | 1Jan2011 – 31Dec2013 | 26,105 | Hourly | -3.6749

  Mean Squared Error Skill Scores (MSESS) with a Persistence Forecast as Reference (Continued)

  Control Area/Zone | Forecast Type | Sample Period | Observations | Granularity | MSESS
  MISO | Day-Ahead Wind Energy | 1Jan2011 – 31Dec2013 | 26,303 | Hourly | -4.3873
  PJM | Day-Ahead Load | 1Jan2011 – 31Dec2013 | 26,160 | Hourly | 0.4727
  New York City | Day-Ahead Load | 1Jan2011 – 31Dec2013 | 25,675 | Hourly | 0.1703
  Bonneville Power | Five-Minute-Ahead Wind | 1Jan2012 – 31Dec2013 | 206,477 | Five-Minute | -36.2576 [2]
  Bonneville Power | Hour-Ahead Wind | 1Jan2012 – 31Dec2013 | 16,847 | Hourly | -63.5760 [2]
  Great Britain | Day-Ahead Load | 1Jan2012 – 31Dec2013 | 30,477 | Half-Hourly | 0.62
  Great Britain | Day-Ahead Wind | 1Jan2012 – 31Dec2013 | 30,477 | Half-Hourly | -19.03 [2]

  [1] Daylight portion of the sample period.
  [2] MSESS calculation excludes periods in which wind energy production was curtailed by the system operator.

  7. 4) From the point of view of a system operator, how does wind energy compare with conventional forms of generation? Evidence from Great Britain
  • In Great Britain, each generating station informs the system operator of its intended level of generation one hour prior to real time. This value is known as the final physical notification (FPN).
  • Generators also submit bids (a proposal to reduce generation) and offers (a proposal to increase generation) to provide balancing services.
  • During real time, the system operator accepts the bids and offers based on system conditions.
  • In short, the revised generation schedule equals the FPN plus the volume of balancing services requested by the system operator (see the sketch below).
  • Failure to follow the revised generation schedule gives rise to an electricity market imbalance that needs to be resolved by other generators.

  [Figure: The Revised Generation Schedules vs Actual Generation: The Case of Coal in Great Britain. Scatter plot of metered generation (MWh) against scheduled generation including balancing actions (MWh); EWRMSE = 2.5%.]
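A rough Python sketch of the schedule accounting described above, not taken from the presentation or from any actual Elexon/National Grid data fields; the function names are illustrative, and the sign convention (accepted offers add output, accepted bids subtract it) is an assumption consistent with the bullet definitions.

```python
def revised_schedule(fpn_mwh, accepted_offers_mwh, accepted_bids_mwh):
    # Revised schedule = FPN plus accepted offers (increase generation)
    # minus accepted bids (reduce generation). Sign convention assumed.
    return fpn_mwh + accepted_offers_mwh - accepted_bids_mwh

def imbalance(metered_mwh, fpn_mwh, accepted_offers_mwh, accepted_bids_mwh):
    # Deviation of metered generation from the revised schedule; a nonzero
    # value must be resolved by other generators in real time.
    return metered_mwh - revised_schedule(fpn_mwh, accepted_offers_mwh, accepted_bids_mwh)

# Illustrative numbers only, for a single unit in one settlement period.
print(revised_schedule(fpn_mwh=500.0, accepted_offers_mwh=50.0, accepted_bids_mwh=20.0))   # 530.0
print(imbalance(metered_mwh=525.0, fpn_mwh=500.0,
                accepted_offers_mwh=50.0, accepted_bids_mwh=20.0))                         # -5.0
```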
