User-oriented evaluation of fire spread predictions



  1. User-oriented evaluation of fire spread predictions. Beth Ebert 1, Nathan Faggian 2, Paul Fox-Hughes 1, Chris Bridge 2, Howard Jacobs 2, Catherine Jolley 2, Barb Brown 3, Stuart Matthews 4, Greg Esnouf 5. 1 Research and Development Branch, Bureau of Meteorology, Australia; 2 Weather Forecasting Branch, Bureau of Meteorology, Australia; 3 Research Applications Laboratory, National Center for Atmospheric Research, USA; 4 New South Wales Rural Fire Service, Australia; 5 Australian Fire and Emergency Services Authorities Council, Australia


  3. What is a fire spread simulator? A tool that models fire characteristics spatially: • Flame height • Intensity • Rate of spread • Area of impact. A simulator is a collection of fire behavior models that can be used to infer the fire danger. Weather forecasts are one of the principal drivers of the simulators. [Figure: simulated fire perimeter growth from T=0 to T=6 h]

  4. Cell-based fire spread models vs. geometric fire spread models. Adapted from: Fire Behaviour Knowledge in Australia, Cruz et al. 2014, Bushfire CRC, Technical Report EP145697

  5. Fire spread simulators in Australia: Prometheus, Phoenix, Spark, Australis. Which is best? The Bureau of Meteorology was asked to run and evaluate these fire spread simulators for a set of common cases from around Australia.

  6. User focus of the evaluation. Consultation with end users (fire agencies): • Kick-off workshop, site visits, consultations with simulator developers and fire behavior analysts • Understand how they use fire spread simulators • Understand what "good quality" means to them

  7. Verification planning template

  8. What do users want to know? • Management-level users: – Which simulator is best? – Best for a particular case study? • Fire behavior analysts (expert users): – How accurately does this simulator predict fire area, rate of spread, bearing? – How sensitive is this simulator to variations in weather, fuel, ignition location/time? • Simulator developers: – How can the uncertainty in weather inputs be quantified to assist in the discrimination between model errors and input errors?

  9. Data. 10 case studies for this project: • Fire boundaries (isochrones) from line scans or reconstructions – Limited, as agencies focus on protection of life and property – Prefer cases without suppression • Weather – Official weather forecast grids – Weather station observations • Fuel layers from agencies • Topography

  10. Sample simulations: State Mine fire, New South Wales, 16 October 2013

  11. Spatial verification metrics. Summary metric: • Threat score. Diagnostic metrics: • Bearing error • Forward spread error • Area error
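These metrics map naturally onto gridded burned/unburned masks. Below is a minimal sketch of how they could be computed, assuming the observed isochrones and simulated fire have already been rasterised onto a common grid; the function names, the ignition-cell argument, and the cell-size parameters are illustrative and not part of the presented framework.

```python
import numpy as np

def threat_score(sim, obs):
    """Threat score (critical success index) on boolean burned/unburned grids."""
    sim, obs = np.asarray(sim, dtype=bool), np.asarray(obs, dtype=bool)
    hits = np.sum(sim & obs)
    misses = np.sum(~sim & obs)
    false_alarms = np.sum(sim & ~obs)
    denom = hits + misses + false_alarms
    return hits / denom if denom else np.nan

def area_error(sim, obs, cell_area=1.0):
    """Burned-area difference (simulated minus observed), in units of cell_area."""
    return (np.sum(sim) - np.sum(obs)) * cell_area

def forward_spread_and_bearing(mask, ignition_rc, cell_size=1.0):
    """Distance and bearing (degrees clockwise from north) from the ignition
    cell (row, col) to the farthest burned cell in a boolean mask."""
    rows, cols = np.nonzero(mask)
    north = (ignition_rc[0] - rows) * cell_size  # row index decreases going north
    east = (cols - ignition_rc[1]) * cell_size
    dist = np.hypot(east, north)
    i = int(np.argmax(dist))
    bearing = np.degrees(np.arctan2(east[i], north[i])) % 360.0
    return dist[i], bearing
```

The diagnostic bearing and forward spread errors would then be the differences between these quantities evaluated on the simulated and observed masks.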

  12. Evaluation approach. For each simulator and all case studies: • Baseline performance – Simulate fire spread using forecast weather in the ignition grid cell(s) • Sensitivity studies – Perturb input weather – Perturb fuel, ignition location • Relative and absolute performance, shown as a portrait diagram (colour scale from better to poorer) after Gleckler et al., JGR 2008 (CMIP3)
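In the Gleckler et al. (2008) portrait-diagram convention, each (simulator, case) cell is coloured by its error relative to a reference across simulators, so "better" and "poorer" are defined against the field rather than in absolute terms. A hedged sketch of that normalisation, assuming an error matrix with one row per simulator and one column per case (the numbers and variable names are purely illustrative):

```python
import numpy as np

# errors[i, j]: an error measure (e.g. 1 - threat score) for simulator i on case j.
errors = np.array([[0.40, 0.55, 0.30],
                   [0.35, 0.60, 0.45],
                   [0.50, 0.50, 0.25]])

# Relative error: departure from the median simulator for each case,
# as a fraction of that median (negative = better than typical).
median_per_case = np.median(errors, axis=0)
relative_error = (errors - median_per_case) / median_per_case
```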

  13. Estimating uncertainty in weather inputs. For each case: • Verify 1-day weather forecasts (e.g. temperature, relative humidity) at the fire location against observations averaged over the three "nearest" AWS • Bin the forecast-minus-observation errors by hour, using all days of the month in which the fire occurred • Use the error PDF as a template for perturbing the weather inputs
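The slide describes building an empirical error distribution from forecast-minus-observation differences, binned by hour of day, and then drawing from it to perturb the weather that drives the simulator. A minimal sketch under those assumptions; the pandas series, the AWS-averaging step, and the number of perturbations are illustrative:

```python
import numpy as np
import pandas as pd

def hourly_error_samples(fcst, obs):
    """Group forecast-minus-observation errors by hour of day.

    fcst, obs: pandas Series with a timestamp index, e.g. forecast temperature
    at the fire location vs. the mean of the three nearest AWS, over all days
    of the month in which the fire occurred.
    """
    err = (fcst - obs).dropna()
    return err.groupby(err.index.hour)

def perturbed_series(fcst, hourly_errors, rng):
    """Draw one perturbed input series by resampling the hourly error PDF."""
    out = fcst.copy()
    for hour, errs in hourly_errors:
        idx = out.index.hour == hour
        out[idx] = out[idx] + rng.choice(errs.values, size=idx.sum())
    return out

# rng = np.random.default_rng(0)
# perturbations = [perturbed_series(fcst, hourly_error_samples(fcst, obs), rng)
#                  for _ in range(50)]
```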

  14. High-level view – relative performance (dashboard). • Management-level users want to know: – Which simulator is best? – Best for a particular case study? • Compare aggregate accuracy over all perturbed inputs to the whole population (overall or for each case)

  15. Deeper view – accuracy & sensitivity (modified Hinton diagram; example: State Mine fire, NSW, 16 October 2013). • Fire behaviour analysts (expert users) want to know: – How accurately does this simulator predict fire area, rate of spread, bearing? – How sensitive is this simulator to variations in weather, fuel, ignition location/time? • Box size shows sensitivity (how does the IQR compare to all IQRs?)
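Sensitivity in the modified Hinton diagram is summarised by the spread of scores across the perturbed runs. One way the box sizes could be derived, sketched here under the assumption that each (simulator, case) cell has an array of scores over its perturbations (the data structure and scaling choice are illustrative):

```python
import numpy as np

def iqr(scores):
    """Interquartile range of scores from the perturbed-input runs for one cell."""
    q25, q75 = np.percentile(scores, [25, 75])
    return q75 - q25

def relative_box_sizes(scores_by_cell):
    """Scale each cell's IQR by the largest IQR so box sizes are comparable.

    scores_by_cell: {(simulator, case): array of scores over perturbations}
    """
    iqrs = {key: iqr(vals) for key, vals in scores_by_cell.items()}
    max_iqr = max(iqrs.values())
    return {key: val / max_iqr for key, val in iqrs.items()}
```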

  16. Deeper view – accuracy & sensitivity (categorical performance diagram with lines of constant threat score; example: State Mine fire, NSW, 16 October 2013). • Fire behaviour analysts (expert users) want to know: – How accurately does this simulator predict fire area, rate of spread, bearing? – How sensitive is this simulator to variations in weather, fuel, ignition location/time? • Pink = below median, green = above median
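A categorical performance diagram plots probability of detection against success ratio, with curved isolines of constant threat score and straight lines of constant bias; each point comes from the same 2x2 contingency counts used for the threat score above. A small sketch of the coordinates, assuming hit/miss/false-alarm counts from comparing simulated and observed burned areas (a simplification, not the full framework):

```python
def performance_diagram_point(hits, misses, false_alarms):
    """Coordinates of one point on a categorical performance diagram."""
    pod = hits / (hits + misses)                  # probability of detection (y-axis)
    sr = hits / (hits + false_alarms)             # success ratio (x-axis)
    bias = (hits + false_alarms) / (hits + misses)
    ts = hits / (hits + misses + false_alarms)    # threat score: curved isolines
    return pod, sr, bias, ts
```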

  17. What did we learn? • No single simulator stood out overall as being superior to the others, and none performed well in all circumstances. All simulators over-predicted some fires and under-predicted others. • Simulators (and fires) are sensitive to weather, particularly wind. This highlights the value of an ensemble approach to the operational use of fire spread simulators. • This evaluation framework will be a community tool for evaluating fire spread simulators, and has already prompted the community to make significant improvements to their simulators. • We need more cases, and standards for observing and reporting fire behavior.
