SLIDE 1

Comparison of Hurricane Loss Projection Models
FSU Storm Center

June 19, 2008

SLIDE 2

Hurricane Modeling Background

  • Traditional methods of projecting hurricane loss costs were considered inadequate after Hurricane Andrew.
  • Hurricane modeling provided a more scientific approach, but has been considered controversial due to the proprietary nature of the models.
  • The Legislature recognized the need for expert evaluation of computer models to resolve conflicts among actuarial professionals and created a Commission.

SLIDE 3

Creation of Commission

  • In 1995, the Florida Legislature created the 11-member Florida Commission on Hurricane Loss Projection Methodology (see s. 627.0628, F.S.)
  • Panel of Independent Experts formed to “provide the most actuarially sophisticated guidelines and standards for projection of hurricane losses possible.”

SLIDE 4

Composition of the Commission

  • Three actuaries:
    – OIR (appointed by Director of OIR)
    – Insurance Industry (appointed by CFO)
    – Actuary Member of the FHCF Advisory Council
  • Experts from the State University System (appointed by the CFO):
    – Insurance Finance (Actuarial Science)
    – Statistics (Insurance)
    – Computer System Design
    – Meteorology (Hurricanes)
  • Insurance Consumer Advocate
  • Executive Director of Citizens
  • Senior FHCF Officer
  • Director, Division of Emergency Management
SLIDE 5

Names of Commission Members & Professional Team Members Over the Last 12 Years (current members bolded) – 55 Experts Involved

Commission member roles: Insurance Consumer Advocate; Senior FHCF Officer; Executive Director, Citizens; Director, Emergency Management; FHCF Actuary; OIR Actuary; Insurance Industry Actuary; Finance Expert; Statistics Expert; Computer System Design Expert; Meteorology Expert.

Commission members: Terry Butler; Jack Nicholson, PhD; Scott Wallace; Craig Fugate; Larry Johnson, FCAS; Howard Eagelfeld, FCAS; Bob Milligan; Bob Ricker; Joe Myers; Alice Gannon, FCAS; Sri Ramanujam, FCAS; Steve Burgess; Jay Newman; Myron Dye, FCAS; Kay Cleary, FCAS; Lauri Goldman; Ken Ritzenthaler, ACAS; Elsie Crowell; Kristen Bessette, FCAS; Randy Dumm, PhD; Sneh Gulati, PhD; Jai Navlakha, PhD; Hugh Willoughby, PhD; Steve Ludwig, FCAS; David Nye, PhD; Tim Lynch, PhD; David Coursey, PhD; Jim O'Brien, PhD; Mark Homan, FCAS; Carol Taylor West, PhD; Shahid Hamid, PhD; Kevin Kloesel, PhD; Dan Powell, FCAS; Naphtali David Rishe, PhD; Peter Ray, PhD; Charles Hughes, PhD.

Total Commission Members – 36

Professional Team roles: Actuary; Statistician; Computer Scientist; Meteorologist; Engineer.

Professional Team members: Marty Simons, ACAS; Mark Johnson, PhD; Paul Fishwick, PhD; Jenni Evans, PhD; Fred Stolaski, PE; Chuck Watson; Mark Brannon, FCAS (backup); Ron Iman, PhD (backup); Dick Nance, PhD (backup); Tom Schroeder, PhD (backup); Masoud Zadeh, PhD, PE (backup); Julie Serakos; David Cox, FCAS; Ben Fitzpatrick, PhD; Peter Ray, PhD; Nur Yazdani, PhD, PE; Steve Lyons, PhD; Nariman Balsara, PE; John Pepper, PE.

Total Professional Team Members – 19

SLIDE 6

Summary of Commission Activities

  • 128 meetings over a 12-year period
  • Involvement of 55 different experts (36 Commission members & 19 Professional Team members)
  • 52 on-site reviews/audits
  • Annual Report of Activities published by November 1
  • Rigorous public disclosure, on-site audit, and evaluation process (12 years of documentation)
  • Reviewed eight (8) different models over 12 years
  • Five (5) models acceptable under the current Standards
  • Total Cost to Date: over $4 million
SLIDE 7

The Professional Team

(Diagram) Expert evaluation of hurricane computer models requires a statistician, an actuary, a meteorologist, a structural engineer, and a computer scientist, covering the statistics, actuarial, meteorology, engineering, and computer programming inputs and the model outputs. 55 on-site reviews to date.

SLIDE 8

The Acceptability Process

(Timeline, June through the following June) Revising & developing Standards (November 1); reviewing models (February 28). Modelers have 4 months to revise models.

SLIDE 9

Principles (Examples*)

  • All models or methods shall be theoretically sound.
  • Models or methods shall not be biased to overstate or understate results.
  • The output of models or methods shall be reasonable, and the modeler shall demonstrate its reasonableness.

*See page 15 of the Report of Activities for the 20 Principles adopted by the Commission.

SLIDE 10

Requirements

Requirements by Standards section (General, Meteorological, Vulnerability, Actuarial, Statistical, Computer):

                             Total   Gen.   Met.   Vuln.   Act.   Stat.   Comp.
Standards                     36      5      6      2      10      6       7
  (subparts)                 (88)    (8)   (12)    (9)    (29)    (7)    (23)
Disclosures                  144     28     33     11      38     27       7
Forms                         26      7      3      3       8      5
On-Site Audit Requirements   142     13     28     10      33     29      29

SLIDE 11

Overview of Hurricane Loss Models

  • Input Data Bases
  • Wind Models
  • Surface Friction and Topography Adjustments
  • Damage Functions
  • Frequency of Occurrence of Events
  • Supporting Decisions. For example: What constitutes an event? Spatial aggregation of numerical results.

SLIDE 12

Traditional Loss Models

(Diagram) Historical storm data, land cover data, topography data, and exposure data feed the wind model, friction model, damage function, and frequency model, which build the storm set that drives the actuarial module and produces loss costs.

Historical data can be used directly, statistically smoothed, or otherwise analyzed to create a data base of storm characteristics used to create the storm set for simulations.
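The flow in this diagram can be sketched as ordinary function composition. The sketch below is a toy stand-in with made-up components and numbers, not any modeler's actual formulas:

```python
# Hypothetical sketch of the traditional loss-model pipeline:
# storm data -> wind/friction/damage/frequency components -> actuarial module.

def build_storm_set(historical_storms, frequency_model):
    """Pair each storm with its modeled annual frequency
    (a toy stand-in for a full stochastic simulation)."""
    return [(storm, frequency_model(storm)) for storm in historical_storms]

def loss_cost(storm_set, wind_model, friction_model, damage_function, exposure):
    """Actuarial module: frequency-weighted expected loss over the storm set."""
    total = 0.0
    for storm, freq in storm_set:
        wind = friction_model(wind_model(storm))        # surface wind speed
        total += freq * damage_function(wind) * exposure
    return total

# Toy components (assumptions for illustration, not any vendor's formulas).
storms = [90, 110, 140]                                  # peak winds in kts
storm_set = build_storm_set(storms, lambda s: 0.01)      # 1-in-100 each
result = loss_cost(storm_set,
                   wind_model=lambda s: s,               # identity wind field
                   friction_model=lambda w: 0.85 * w,    # 15% surface friction
                   damage_function=lambda w: min(1.0, (w / 200) ** 2),
                   exposure=1_000_000)
print(round(result))
```

Swapping any one component for another candidate method is exactly how the component choices discussed later multiply into hundreds of distinct models.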

SLIDE 13

Research/Comparison Approach

  • Nine wind fields
  • Four surface friction methods
  • Nine damage (vulnerability) functions
  • Three frequency methods
  • 9 x 4 x 9 x 3 = 972 models

Other options include changing historical storm data bases, exposures, and other storm assumptions. The result is thousands of possible outcomes.
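The component count multiplies out exactly as stated; a minimal check, using placeholder labels since the slides do not name the individual methods:

```python
from itertools import product

# Placeholder component labels; the actual wind fields, friction methods,
# damage functions, and frequency methods are not named in the slides.
wind_fields = [f"wind_{i}" for i in range(9)]
friction_methods = [f"friction_{i}" for i in range(4)]
damage_functions = [f"damage_{i}" for i in range(9)]
frequency_methods = [f"freq_{i}" for i in range(3)]

# Every model is one choice from each component group.
models = list(product(wind_fields, friction_methods,
                      damage_functions, frequency_methods))
print(len(models))  # 9 * 4 * 9 * 3 = 972
```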

SLIDE 14

Input Data Bases

  • Digital Elevation Model (topography)
    Not all models use topography. Ridge and valley effects are important in upland areas.
  • Land Cover/Land Use
    Friction effects to adjust wind impacts on structures at the surface.
  • Historical Storm Track and Intensity Data
    Required to simulate individual storms for comparison with observed losses. Used as a basis for the determination of frequency of occurrence and other storm characteristics.
  • Exposure Data Set
    Location, characteristics, and value of properties at risk.

SLIDE 15

Range of Results from Public Domain Models

(Chart) 972 models – range of results: maximum, median, and minimum loss costs for each of Florida's 67 counties, from Suwannee through Monroe (legend: Median, Min, Max).

SLIDE 16

Why Are Models Different?

Examined many additional aspects of why models differ besides the obvious one of different equations:

  • Meteorological input variables
  • Historical data
  • Land cover
  • Exposure data bases and aggregation

SLIDE 17

Impact of Meteorological Assumptions

SLIDE 18

The Explosion in Hurricane Data

(Chart: data per day on a landfalling hurricane, in bytes, log scale, 1840–2020. Milestones, in order:)

  • GTS (telegraph), Lloyd's ship reports
  • Airports begin systematic weather reporting
  • Aircraft begin penetrating hurricanes to collect data
  • Coastal radar networks in place; geosynchronous satellites
  • Microwave satellite data (sounders) and improved GOES satellite sensors
  • Active low earth orbiting (POES) sensors and scatterometers, GPS dropsondes

Note – Each vertical division represents 10 times the amount of data.

SLIDE 19

Short Term Trends in Hurricane Winds

(Chart: peak hurricane winds in kts (vmaxkts) for DeFuniak Springs, Walton County, 1850–2010; plotted values range from 4.8 kts to 104.9 kts, with marked values at 80.8 kts and 73.3 kts.)

Probability of Hurricane Force Winds:
  Overall: 1 in 7.8
  El Niño Year: 1 in 6.9
  La Niña Year: 1 in 15.4
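The "1 in N" figures convert directly to annual probabilities, and they compound across seasons; a quick sketch, assuming independent years (a simplifying assumption):

```python
# Annual probability of hurricane-force winds at this location,
# from the "1 in N" figures on the slide.
p_overall = 1 / 7.8
p_el_nino = 1 / 6.9
p_la_nina = 1 / 15.4

# Probability of at least one occurrence over a 10-year span,
# treating years as independent (a simplification).
p_decade = 1 - (1 - p_overall) ** 10

print(f"{p_overall:.3f}")   # ~0.128 per year
print(f"{p_decade:.3f}")    # ~0.746 over 10 years
```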

SLIDE 20

Frequencies and Data Base Comparison

Does one or two years of additional history make a difference? Can climate models be used instead of parametric models?

SLIDE 21

Summary: Why Models Vary

  • Model component selections, especially wind field.
  • Meteorological input variables – very sensitive to assumptions, more sensitive than our ability to measure; can drive wind model selections.
  • Land cover and other support data bases – out-of-date data can make a significant difference.
  • Spatial aggregation and representation – level of aggregation can bias results; ZIP Codes, especially in rural areas, can introduce significant errors.

SLIDE 22

Where do we stand?

  • While we can’t expect individual models to agree, we can understand the variation we should expect from models.
  • With the results of the above studies, especially the results of nearly one thousand public domain hurricane loss models, the Commission has a baseline against which to evaluate individual model submissions.

SLIDE 23

All 5 Models + Public Domain Loss Costs

(Chart) Loss costs by county for all 67 Florida counties: median, min, and max of the 972 public domain models, plus FPM06, AIR06, ARA06, EQE06, and RMS06.

SLIDE 24

Loss Costs

(Maps) Loss cost per $1000 of exposure for wood frame construction, mapped for each model: AIR, ARA, EQE, FPM, RMS.
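Loss cost per $1000 of exposure is simply expected annual loss divided by exposure in thousands. A minimal sketch with hypothetical portfolio numbers:

```python
def loss_cost_per_1000(expected_annual_loss: float, exposure: float) -> float:
    """Expected annual loss per $1000 of insured exposure."""
    return expected_annual_loss / (exposure / 1000.0)

# Hypothetical wood-frame portfolio: $250M exposure, $1.2M expected annual loss.
print(loss_cost_per_1000(1_200_000, 250_000_000))  # 4.8
```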

SLIDE 25

Rank Comparison

(Maps) Rank with respect to the 972 public domain models, mapped for each model: AIR, ARA, EQE, FPM, RMS.

SLIDE 26

Hypothetical Probable Maximum Loss Comparison
FHCF Hypothetical Exposure Data, Form S-2

Probability   Return Time   AIR    ARA    EQE    FPM    RMS
0.4%          250           51.7   57.9   50.2   44.3   51.7
1%            100           36.3   41.4   36.7   35.6   34.3
2%            50            25.1   29.2   25.4   28.8   23.7
AAL                          2.4    2.7    2.4    3.2    2.7

AAL: Average Annual Loss. Return time in years. All numbers for the models in $ millions.

The probable maximum loss (PML) data is calculated on a hypothetical exposure data set and is not an indication of the actual PML for Florida. These results should be used for comparison purposes only.
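The probability and return-time columns are reciprocals of one another, which makes the table easy to sanity-check:

```python
def return_time(annual_exceedance_prob: float) -> float:
    """Return period in years for a given annual exceedance probability."""
    return 1.0 / annual_exceedance_prob

print(return_time(0.004))  # 250.0 years (the 0.4% row)
print(return_time(0.01))   # 100.0 years
print(return_time(0.02))   # 50.0 years
```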

SLIDE 27

Statewide Dynamic Range of Loss Costs

Ratio of top 5 to bottom 5 loss costs

(Chart) Ratio for each model – AIR, ARA, EQE, FPM, RMS – plotted against the maximum, median, and minimum of the 972 public domain models (y-axis roughly 5 to 40).

Ratio = (Sum Top 5 County Loss Costs) / (Sum Bottom 5 County Loss Costs)
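The ratio in the formula above is straightforward to compute; the county loss costs below are hypothetical stand-ins, not the charted data:

```python
def dynamic_range_ratio(county_loss_costs):
    """Ratio of the summed top-5 county loss costs to the summed bottom-5."""
    ranked = sorted(county_loss_costs, reverse=True)
    return sum(ranked[:5]) / sum(ranked[-5:])

# Hypothetical loss costs for a handful of counties (illustration only).
costs = [14.0, 12.5, 11.0, 9.0, 8.5, 3.0, 1.2, 0.9, 0.8, 0.7, 0.5]
print(round(dynamic_range_ratio(costs), 1))  # 13.4
```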

SLIDE 28

Overall Summary

  • Fundamentally, models vary because their developers select different, on the surface equally valid, methods to solve the four basic components (frequency, wind, friction, and vulnerability) of loss modeling.
  • Even if we could decide on a “perfect” solution for the four components, the uncertainty in meteorological parameters and other input data would cause significant uncertainty in loss costs.

SLIDE 29

Primary Findings: General

  • Aside from some anomalies, the output ranges of models submitted to the Commission under the 2006 Standards (found acceptable in 2007) fall within the range one would expect given the universe of possible scientifically valid approaches.
  • Some year-to-year variation is expected from any model, and particularly a young model.

SLIDE 30

Moving the Modeling Process Forward: Areas of Past, Current and Future Investigation

  • Demand Surge
  • Commercial Residential
  • Climate Models
  • Risk Loadings
  • Other Ways to Improve the Process

SLIDE 31

Rank Comparison Table

This table shows the number of counties (out of 67) in each quartile of the 972 Public Domain model outputs, as well as those exceeding the maximum or falling below the minimum.

2005 (ranks determined with the Public Model version 1.5):

Model   below   min-25   25-50   50-75   75-max   above
AIR       2       23       17      10       15
ARA       5        4        6      40       12
EQE      13       26       14      13        1
RMS      12       19       18       8       10
FPM       4       10        7       4       18       24

2006:

Model   below   min-25   25-50   50-75   75-max   above
AIR      16       18        9      20        4
ARA      11        6        8      13       21        8
EQE      13       21       13      19        1
RMS       8       18       23       5       12        1
FPM       2        8        8       7       23       19
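Producing such a table amounts to binning each county's modeled loss cost against the min, quartiles, and max of the 972 public domain results. A sketch with a stand-in distribution (not the actual model outputs):

```python
from bisect import bisect_left
from collections import Counter
from statistics import quantiles

def classify(value, public_results):
    """Bin a loss cost against the min, quartiles, and max of public results."""
    lo, hi = min(public_results), max(public_results)
    if value < lo:
        return "below"
    if value > hi:
        return "above"
    q1, q2, q3 = quantiles(public_results, n=4)
    labels = ["min-25", "25-50", "50-75", "75-max"]
    return labels[bisect_left([q1, q2, q3], value)]

# Stand-in public-domain distribution and submitted county values.
public = list(range(1, 973))           # 972 stand-in results
submitted = [0.5, 100, 400, 800, 1500]
print(Counter(classify(v, public) for v in submitted))
```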

SLIDE 32

Hypothetical PML Comparison
FHCF Hypothetical Exposure Data, Form S-2

(Chart) PML in $ millions at return times of 250, 100, and 50 years for AIR, ARA, EQE, FPM, and RMS (y-axis roughly $10–60 million).

SLIDE 33

Hypothetical PML Comparison
FHCF Hypothetical Exposure Data, Form S-2, Average Annual Loss

(Chart) AAL in $ millions for AIR, ARA, EQE, FPM, and RMS (y-axis roughly $0.5–3.5 million).