
Average Premium Model Actuarial Research Conference - Brant Wipperman - PowerPoint PPT Presentation



  1. Average Premium Model Actuarial Research Conference Brant Wipperman, FCAS FCIA MSc (SFU) July 27, 2010

  2. Motivation • Revenue Requirements • Monitoring our book of business

  3. Basic On-Level Average Premium [Chart: on-level average premium ($) by year, 2002-2006]

  4. Personal TPL On-Level Avg. Premium [Chart: on-level average premium ($) by year, 2005-2009]

  5. Overview • Exposure Forecasts • PCA • Average Premium Forecasts

  6. Exposure Model • Historical exposure data – Split into Personal and Commercial – Further split into vehicle use, location, and bonus-malus groups • An econometric regression model is fit to each group – Demographic – Economy
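The per-group regressions described on this slide could look something like the following minimal sketch; the column names, the two drivers, and the plain least-squares fit are illustrative assumptions, not the presenter's actual model.

```python
# Illustrative sketch only: fit a separate linear regression of exposure on a
# demographic driver and an economic driver for each vehicle-use / location /
# bonus-malus group. Column names and drivers are hypothetical.
import numpy as np
import pandas as pd

def fit_group_models(df: pd.DataFrame) -> dict:
    """Return {group: OLS coefficients} for exposure ~ 1 + population + gdp_growth."""
    models = {}
    for group, g in df.groupby("group"):
        X = np.column_stack([np.ones(len(g)), g["population"], g["gdp_growth"]])
        y = g["exposure"].to_numpy()
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)  # ordinary least squares fit
        models[group] = beta
    return models
```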

  7. Vehicle Use Groups • Personal – Pleasure – Commute – Business – Senior – Motorcycle – Motor home – Collector

  8. Location Groups • Lower Mainland • Ridge Meadows • Fraser Valley • Squamish/Whistler • Pemberton/Hope • Okanagan • Kootenays • Cariboo • Prince George • Peace River • North Coast • South Island • Mid Island • North Island

  9. Bonus-Malus Groups • Claim Rated Scale – Roadstar (43% discount) – 25% to 40% discount – 5% to 20% discount – Base or surcharge

  10. Overview [Diagram: Exposure Model: Historical Exposure Data and External Factor Forecasts → Econometric Models → Exposure Forecasts; Exposure Forecasts → PCA → Average Premium Forecasts]

  11. Historical Exposure Data • Too many groups for the average premium model • Need a dimension reduction technique • Want to keep all of the groups • Linear dependencies exist

  12. What is PCA? • It transforms a number of correlated variables into a smaller number of uncorrelated variables • Uses linear algebra

  13. PCA Notation • A = (1/n)·Zᵀ·Z • A·V = λ·V (L = diag of eigenvalues) • B = V·L^(−1/2) • P = Z·B • S = V·L^(1/2) = B·L • C = T·Tᵀ
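As a concrete reading of the notation above, here is a minimal numpy sketch; the data are random placeholders, so this illustrates the algebra rather than reproducing the presenter's code.

```python
# Z: n x p matrix of standardized series (mean 0, variance 1), so A is the
# correlation matrix. Random placeholder data stand in for the exposure groups.
import numpy as np

rng = np.random.default_rng(0)
Z = rng.standard_normal((120, 30))
Z = (Z - Z.mean(axis=0)) / Z.std(axis=0)

n = Z.shape[0]
A = (Z.T @ Z) / n                              # A = (1/n) Z'Z
eigvals, V = np.linalg.eigh(A)                 # A V = V L, L = diag(eigenvalues)
order = np.argsort(eigvals)[::-1]              # order PCs by variance explained
eigvals, V = eigvals[order], V[:, order]

B = V @ np.diag(1.0 / np.sqrt(eigvals))        # B = V L^(-1/2)
P = Z @ B                                      # P = Z B, the PC scores
S = V @ np.diag(np.sqrt(eigvals))              # S = V L^(1/2) = B L, the loadings

# The scores are uncorrelated: (1/n) P'P is (numerically) the identity matrix.
print(np.allclose((P.T @ P) / n, np.eye(Z.shape[1])))
```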

  14. Eigen Decomposition • Linear algebra problem • Done on correlation matrix of explanatory variables • Eigenvectors are new explanatory variables (i.e. principal components) • Each associated eigenvalue represents variability of eigenvector (or PC)

  15. PCA Resolves the Issues • Number of dimensions reduced • All groups ‘retained’ • Linear dependencies eliminated

  16. PCA Process • Step 1: Create new set of explanatory variables • Step 2: Determine how many new explanatory variables to retain

  17. How many components? [Scree plot: % of variance explained by each principal component, components 1-32]

  18. How many components? [Chart: cumulative % of variance explained by principal components 1-32, with markers at 88%, 94%, and 100%]
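One way to implement the selection step these two charts illustrate is to keep the fewest components whose cumulative share of variance crosses a chosen threshold. The sketch below is an assumption about the mechanics (the 90% threshold is illustrative), not the cut-off actually used.

```python
import numpy as np

def choose_num_components(eigenvalues: np.ndarray, threshold: float = 0.90) -> int:
    """Smallest number of PCs whose cumulative share of variance reaches the threshold.

    eigenvalues: variances of the PCs (eigenvalues of the correlation matrix).
    """
    share = np.sort(eigenvalues)[::-1] / eigenvalues.sum()  # scree-plot proportions
    cumulative = np.cumsum(share)                           # cumulative proportions
    k = int(np.searchsorted(cumulative, threshold)) + 1
    return min(k, len(eigenvalues))
```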

  19. Overview [Diagram: Exposure Model: Historical Exposure Data and External Factor Forecasts → Econometric Models → Exposure Forecasts; Exposure Forecasts → PCA → Chosen PCs]

  20. [Diagram: Historical Exposure Data (30 correlated variables) → ortho-normal transformation → Principal Components (30 uncorrelated variables) → scree plot / proportion of variance / other criteria → Chosen PCs (6 uncorrelated variables)]
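For comparison, the same 30-variables-in, 6-PCs-out reduction can be expressed in a few lines with scikit-learn; this is a sketch with placeholder data, not the presenter's implementation.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

exposures = np.random.default_rng(1).standard_normal((120, 30))  # placeholder exposure data
Z = StandardScaler().fit_transform(exposures)        # standardize so PCA works on correlations
pca = PCA(n_components=6)
chosen_pcs = pca.fit_transform(Z)                    # 120 x 6 matrix of uncorrelated PC scores
print(pca.explained_variance_ratio_.sum())           # share of the original variance retained
```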

  21. Overview [Diagram: Exposure Model (Historical Exposure Data and External Factor Forecasts → Econometric Models → Exposure Forecasts) → PCA → Chosen PCs; Average Premium Model (Chosen PCs and Historical Average Premium → Linear Regression Models; applied to Chosen PC Forecasts → Average Premium Forecasts)]
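The last stage of this diagram (regressing average premium on the chosen PCs, then applying the fit to forecast PC scores) could be sketched roughly as follows; the function and its inputs are hypothetical stand-ins for the actual regression models.

```python
import numpy as np

def fit_and_forecast(pc_hist: np.ndarray, avg_prem_hist: np.ndarray,
                     pc_forecast: np.ndarray) -> np.ndarray:
    """pc_hist: n x k historical PC scores; avg_prem_hist: n historical average premiums;
    pc_forecast: m x k forecast PC scores. Returns m forecast average premiums."""
    X = np.column_stack([np.ones(len(pc_hist)), pc_hist])      # add an intercept column
    beta, *_ = np.linalg.lstsq(X, avg_prem_hist, rcond=None)   # ordinary least squares
    X_new = np.column_stack([np.ones(len(pc_forecast)), pc_forecast])
    return X_new @ beta                                        # apply fit to forecast scores
```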

  22. Modeled vs. Actual – Personal TPL [Chart: modeled vs. actual average premium ($) by month, Jan-04 to Jan-11]

  23. Modeled vs. Actual – Personal TPL [Chart: actual average premium vs. models using 4, 6, and 8 PCs, by year, 2004-2011]

  24. Recap - Advantages • PCs uncorrelated • PCs organized to reduce dimensionality • Keeps most of original information • Determine contribution of each variable

  25. Recap - Disadvantages • The PCA process may be unfamiliar • PCs can be hard to interpret • PC weights may change when the model is updated

  26. Is PCA Right For You? • Does multi-collinearity roll off your tongue too easily? • Are you confident in the set of explanatory variables? • Do you want to reduce dimensionality without throwing away information? • Have you been modeling for more than 4 consecutive hours?

  27. For More Information • CAS Discussion Paper – PCA and Partial Least Squares: Two Dimension Reduction Techniques for Regression • http://www.casact.org/pubs/dpp/dpp08/08dpp76.pdf
