The Data Quality Management Challenges of Solvency II
Massimiliano Neri, Associate Director, Moody’s Analytics
Solvency II 2
Agenda
Major barriers to improved risk management…
Source: After the storm: A new era for risk management in financial services, The Economist Intelligence Unit Limited, 2009
Introduction
Technical provisions must be calculated on the basis of high quality data
Agenda
Data Quality Assessment (1/3)
Statistical techniques used to calculate technical provisions (including data employed in setting specific assumptions)
Best Practice: to conduct the Data Quality Assessment (DQA) by comparison with data from other lines of business (LoB) or risk factors
Data Quality Assessment (2/3)
Consideration of relevant risks and trends
The assessment must be conducted by comparison with data from other lines of business or risk factors, supported by data and consistency checks
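The comparison-based assessment above can be sketched in a few lines of Python: each line of business is compared against its peers to flag anomalous data. The field names, the claim-ratio metric and the tolerance are illustrative assumptions, not part of the regulation or any Moody's Analytics product.

```python
# Hypothetical sketch: flag a line of business (LoB) whose data deviates
# strongly from its peer LoBs, as one way to operationalize the
# comparison-based Data Quality Assessment. All names are illustrative.
from statistics import mean

def flag_inconsistent_lob(claim_ratios: dict, tolerance: float = 0.5) -> list:
    """Flag LoBs whose claim ratio deviates from the mean of the
    other LoBs by more than `tolerance` (relative deviation)."""
    flagged = []
    for lob, value in claim_ratios.items():
        peers = [v for k, v in claim_ratios.items() if k != lob]
        peer_mean = mean(peers)
        if abs(value - peer_mean) / peer_mean > tolerance:
            flagged.append(lob)
    return flagged

ratios = {"motor": 0.72, "property": 0.68, "liability": 0.70, "marine": 1.95}
print(flag_inconsistent_lob(ratios))  # ['marine']
```

A flagged LoB is not necessarily wrong; it is a candidate for the consistency checks and expert review described on this slide.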
Data Quality Assessment (3/3)
Agenda
The Data Quality Management Process
Data Quality Management
Data Definition
Data Quality Assessment
Problem Resolution
Data Quality Monitoring
Best Practice: to monitor data quality as frequently as possible
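The four-stage cycle above (definition, assessment, resolution, monitoring) can be sketched as a minimal object, assuming hypothetical rule and record names; this is not an actual Moody's Analytics API.

```python
# Minimal sketch of the Data Quality Management cycle:
# Data Definition -> Data Quality Assessment -> Problem Resolution -> Monitoring.
# All names are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class DataQualityCycle:
    checks: dict = field(default_factory=dict)   # rule name -> rule function
    issues: list = field(default_factory=list)   # outstanding problems

    def define(self, name, rule):
        """Data Definition: register a quality rule."""
        self.checks[name] = rule

    def assess(self, record):
        """Data Quality Assessment: return the names of failed rules."""
        return [name for name, rule in self.checks.items() if not rule(record)]

    def resolve(self, record, failed):
        """Problem Resolution: log failures for correction and follow-up."""
        self.issues.extend((record.get("policy_id"), name) for name in failed)

    def monitor(self):
        """Data Quality Monitoring: report the number of open issues."""
        return len(self.issues)

cycle = DataQualityCycle()
cycle.define("premium_positive", lambda r: r.get("premium", 0) > 0)
cycle.define("has_value_date", lambda r: r.get("value_date") is not None)

record = {"policy_id": "P-001", "premium": -50, "value_date": None}
failed = cycle.assess(record)
cycle.resolve(record, failed)
print(failed, cycle.monitor())  # both rules fail for this record
```

Running `monitor()` on every import, rather than periodically, is one way to follow the best practice of monitoring data quality as frequently as possible.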
Data Identification, Collection and Processing
Requirements:
Best Practice: to accumulate as much historical data as possible
Auditors and the Actuarial Function
Examinations of selected datasets in order to determine and confirm that the data is consistent with its purpose
Identification of Data Deficiencies
Reasons for bad data quality:
a) Singularities in the nature or size of the portfolio
b) Deficiencies in the internal processes of data collection, storage or data quality validation
c) Deficiencies in the exchange of information with business partners in a reliable and standardized way
The reasons for low data quality should be assessed in order to increase both the quantity and the quality of the data
Management of Data Deficiencies
Best Practice: to apply adjustments in a controlled, documented and consistent way
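The "controlled, documented and consistent" best practice above can be sketched as an adjustment function that never changes a value without writing a full audit entry. The record fields, reason text and user id are hypothetical.

```python
# Sketch of a controlled, documented data adjustment: every change records
# who changed what, when, and why, so it can be audited later.
# All field names are illustrative assumptions.
from datetime import datetime, timezone

def apply_adjustment(record, field, new_value, reason, user, audit_log):
    """Adjust one field of a record, appending a complete audit entry."""
    audit_log.append({
        "policy_id": record.get("policy_id"),
        "field": field,
        "old_value": record.get(field),
        "new_value": new_value,
        "reason": reason,
        "user": user,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })
    record[field] = new_value
    return record

log = []
policy = {"policy_id": "P-001", "premium": -50}
apply_adjustment(policy, "premium", 50, "sign error in source system", "analyst1", log)
print(policy["premium"], log[0]["old_value"])  # 50 -50
```

Routing every correction through one such function is what makes the adjustments consistent: the same rule, applied the same way, leaves the same trail.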
Agenda
A Centralized Approach to Data Quality: Pattern 1
Data flow: Data Source Systems → ETL → Results Data / Scenario
Data Quality Assessment: before data import
A Centralized Approach to Data Quality: Pattern 2
Data flow: Data Source Systems → ETL → Results Data / Scenario
Data Quality Assessment: during data import
A Centralized Approach to Data Quality: Pattern 3
Data flow: Data Source Systems → ETL → Results Data / Scenario
Data Quality Assessment: after data import
Best Practice: to import all the data, even low quality data, into the repository; then assess the low quality data and take appropriate decisions
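A minimal sketch of Pattern 3, under assumed field names: every row is imported, and rows failing a quality rule are flagged inside the repository rather than rejected at the gate, so they remain available for assessment.

```python
# Sketch of Pattern 3 (Data Quality Assessment after data import):
# import everything, flag low quality rows instead of dropping them.
# Field names and the quality rule are illustrative assumptions.
def import_all(rows, quality_rule):
    repository = []
    for row in rows:
        row = dict(row)
        row["quality_flag"] = "ok" if quality_rule(row) else "low_quality"
        repository.append(row)   # nothing is dropped
    return repository

rows = [
    {"policy_id": "P-001", "premium": 120.0},
    {"policy_id": "P-002", "premium": None},   # missing premium: low quality
]
repo = import_all(rows, lambda r: r.get("premium") is not None)
low = [r["policy_id"] for r in repo if r["quality_flag"] == "low_quality"]
print(len(repo), low)  # 2 ['P-002']
```

Keeping the flagged rows in the repository is what enables the best practice above: the decision about low quality data is taken after import, with full visibility, not silently at the gate.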
Types of Data Quality Checks
Technical Checks
Example: the ‘book code’ of an insurance policy does not correspond to any entry in the ‘deal book’ table
Functional Checks
Example: the value date of a policy
Business Consistency Checks
Example: the value of the ‘premium periodicity’ must be consistent with the type of policy
General Ledger Reconciliation [1]
Example: in a value field the decimal comma disappeared, leaving all values multiplied by 100
Application of exchange rates to data
[1] The importance of reconciliation with accounting data is recognized in the regulation (CEIOPS CP 43/09, 1.3)
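Two of the check types above can be illustrated directly in code: a technical (referential integrity) check based on the 'book code' example, and a business consistency check based on the 'premium periodicity' example. The table contents, field names and the single-premium rule are hypothetical.

```python
# Illustrative implementations of two check types from this slide.
# Table contents and field names are assumptions, not a real schema.
deal_book = {"BK01", "BK02", "BK03"}   # reference table of valid book codes

def technical_check(policy):
    """Technical check: the policy's 'book code' must exist in the deal book."""
    return policy.get("book_code") in deal_book

def business_consistency_check(policy):
    """Business consistency check: premium periodicity must be consistent
    with the policy type (here: a single-premium policy has no periodicity)."""
    if policy.get("policy_type") == "single_premium":
        return policy.get("premium_periodicity") is None
    return True

p = {"policy_id": "P-001", "book_code": "BK99",
     "policy_type": "single_premium", "premium_periodicity": "monthly"}
print(technical_check(p), business_consistency_check(p))  # False False
```

The distinction matters in practice: technical checks can run mechanically against reference tables, while business consistency checks encode domain rules that usually require actuarial sign-off.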
The Data Quality Assessment Process and the User
Best Practice: to express data checks in natural language
The Data Quality Assessment Process and the User
Best Practice: to allow the user to assess the quality of the data through a user-friendly environment
The Data Quality Assessment Process and the User
Best Practice: to analyze inconsistencies detected by data quality checks at different levels of granularity
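Analysing inconsistencies at different levels of granularity, as the best practice above suggests, amounts to rolling the same check failures up from individual policies to lines of business and to the whole portfolio. A minimal sketch, with illustrative identifiers:

```python
# Sketch: the same data quality failures viewed at three granularities:
# per policy, per line of business, and portfolio-wide.
# Policy ids, LoB names and check names are illustrative assumptions.
from collections import Counter

failures = [  # (policy_id, line_of_business, failed_check)
    ("P-001", "motor", "missing_value_date"),
    ("P-002", "motor", "negative_premium"),
    ("P-003", "property", "missing_value_date"),
]

by_policy = Counter(f[0] for f in failures)       # finest granularity
by_lob = Counter(f[1] for f in failures)          # intermediate granularity
portfolio_total = len(failures)                   # coarsest granularity

print(dict(by_lob), portfolio_total)  # {'motor': 2, 'property': 1} 3
```

The coarse view tells management whether there is a problem; the fine view tells the analyst which records to fix.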
Agenda
Conclusions
1. The average insurance company is unprepared for the data quality requirements of the new regulation. This is due to three factors:
a) The actuarial function is seldom used to apply its professional judgment to the available data for the calculation of best estimates
b) Some insurance companies have been accumulating historical data for many decades; however, this data has usually been collected for daily operations rather than for the calculation of technical provisions
c) Insurance IT legacy systems are often outdated and organized in multiple silos across different departments; this causes duplication of data and inconsistency of values
2. Data Quality Assessment is the core requirement
3. Moody’s Analytics best practices improve data quality using an enterprise risk management approach
Thank You
Visit our stand in the Exhibition Area
Visit us at moodysanalytics.com