Following a major announcement that A.M. Best will be probing deeper into companies’ data quality to assess their risk management processes and the credibility of modeled losses, Newark, Calif.-based Risk Management Solutions (RMS) revealed strategies to help clients prepare for the increased scrutiny.
Based on its work with portfolios totaling some $25 trillion in insured value for natural catastrophes, RMS said it has shown that it is vital to address both the completeness and accuracy of exposure data. By making data quality improvements to portfolios, RMS has seen modeled losses change by as much as 30 percent, with more dramatic changes for individual accounts, which has important implications for risk selection, pricing, and overall risk management.
The analysis of factors significantly impacting modeled losses is based on the RMS ExposureRefine service, which provides a way to measure, monitor, and benchmark data quality. Twenty-one earthquake portfolios totaling $612 billion in exposed limit (or more than $8 trillion in total insured value), and 18 hurricane portfolios amounting to more than $1 trillion in exposed limit (or nearly $17 trillion in insured value) were assessed.
Among the strategies RMS said can be employed, both to improve data quality and to prepare companies for the additional questions they will have to answer in the A.M. Best 2008 Supplemental Rating Questionnaire from April this year, are:
- Using metrics for assessing data completeness that uncover data quality issues where they matter most. For example, a location geocoded only at the ZIP code level in a coastal area exposed to hurricanes is of greater data quality concern than one that is inland. More basic approaches can create a false sense of security.
- Rectifying systematic issues related to missing, inaccurate, and bulk coded data that lead to significant inaccuracies in model results. Making these changes often provides the best cost-benefit for data improvement efforts.
- Incorporating information that is available from the underwriting process but often not included in catastrophe modeling. Missing data can create a large range in loss results.
- Checking for process errors where multiple systems are used for capturing data, underwriting, and/or modeling; such errors can be easily fixed once identified.
- Fully auditing results where coding assumptions or bulk coding are used; while these practices can be valid, errors can lead to significant inaccuracies and bias model outputs.
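The first strategy above, weighting data-completeness checks by where they matter most, can be illustrated with a minimal sketch. The resolution scores, hazard weights, and function names below are illustrative assumptions for this article, not RMS's actual ExposureRefine metrics:

```python
# Hypothetical sketch of a hazard-weighted data-completeness score.
# All scores, weights, and names are illustrative assumptions,
# not RMS's actual methodology.

# Geocoding resolution quality: finer resolution scores higher.
RESOLUTION_SCORE = {"coordinate": 1.0, "street": 0.8, "zip": 0.4, "county": 0.1}

# Hazard weights: coarse geocoding is costlier in high-hazard zones.
HAZARD_WEIGHT = {"coastal": 3.0, "inland": 1.0}

def location_concern(resolution, hazard_zone):
    """Return a concern score: higher means greater data-quality risk."""
    quality = RESOLUTION_SCORE[resolution]
    return HAZARD_WEIGHT[hazard_zone] * (1.0 - quality)

def portfolio_concern(locations):
    """Average concern over a portfolio of (resolution, hazard_zone) pairs."""
    return sum(location_concern(r, h) for r, h in locations) / len(locations)

# A ZIP-geocoded coastal location (3.0 * 0.6 = 1.8) is flagged as a
# bigger concern than the same ZIP-level geocoding inland (1.0 * 0.6 = 0.6).
coastal_zip = location_concern("zip", "coastal")
inland_zip = location_concern("zip", "inland")
```

A flat completeness metric (counting ZIP-geocoded records regardless of hazard) would score both locations identically, which is the false sense of security the article warns about.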
“Companies are facing pressure from multiple angles, with balance sheets being hit and increased scrutiny over risk management processes from analysts. This means that robust data quality can no longer be viewed as a ‘nice to have’; it’s a basic necessity to remain competitive,” said Ajay Lavakare, senior vice president and managing director of data solutions at RMS.
“With the right analytics and databases, organizations can tighten up their processes and ensure their data is both complete and accurate and, crucially, focus on improving their data quality where it matters most.”
The new questions in the A.M. Best 2008 P&C Supplemental Rating Questionnaire (SRQ) due on April 1, 2009 can be found at http://www.ambest.com/ratings/methodology/srq2008.PDF.
For more information, visit www.rms.com.