How Predictive Modeling Has Revolutionized Insurance

June 18, 2012

The use of predictive modeling has forever changed the way insurance policies are priced. The revolutionary tool allows insurers to design ever-more-sophisticated models that tap ever-more-detailed data sets to refine precisely how much each customer should be charged.

Casualty actuaries got an overview of how far the revolution has come and how it will continue to change insurance pricing at “The Revolution and Evolution of Predictive Modeling,” a session presented at the recent Casualty Actuarial Society Spring Meeting in Phoenix.

Claudine Modlin, a senior consultant at Towers Watson, laid out how far predictive analytics has advanced insurance pricing in the past decade. Steven Armstrong, a fellow of the Casualty Actuarial Society, laid out a variety of ways those same tools and skills could improve insurance operations beyond the pricing function.

At the end of the 20th century, Modlin said, insurers were still bound to mainframe computers and highly aggregated data sets. Rating plans were less sophisticated, and it was easy for a company to understand its competitors’ plans. Rating plans were finalized based on the collective judgment of underwriters and actuaries, with little data-driven guidance on how and where to deviate from the expected costs.

Today, insurers use a variety of predictive analytic tools to hunt through gigabytes of data to find variables – sometimes non-intuitive ones – that hold clues to a customer’s riskiness and purchasing behavior. Generalized linear models (GLMs) have become the global industry standard for pricing segmentation, due in large part to their multivariate framework, the way log-link models mirror the multiplicative structure of rating plans, and the high degree of transparency in the results.

“As an industry, we have really learned a lot,” Modlin said. “We have advanced our toolkit.”

The use of insurance credit scores was one of the great new loss predictors of the last two decades, and the ongoing search for the next great predictor has increasingly become the norm. As insurers follow the information revolution, they are improving the quality and accessibility of their internal data, investigating third-party data sources, and investing in more computing power to harness the information. This has led some companies to investigate thousands of predictors – including what other policies an insured holds, whether they pay their bills on time, and various characteristics of the area in which the risk is located. Interpreting a large list of related variables requires more refined methods.

Modelers employ a variety of techniques to cull the list of potential predictors. The process of variable reduction involves a lot of business judgment but is frequently supplemented with data mining techniques such as principal components analysis or classification and regression trees.
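
As one illustration of that variable-reduction step, principal components analysis can collapse a cluster of correlated predictors into a single composite score. The variables below are synthetic stand-ins:

```python
# Illustrative only: PCA collapsing three correlated, census-style
# predictors into one composite score. Data is synthetic.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
n = 1000
density = rng.normal(size=n)                    # hypothetical area variable
traffic = density + 0.1 * rng.normal(size=n)    # moves with density
claims_rate = density + 0.1 * rng.normal(size=n)
X = np.column_stack([density, traffic, claims_rate])

pca = PCA()
pca.fit(X)
# Nearly all the variance lands on the first component, so one composite
# "territory" score can stand in for three raw variables in the GLM.
explained = pca.explained_variance_ratio_
print(explained)
```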

“Within the GLM exercise, modelers use a blend of statistical diagnostics, practical tests and our business acumen to select predictive factors,” Modlin said.

Companies looking to refine their GLMs further pay significant attention to identifying interaction variables and to mining GLM residuals in order to improve the pricing of certain high-dimensional variables (e.g., territory and vehicle groups).

And in auto insurance, the revolution is moving even further, as insurers start to use telematics – gathering information about a customer’s driving behavior from a device attached to the vehicle.

Information will flow in, virtually moment by moment, Modlin said. “Do you slam on the brakes? Do you peel around corners?”
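
As a toy illustration of what such telematics data might yield, one could flag harsh-braking events in a second-by-second speed trace. The threshold and readings below are invented:

```python
# Illustrative only: counting harsh-braking events in a telematics
# speed trace sampled once per second. Threshold and data are invented.
import numpy as np

speed_kmh = np.array([50, 50, 48, 30, 12, 10, 10, 35, 50])  # 1 s samples
decel = np.diff(speed_kmh) / 3.6          # change in m/s per second (m/s^2)
harsh_brakes = int(np.sum(decel < -4.0))  # braking harder than 4 m/s^2
print(harsh_brakes)
```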

While much of the industry has refined its approach to estimating loss costs, the use of science to understand customer demand lags behind. GLMs are a suitable technique here as well. The challenge is to capture customer attributes along with price-related information (e.g., the quote offered at new business or the price change offered at renewal) that will provide useful insights into customer elasticity.

The next evolutionary stage for pricing sophistication is for companies to learn to integrate their cost estimates with knowledge of customer behavior. This can involve scenario testing possible rate changes and measuring the effect on key performance indicators, taking the effect of customer behavior into account. Scenario testing in its ultimate form involves price optimization techniques that systematically integrate cost and demand in order to indicate an optimal set of prices that meets or exceeds corporate objectives for profitable growth while staying within company constraints.
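
In the same spirit, a toy scenario test might sweep candidate rate changes through an assumed retention curve and pick the change that maximizes expected profit subject to a retention floor. Every number here is invented:

```python
# Toy scenario test: integrate a cost estimate with an assumed demand
# curve, subject to a corporate retention constraint. All numbers invented.
import numpy as np

current_premium = 1000.0
expected_cost = 850.0
candidates = np.arange(-0.05, 0.16, 0.01)   # -5% to +15% rate change

def retention(rate_change):
    # Hypothetical demand curve: retention falls as price rises
    return 1 / (1 + np.exp(-(2.0 - 12.0 * rate_change)))

profit = retention(candidates) * (current_premium * (1 + candidates)
                                  - expected_cost)
feasible = retention(candidates) >= 0.75    # corporate retention floor
best = candidates[feasible][np.argmax(profit[feasible])]
print(best)
```

The constraint is what makes this "optimization within company constraints": without the retention floor, the sweep would simply chase the largest rate increase.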

But use of predictive models doesn’t have to end with ratemaking, said Armstrong in his presentation. The models can help other aspects of the insurance organization. And as they do, actuaries can follow them, helping explain how the models work and what potential they contain.

“You have this pricing tool kit,” he said. “I want you to think beyond pricing” and help solve business problems.

For example, predictive models could likely help underwriters work more efficiently. Right now, underwriting tends to follow rules with limited flexibility. In auto insurance, for example, young drivers receiving good student discounts have to regularly turn in copies of their grades. Predictive modeling could show, perhaps, that some types of students don’t need to send perpetual updates, while others do.

Models could also help underwriters in other lines, Armstrong said – helping determine which workers compensation risks should be tapped for a premium audit.

Predictive modeling could also help marketing by researching what mix of social media grows the customer base or what brand attributes drive new business. The concept isn’t new to marketers, but the actuarial skill set can enhance understanding of the work.

And claims departments swim in a vast pool of data that only awaits discovery, Armstrong said – claims diaries, records on attorney involvement, and information on service providers and adjusters. Predictive models could answer questions such as:

  • If a damaged auto gets to the body shop a day sooner, will it affect claim severity?
  • What sorts of claims are driving costs higher?
  • What sorts of claims should be reported to the special investigations unit for potential fraud?
  • Can one pick out potential fraudsters during the underwriting process?
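
Purely for illustration, a simple classifier could score claims for referral to the special investigations unit. The features, data, and referral threshold below are all invented:

```python
# Illustrative only: scoring claims for SIU referral with a logistic
# classifier. Features, data, and threshold are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
n = 2000
late_report = rng.integers(0, 2, n)   # claim reported long after the loss
attorney = rng.integers(0, 2, n)      # early attorney involvement
# Assumed true fraud propensity rises with both flags
p_fraud = 1 / (1 + np.exp(-(-3.0 + 1.5 * late_report + 1.0 * attorney)))
fraud = rng.binomial(1, p_fraud)

X = np.column_stack([late_report, attorney])
clf = LogisticRegression().fit(X, fraud)
scores = clf.predict_proba(X)[:, 1]
referrals = int(np.sum(scores > 0.25))  # flag highest-risk claims for SIU
print(referrals)
```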

The models could also assist sales departments (What’s the best spot to start a new agency?), human resources (How long is a new employee likely to remain with the company?) or expense management (What underwriting reports are cost-effective?).

The list of areas where actuaries could help insurers quantify and understand their operations seems limitless, Armstrong said.

“Wherever there is data, there is opportunity,” Armstrong said.

The Casualty Actuarial Society has 5,700 members who are experts in property/casualty insurance, reinsurance, finance, risk management, and enterprise risk management.
