EXAM 5 Concepts
Describe diagnostics used to test GLMs
- Standard errors (narrow SEs suggest the variable is statistically significant)
- Deviance tests (chi-squared tests, F-tests; used to compare models with different variables); see the sketch below
- Consistency over time (estimating parameters on data sets from consecutive time periods to see if they are stable)
- Validation against a holdout sample
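A minimal sketch of the deviance-test diagnostic, assuming Python with statsmodels and scipy; the data, column names, and Poisson frequency setup are all hypothetical:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf
from scipy import stats

# Hypothetical policy-level data: Poisson claim counts with an exposure
# offset, an age group, and a territory code.
rng = np.random.default_rng(0)
n = 5_000
df = pd.DataFrame({
    "age": rng.choice(["young", "old"], n),
    "territory": rng.choice(["A", "B", "C"], n),
    "exposure": rng.uniform(0.5, 1.0, n),
})
lam = np.where(df["age"] == "young", 0.12, 0.06) * df["exposure"]
df["claim_count"] = rng.poisson(lam)

offset = np.log(df["exposure"])
reduced = smf.glm("claim_count ~ age", data=df,
                  family=sm.families.Poisson(), offset=offset).fit()
full = smf.glm("claim_count ~ age + territory", data=df,
               family=sm.families.Poisson(), offset=offset).fit()

# For nested models, the drop in deviance is approximately chi-squared
# with degrees of freedom equal to the number of added parameters.
dev_drop = reduced.deviance - full.deviance
dof = int(reduced.df_resid - full.df_resid)
print(f"deviance drop {dev_drop:.2f} on {dof} df, "
      f"p = {stats.chi2.sf(dev_drop, dof):.3f}")
```

A large deviance drop (small p-value) suggests the added variable is worth keeping; here `territory` has no real effect, so the test should not reject.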
Describe approaches that can be used to select a tail factor
- A special study that contains more years of data
- An industry benchmark tail factor
- Fitting a curve to the LDFs and extrapolating the tail factor (see the sketch below)
- Judgment
- Reported-to-paid ratios at the latest paid development period
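A minimal sketch of the curve-fitting approach, assuming hypothetical age-to-age factors and an inverse-power decay form; the choice of curve and the truncation point are judgment calls, not prescribed:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical age-to-age reported LDFs at maturities 12-84 months.
ages = np.array([12, 24, 36, 48, 60, 72, 84], dtype=float)
ldfs = np.array([1.800, 1.350, 1.180, 1.100, 1.060, 1.040, 1.025])

def inverse_power(t, a, b):
    # Factor decays toward 1.0 as maturity t grows.
    return 1.0 + a * t ** (-b)

(a, b), _ = curve_fit(inverse_power, ages, ldfs, p0=(10.0, 1.0))

# Extrapolate annual factors beyond the last observed age and take the
# product as the tail factor (truncated at 30 further years).
future_ages = np.arange(96, 96 + 12 * 30, 12)
tail = np.prod(inverse_power(future_ages, a, b))
print(f"a = {a:.2f}, b = {b:.2f}, tail factor ~ {tail:.4f}")
```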
Identify issues a rate capping rule may cause for an insurer
- Capping rate changes for some insureds may cause the total rate change to differ from the targeted rate change, leaving rates inadequate (or excessive); see the sketch below.
- The capping rule must be programmed into the insurer's computer systems, which can get quite complicated. For example, while the insurer may want to cap the rate change impact, it may not want to limit the premium impact of other renewal changes such as coverage changes or claim surcharges.
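A small sketch of the first issue, with hypothetical premiums and an assumed +/-15% cap, showing the gap between the targeted and achieved overall change:

```python
import numpy as np

rng = np.random.default_rng(1)
current = rng.uniform(500, 2_000, 1_000)              # current premiums
indicated = current * rng.normal(1.10, 0.20, 1_000)   # uncapped renewal premiums

# Cap each insured's renewal change at +/-15% of current premium.
cap = 0.15
capped = np.clip(indicated, current * (1 - cap), current * (1 + cap))

target = indicated.sum() / current.sum() - 1
achieved = capped.sum() / current.sum() - 1
print(f"target {target:+.1%}, achieved under cap {achieved:+.1%}")
```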
Common diagnostics for GLMs.
- Chi-squared tests, F-tests, and other deviance tests
- Confidence intervals (narrower when there is more data)
- Running the model on separate consecutive time periods (to see if results are consistent over time)
- Using a holdout sample to see if the model is over- or under-fitting the original dataset
- Judgmentally deciding whether results are reasonable (flagging counter-intuitive results)
Identify two situations when the reported BF and development methods result in the same estimate of ultimate claims.
- In a steady-state environment where the expected loss ratio for the BF method is based on the historically stable loss ratio (the sketch below verifies the equality)
- When there has been a change in the rate of settlement of claims (and all else has been steady): both methods rely on reported data, which is unaffected by payment speed, so both produce the same result
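A minimal numeric check of the first situation (all figures assumed): when the BF expected claims reproduce the development projection, the two estimates coincide.

```python
reported = 600.0
cdf = 1.50
# Steady state: ELR times premium reproduces the development estimate.
expected = reported * cdf

development = reported * cdf                 # 600 * 1.5 = 900.0
bf = reported + (1 - 1 / cdf) * expected     # 600 + (1/3) * 900 = 900.0
print(development, bf)
```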
Discuss two reasons why it would be acceptable to maintain an imbalance in the fundamental insurance equation at the individual or segment level.
- Regulatory constraints (limits on rate changes or on rate differentials across segments)
- If the insurer uses an asset share pricing approach, which takes a longer-term view of profitability (the initial costs of writing the policy could be outweighed by future profits on the policy)
Describe reasons why a company might propose a rate increase substantially lower than indicated.
- Regulatory reasons: restrictions that don't allow companies to take more than an X% change, or requirements to notify policyholders of a rate increase of X% or more
- Operational reasons: costly to implement (particularly when the rating algorithm changes)
- Customer retention
- Competitive reasons: targeting new business
- Longer-term pricing reasons: lifetime value of the customer; asset share pricing (long-term profitability)
Name considerations when conducting a rate indication
- Adjust historical premium for one-time changes, such as rate changes
- Develop losses and ALAE to their ultimate levels
- Trend losses and ALAE to future levels (adjusts for changes in mix of business and for trends in frequency/severity)
- Trend premium to future levels
- Include a loading for ULAE
- Separate fixed and variable expenses
- Remove catastrophe/shock losses from the data and replace them with a long-term average
- Adjust losses for benefit level changes
The sketch below combines these inputs into the loss ratio indication.
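A minimal sketch of that indication formula, with all inputs hypothetical:

```python
def indicated_rate_change(loss_lae_ratio, fixed_ratio, variable_ratio, profit):
    """Loss ratio indication: (projected loss & LAE ratio + fixed expense
    ratio) / (1 - variable expense ratio - profit provision) - 1."""
    return (loss_lae_ratio + fixed_ratio) / (1 - variable_ratio - profit) - 1

# Hypothetical: 68% projected loss & LAE ratio, 8% fixed, 22% variable, 5% profit.
print(f"{indicated_rate_change(0.68, 0.08, 0.22, 0.05):+.1%}")  # about +4.1%
```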
Discuss actions a company could take to offset the pricing shortfall of implementing a rate change substantially less than indicated.
- Expense reductions (reduce advertising, lay off employees)
- Loss cost reductions (incentives to educate drivers about safety features, reduce coverage without reducing rates)
- Increase investment income / adopt a more aggressive investment strategy
- Reduce the UW profit target (lower profit provision)
- Legal action to challenge the regulations
- Shift to more profitable business (focus marketing on groups priced appropriately, change UW guidelines to write more profitable business)
Discuss when the pure premium method is preferable to the loss ratio method.
- For a new line of business, the PP method is preferable since there is no existing rate (see the sketch below).
- When new rating variables are introduced that are not available in the historical dataset, it is impossible to on-level the premium as the loss ratio method requires.
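A minimal sketch of the pure premium indication, which produces a rate directly with no existing rate or on-level premium required (all inputs hypothetical):

```python
def indicated_rate(pure_premium, fixed_per_exposure, variable_ratio, profit):
    """Pure premium indication: (PP + fixed expense per exposure) /
    (1 - variable expense ratio - profit provision)."""
    return (pure_premium + fixed_per_exposure) / (1 - variable_ratio - profit)

# Hypothetical: $250 pure premium, $30 fixed expense per exposure.
print(round(indicated_rate(250.0, 30.0, 0.22, 0.05), 2))  # 383.56
```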
Identify possible ITV initiatives
- Add a coinsurance clause so loss payments are reduced for insureds with partial coverage (the lower payments would then match the lower premium from the lower coverage level purchased); the sketch below shows the apportionment
- Vary rates by ITV level (so premiums can match expected losses)
- Order property inspections to get more accurate inputs for replacement cost estimation
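A minimal sketch of the coinsurance apportionment (hypothetical figures): the payment is scaled by the ratio of the face carried to the required amount, capped at both the loss and the face.

```python
def coinsurance_payment(loss, face, value, coinsurance_pct):
    # Required amount of insurance is c * V; underinsureds share the loss.
    required = coinsurance_pct * value
    return min(loss * face / required, loss, face)

# Home worth 400k insured for 240k under an 80% coinsurance clause:
# required = 320k, so a 100k loss pays 100k * (240/320) = 75k.
print(coinsurance_payment(100_000, 240_000, 400_000, 0.80))
```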
Identify the types of insurance environments which may discourage use of multivariate methods.
- If regulators prohibit the use of multivariate methods
- If detailed data or sufficient computing power is not available
- If the volume of data is very low, so the results aren't likely to be credible
- If the chance of adverse selection is very low, so there is less incentive to adopt more sophisticated multivariate methods
Desirable qualities of a complement of credibility
- Independent of the base statistic - Unbiased - Accurate - Available - Easy to compute - Logical relationship to the base statistic
Explain why small risks generally show higher loss ratios than larger risks.
- Small companies usually have less sophisticated safety programs
- Small companies usually don't have return-to-work programs for injured workers
- Small companies are not impacted by (or do not qualify for) experience rating, so there is less incentive to prevent or mitigate injuries
Discuss why GLM analysis is typically performed on loss cost data instead of loss ratios.
- Modeling loss costs avoids having to on-level premiums at a granular level (which loss ratio models require)
- Actuaries have a priori expectations of frequency and severity patterns, but not of loss ratio patterns
- Loss ratio models become obsolete when rates are changed
- There is no standard distribution for modeling loss ratios
Discuss potential issues when using historical data from policies with deductibles or limits to price deductible factors. What are potential solutions?
- These policies may only have data for the loss amount in excess of the deductible --> use a loss ratio approach, a GLM, an LER approach with data from deductibles less than or equal to the deductible you're pricing (see the sketch below), or fit a theoretical dist'n
- Loss data may be censored by historical policy limits --> loss ratio approach, GLM, or theoretical dist'n
- Higher limits will experience higher severity trends and greater development --> trend and develop losses before calculating deductible factors
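A minimal LER sketch on hypothetical ground-up losses (exactly the data the issues above can destroy); the deductible credit factor is one minus the loss elimination ratio:

```python
import numpy as np

# Hypothetical ground-up loss amounts.
losses = np.array([200, 450, 900, 1_500, 5_000, 12_000, 40_000], dtype=float)

def loss_elimination_ratio(losses, deductible):
    # Share of ground-up losses eliminated below the deductible.
    return np.minimum(losses, deductible).sum() / losses.sum()

for d in (250, 500, 1_000):
    ler = loss_elimination_ratio(losses, d)
    print(f"deductible {d:>5}: LER = {ler:.3f}, credit factor = {1 - ler:.3f}")
```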
If you have two estimates of ultimate ALAE from the multiplicative and additive approaches, what assumption considerations apply when making an ultimate ALAE selection? For this example, suppose the most recent ratio of ALAE to claims at 12 months is higher than historical levels.
1) Assume the higher ALAE for the most recent AY is due to a speedup in ALAE payments, so expect the ultimate ALAE-to-claims ratio to be in line with historical levels
2) Assume a large paid ALAE claim, so expect future development to be more in line with historical levels; the additive approach won't react to this large claim (it adds a fixed increment rather than scaling the elevation up)
3) Assume a change in ALAE strategy, such as the insurer taking a more aggressive stance on defending claims, so expect the ultimate ALAE-to-claims ratio to be higher; select the multiplicative approach
The sketch below contrasts the two projections.
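A small numeric contrast under assumed figures, showing why the choice of approach embeds an assumption about the elevated 12-month ratio:

```python
historical_ratio_12 = 0.020   # typical paid ALAE / paid claims at 12 months
latest_ratio_12 = 0.030       # elevated ratio for the most recent AY
historical_ratio_ult = 0.050  # historical ultimate ratio

mult_factor = historical_ratio_ult / historical_ratio_12    # 2.5x development
add_increment = historical_ratio_ult - historical_ratio_12  # +0.030 development

mult_ult = latest_ratio_12 * mult_factor    # 0.075: scales the elevation up
add_ult = latest_ratio_12 + add_increment   # 0.060: adds a fixed increment
print(f"multiplicative {mult_ult:.3f} vs additive {add_ult:.3f}")
```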
Explain some reasons that there might be a change in the mix of business in a reserve analysis
1) Change in portfolio volumes: shifts the average accident date and can distort development factors; substitute accident quarter data
2) Change in policy conditions (deductibles, policy limits): can change the size-of-loss dist'n and therefore development; substitute policy year data
3) Change in types of business written (classes, territories): different development patterns or loss ratios; split the data into more homogeneous groups
Explain reasons why there might be a change in settlement patterns in a reserve analysis
1) Change in settlement procedures (change in case o/s strength or closure rates): use Berquist-Sherman (BS) adjustments
2) Jury awards/legislation/regulation: can change the type of loss, size of loss, and/or reporting patterns; substitute report year data
3) Large claims: can distort reporting and/or payment patterns; exclude these from the experience and evaluate them separately
Complements in Excess Ratemaking
1) Increased limits analysis: may be inaccurate due to low volume of data; potential bias (see the sketch below)
2) Lower limits analysis: more biased, but more accurate because of lower variance
3) Limits analysis: inaccurate and biased; assumes the ELR doesn't vary by limit; used by reinsurers without ground-up loss data
4) Fitted curves: more accurate, less biased, logical relationship, but less independent, not easy to compute, and the underlying data might not be available to fit the curve
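A hedged sketch of the increased limits analysis complement (item 1), using an illustrative ILF table: losses capped at the attachment point are scaled by the layer's ILF relativity.

```python
# Illustrative increased limit factors (not from any published table).
ilf = {500_000: 1.00, 1_000_000: 1.35, 2_000_000: 1.60}

def excess_layer_complement(losses_capped_at_attach, attach, top):
    # Expected losses in the layer (attach, top], per the ILF relativity.
    return losses_capped_at_attach * (ilf[top] - ilf[attach]) / ilf[attach]

# 2.0M of losses capped at 500k implies 700k expected in 500k xs 500k.
print(excess_layer_complement(2_000_000, 500_000, 1_000_000))
```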
Describe two challenges in territorial ratemaking
1) Territory tends to be highly correlated with other rating variables; use multivariate methods
2) Territories are often such small areas (e.g., zip codes) that the data in each has very limited credibility; use clustering and spatial smoothing
Claims-Made Ratemaking Principles
1. A claims-made policy should always cost less than an occurrence policy, as long as claim costs are increasing.
2. If there is a sudden, unpredictable change in the underlying trends, the CM policy will be closer to the correct price than an occurrence policy priced on the prior trend.
3. If there is a sudden, unpredictable shift in reporting patterns, the cost of a mature CM policy will be affected relatively little, if at all, compared to an occurrence policy.
4. CM policies incur no liability for pure IBNR, so the risk of reserve inadequacy is greatly reduced.
5. Investment income earned on CM policies is substantially less than under occurrence policies.
Two reasons an actuary might want to split expenses into fixed and variable components
1. Ensure high-premium risks are not overcharged and low-premium risks are not undercharged when some expenses are truly fixed (the sketch below shows the distortion).
2. Fixed expenses may trend differently than premium, so separating them out allows the appropriate trend to be applied.
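A small numeric check of reason 1 (assumed figures): loading a truly fixed $50 per-policy expense as a percent of premium shifts cost from low-premium to high-premium risks.

```python
fixed_per_policy = 50.0
premiums = [250.0, 1_000.0]                    # a low- and a high-premium risk
avg_premium = sum(premiums) / len(premiums)
fixed_as_pct = fixed_per_policy / avg_premium  # fixed cost loaded as 8% of premium

for p in premiums:
    print(f"premium {p:>6.0f}: charged {p * fixed_as_pct:5.1f} "
          f"vs true fixed cost {fixed_per_policy:.1f}")
# The 250 policy pays 20 (undercharged); the 1000 policy pays 80 (overcharged).
```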
Complements in First Dollar Ratemaking
1. Loss costs of a larger group that includes the group being rated (usually biased, which is the reason you're pricing separately; logical relationship if chosen reasonably)
2. Loss costs of a larger related group (same evaluation as 1)
3. Rate change from the larger group applied to present rates (an adjusted version of 2 that reduces the bias)
4. Harwayne's method (harder to explain the logical relationship)
5. Trended present rates (accuracy depends on the stability of the indications; see the sketch below)
6. Competitors' rates (independent, easy to compute, logical relationship)
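A hedged sketch of the trended present rates complement (item 5), assuming its standard form: present rates carried forward by net trend and corrected for the portion of the last indication that was not implemented (figures hypothetical).

```python
def trended_present_rates(present_rate, net_trend_factor,
                          prior_indicated_factor, prior_implemented_factor):
    # Present rate x net trend x (prior indicated / prior implemented).
    return (present_rate * net_trend_factor
            * prior_indicated_factor / prior_implemented_factor)

# Last review indicated +10% but only +6% was taken; 5% net trend since.
print(round(trended_present_rates(100.0, 1.05, 1.10, 1.06), 2))  # 108.96
```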
General considerations for determining an expense provision
1. On-level the premium.
2. Premium and expenses could be trending at different rates (if so, trend both).
3. Are some expenses fixed?
4. When are expenses incurred?
5. Are some expenses growing at a different rate than premium? A historical average may not appropriately reflect future expense costs.
Insurer options to manage catastrophe exposure other than changing rates
1. Restrict UW in high risk areas. 2. Require higher deductibles in these areas. 3. Purchase reinsurance
Fully justify the use of lifetime value analysis in a rate indication using the SOP regarding P&C Ratemaking
A rate should provide for the expected value of all future costs associated with an individual risk transfer. Lifetime value analysis considers all costs, calculates the present value of those expected future costs, and examines a specific sample policy in a class to identify those costs for an individual risk.
Explain difference between expense constant and loss constant.
An expense constant accounts for fixed expenses that do not vary by policy size. They are important for small policies because the premium may be so small that it may not cover the expenses of writing the policy. A loss constant is added to the premium for small (or all) policies to account for the fact that small risks have worse expected loss experience than large risks.
Considerations for selecting company state trends when given state and countrywide (CW) company data as well as industry data
- Benefits of state data: different types of insureds between states and insurers, differences in laws and regulations, differences in operating procedures
- R-squared/goodness-of-fit considerations
- Basic vs. total limits information (expect the basic limits trend to be less than the total limits trend)
Give an example of exposure correlation that may distort univariate approaches
One-way analysis may show that older cars have high claims experience relative to newer cars. This may be distorted because older cars tend to be driven by younger drivers with worse claims experience.
Explain the value of using frequency-severity techniques (particularly compared to development techniques)
Can explicitly reflect inflation in the projection; gain greater insight into the claims process (reporting, settlement, average cost of claims); can be used with paid data only, and so is independent of changes in case o/s. Development techniques can be unstable and inaccurate in the most recent AYs, while F-S techniques break claims into frequency and severity components; claim counts tend to be stable, so good estimates of ultimate counts can be made. A minimal projection sketch follows.
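A minimal frequency-severity projection sketch (assumed figures), making the explicit inflation adjustment visible:

```python
ult_counts = 1_200            # developed ultimate claim counts (stable input)
base_severity = 8_500.0       # historical average cost per claim
severity_trend, years = 0.05, 2.0

# Trend severity explicitly to the future cost level, then recombine.
ult_severity = base_severity * (1 + severity_trend) ** years
ult_claims = ult_counts * ult_severity
print(f"ultimate severity {ult_severity:,.0f}; ultimate claims {ult_claims:,.0f}")
```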
Discuss considerations for an actuary when selecting ultimates for a block of business
- Complete understanding of the data: trends and changes such as mix of business, underlying limits, and reinsurance; external changes such as laws/regulations; operational changes such as case adequacy, closure rates, and the definition of claim counts
- Credibility and data smoothing (volume and consistency of the data)
- Retrospective testing
- Diagnostic review of projected ultimate values (claim ratios, frequency, severity, PP, average unpaid claims) to reasonability-check selections by looking at year-to-year progressions
Describe the difficulty of fairness in risk classification
Differences in prices should reflect differences in expected costs, with no subsidy among the classes; risks within a class should have the same expected costs. It is difficult to identify the right risk characteristics and establish classes that achieve this. Another difficulty is the social aspect (some groupings are socially unacceptable): there is a balance between individual equity and social adequacy.
Describe possible changes from new growth and the effect on the overall indication.
Exposure growth will impact the parallelogram method (more weight should be given to later experience). High growth, particularly in a new territory, also has the potential to increase losses: new business typically has higher loss costs, and business unfamiliar to underwriters could result in higher losses and create a need for added staff; if that need is not met, losses would rise further since underwriters wouldn't spend as much time on each risk. All of these would lead to higher indications in the future. One factor that could lower indications is the potential for expense reductions due to growth.
Describe a difference between using a GLM and a univariate approach based on limited average severities
GLMs do not assume that frequency is the same across all limits, so they can reflect adverse or favorable selection. This may produce counter-intuitive results such as expected losses decreasing as limits increase.
Discuss how the premium rate changes as policy face value increases. Discuss how the rate is impacted if large losses predominate.
If all losses are not the same size, the premium rate declines as the policy face increases (the sketch below illustrates the decline). If large losses predominate, pure premium rates should decrease at an increasing rate as insurance to value increases.
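A small sketch under an assumed severity distribution: the expected loss E[min(X, F)] grows more slowly than the face F, so the rate per unit of face declines.

```python
import numpy as np

# Hypothetical loss sizes in $000s.
losses = np.array([1, 2, 5, 10, 25, 50, 100], dtype=float)

for face in (10, 25, 50, 100):
    rate = np.minimum(losses, face).mean() / face   # E[min(X, F)] / F
    print(f"face {face:>3}k: rate per unit of face = {rate:.3f}")
```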
Describe how insurance to value affects homeowners rate level adequacy.
If the average ITV of the book of business is less than assumed in the insurer's rates, then the rates will be too low (in the absence of a coinsurance clause or rates varying by ITV).
Why might insurers be willing to write policies with a negative UW profit margin?
If they have a high investment return, they can still achieve a positive total profit (particularly for long-tailed lines).
Impact on IBNR when using the Reported BS Adjustment compared to using the Reported Claim Development Technique
Impact depends on whether the adjusted case reserves are higher or lower than the unadjusted case reserves.
- If adjusted reserves are higher, the LDFs from the adjusted reported triangle will be lower, resulting in less IBNR (see the sketch below).
- If adjusted reserves are lower, the LDFs will be higher, resulting in more IBNR.
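A minimal numeric check of the direction of the impact (all figures assumed):

```python
paid_12, reported_24 = 100.0, 600.0
case_12_unadj, case_12_adj = 300.0, 350.0   # adjusted case reserves are higher
current_reported_12 = 500.0

for label, case in (("unadjusted", case_12_unadj), ("adjusted", case_12_adj)):
    ldf = reported_24 / (paid_12 + case)    # 12-to-24 reported LDF
    ibnr = current_reported_12 * (ldf - 1)  # broad IBNR on the current AY
    print(f"{label}: LDF = {ldf:.3f}, IBNR = {ibnr:.1f}")
```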
What are the reasons GLMs have grown in popularity? What are the benefits?
Increased computing power, better data availability, and competitive pressure to avoid adverse selection Benefits: properly adjust for exposure correlation, attempt to focus on the signal (systematic effects) and ignore the noise, provide statistical diagnostics, allow for consideration of interactions between rating variables (response correlation)
Describe how the LER approach doesn't recognize behavior differences of insureds and the effect on deductible credits. Discuss one way to recognize these behavior differences
LER approach assumes that insureds at different deductibles have the same claiming behavior. Low-risk insureds may choose higher deductible because they realize they're less likely to have a claim. LER approach is purely severity based; higher deductible policies may end up being more profitable. Using a GLM approach would recognize behavioral differences since GLMs do not assume frequency is same for all risks.
Basic Limits Loss (definition and use)
Losses capped at basic policy limits; using these is one way to limit impact of large individual shock losses on the rate indication
Discuss why it is inappropriate to use WP at historical rate levels to determine premium trends.
One-time rate changes would be picked up by the trend, though we don't expect these to continue into the future. This could lead to inaccurate projections.
Overlap Fallacy
There is no overlap between trending and developing losses: development brings the data from each historical period to its ultimate level, while trending reflects the difference in ultimate cost levels from one historical period to the next. We want the policy priced to cover ultimate losses, and we also want those ultimate losses stated at the cost levels of the future policy period; see the sketch below.
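A small numeric check (assumed figures): development takes the AY to ultimate at its own cost level; trend then restates that ultimate at the future period's cost level. Each step does a different job, so there is no double count.

```python
reported = 800.0
cdf = 1.25                      # development to ultimate, same cost level
annual_trend, years = 0.04, 3.0

ultimate_historical = reported * cdf    # 1000 at the AY's cost level
projected = ultimate_historical * (1 + annual_trend) ** years
print(f"{ultimate_historical:.0f} at historical level -> "
      f"{projected:.0f} at future level")
```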
Explain the general role of credibility in ratemaking.
Ratemaking involves using adjusted historical data to predict future costs. To the extent that the historical data is volatile or small in size and thus not fully reliable, credibility can be used to give less weight to historical data and give some weight to other related experience in order to improve the estimate of future values.
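A minimal sketch of one common implementation, the classical square-root rule; the 1,082-claim full-credibility standard shown is the textbook frequency standard (90% probability of being within 5%), used here as an assumption:

```python
import math

def credibility_weighted(observed, complement, n_claims, n_full=1_082):
    # Z = sqrt(n / n_full), capped at full credibility of 1.0.
    z = min(1.0, math.sqrt(n_claims / n_full))
    return z * observed + (1 - z) * complement

# Hypothetical: 300 claims of own experience, countrywide complement 0.65.
print(round(credibility_weighted(0.72, 0.65, 300), 4))  # about 0.6869
```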
Compare schedule rating and retrospective rating with respect to providing incentive to insureds to control losses and providing stability in the premiums charged
Retrospective: a more direct financial incentive for insureds to control losses, since reduced losses lower the premium for the current policy term; under schedule rating, whether a credit will be given is uncertain and its magnitude is smaller. Schedule: more stable premiums, since the credit/debit is determined at the start of the policy and doesn't change.
Explain the concepts of credibility, regression analysis, and data smoothing in actuarial reserve analysis.
- Selecting age-to-age and tail factors: credibility and data smoothing are considered by using several years of data and looking at different averages; curves are fit to the LDFs and extrapolated for tail factors; actual data may be judgmentally weighted with industry factors.
- Trend factors used when analyzing frequency, severity, and PP: regression analysis is used to select trends; credibility-weight with industry data; credibility and data smoothing are considered when selecting the number of years to include.
- Selection of claim ratios: same considerations as for trend factors.
- Adjustments for operational changes: for the BS method, regression analysis is used to select trends and to estimate paid claims at a given claims closure rate.
Describe the goal of ratemaking and the difficulty in pricing insurance vs. non-insurance products.
The goal of ratemaking is to balance the fundamental insurance equation (P=L+LAE+UW Expenses + UW Profit). Rates should be set to cover all costs and achieve target UW profit. It is prospective and balance should be attained at both aggregate and individual levels. Difficult because cost is unknown before the product is sold.
Describe how distortion can occur using a univariate approach.
The univariate PP approach does not account for exposure correlation with other rating variables. As a result, there may be a double counting effect with other correlated variables. If there is exposure correlation, the results of the univariate analysis are distorted since some of the experience of other rating variables can be picked up in the variable being priced.
Differences between Asset Share Pricing and Pure Premium Ratemaking
Traditional ratemaking techniques only consider the experience of a single period of time. As such they fail to consider differences in persistency between risks. Persistency can have a significant impact due to loss and expense differences between new and renewal business. The asset share pricing model accounts for this by introducing multiple periods, persistency, and different assumptions for new and renewal business.
Explain why we adjust premium for one time changes and why we use average written premium rather than earned to analyze the trend
We don't expect one time changes to continue into the future, so we need to adjust for them so they aren't picked up by the trend. Using WP allows you to incorporate more recent data into the analysis; trends in WP are expected to show up in EP
Discuss why a premium trend should be utilized in a rate level indication.
We need to project the expected future premium so it can be compared to the future premium required, which determines how much rates should change. Premium trends adjust for premium differences between the historical and future periods driven by things like mix-of-business changes and socio-economic trends.