IDMA2

What are the eight considerations for a well-planned, quality insurance data system as identified in the reading "Planning for Insurance Data Quality"?

1) Accuracy 2) Completeness 3) Timeliness 4) Validity 5) Evaluation 6) Communication 7) Flexibility 8) Simplicity

What are the five benefits of audits identified in the CAS's "White Paper on Data Quality"?

1) Check the accuracy and completeness of the data 2) Help to ensure consistent handling of data 3) Help to determine the quality of systems control procedures 4) Help to measure and improve timeliness of data 5) Help to increase the reliability of results

The Data Quality Certification Model for Insurance Data Management lists four tools or means that can be used to ascertain the accuracy of data. What are they?

1) Checks that independently compare the reported data to the source data 2) Periodic tests to guarantee the accuracy of any encoding process 3) Premium and claim matches to verify the consistency of the reported data 4) Results of functional audits such as premium audits, market conduct examinations, claims audits

What are the 10 tools and products identified in the reading "Planning for Insurance Data Quality" as being available for data quality checking of company reports to external organizations?

1) Company Edit Packages (CEP) 2) Pre-edit service for submissions 3) Submission Analysis Reports (SAR) 4) Detailed error listings of submitted data 5) Company Performance Reports 6) Pre-delinquency lists and policy information lists 7) Distributional Edit Reports (DER) 8) Reconciliation reports to annual statement experience 9) Data quality criticisms on Unit Statistical Reports (USR) 10) User guides for statistical reporting

What are the three current technology topics that the CAS's "White Paper on Data Quality" identified as affecting future data quality efforts?

1) Data Warehouse Concept 2) Greater use of complementary databases 3) Pattern Recognition/Expert Systems/Fuzzy Logic Systems

Describe the two components of data collection identified in the paper "Guidance Regarding Management Data and Information" and the quality principles for each of them.

1) Data capture
• Data requirements should be compatible and consistent
• Data elements should have only one meaning
• Common data elements should be defined similarly
• Flexibility should accommodate expansion of data elements
• Codes should meaningfully represent information
• Consider frequency of update
• Use codes that are established and understood when possible
2) Data quality control
• A data quality function should be established
• Standards of data quality should be developed and monitored
• Critical processing points should be identified, and control procedures at these points should be developed
• Edits should be installed to check accuracy, validity, and reasonableness; these should be performed as closely as possible to data entry, with error correction as close to the point of discovery as possible
• Balancing and reconciliation procedures and standards should be established
• Data quality should be monitored on an ongoing basis
• Changes must be thoroughly tested

The paper "Guidance Regarding Management Data and Information" identified three areas in which actuaries could be making significant contributions. List and describe them.

1) Data collection - must collect appropriate data with data collection capabilities considered. The data capture principles require sound data element and code design. Data quality control ensures that the data being collected are accurate and complete and captured in a cost-effective manner. 2) Data design - managing data as a critical resource and therefore designing data to avoid redundancy, to be consistently derived, consistently defined, shareable, efficiently organized to maximize use and value, and flexible enough to respond to different requests. Principles include a central database, detailed databases, a data dictionary, and a database designed as described above. 3) Management information considerations - consider how the data will be used and the different types of data needed, in different levels of detail, for different purposes.

What are the five data design principles and concepts identified in the paper "Guidance Regarding Management Data and Information"?

1) Data should be managed as a critical resource 2) Data should not be redundant 3) Data should be consistently defined, derived, and shareable 4) Data must be efficiently organized to maximize use and value 5) Designs should be flexible to respond to different requests

According to the CAS's "White Paper on Data Quality", what are the six steps that an insurance data manager should take if errors or imperfections are detected in the data?

1) Determine the reasons and cause(s) for the error 2) Inform the actuary undertaking the current study and incorporate needed adjustments, modifications or corrections to the source data for use in the current analysis 3) Stop the error by fixing the system or revising the data handling and collection process 4) Quantify, if possible, the impact and magnitude of the error on the data underlying the current study 5) Describe if the error may materially impact prior analyses and whether these prior analyses may need to be retroactively corrected 6) If materially significant, make disclosures about past analyses appropriately

List and describe the six major components of the Data Quality Certification Model commentary.

1) Disclosure of performance results of checks for validity, accuracy, reasonability, and completeness of data 2) List of performance reports and/or monitoring tools used in ascertaining the quality of data 3) Review and analysis of significant data problems identified using the data quality tools 4) Plan of action to correct data problems including follow-up to verify completion 5) Assessment of materiality of data elements 6) Statement that certifies that the commentary is true, accurate, and complete

The Data Quality Certification Model lists five tools or means that can be used to ascertain the reasonability of data. What are they?

1) Distributional analysis 2) Profiles of expected results 3) Trend analysis 4) Average rate checks 5) Loss ratio analysis
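
To make items 4 and 5 concrete, here is a minimal sketch in Python; the expected loss-ratio bands, line codes, and book-of-business figures are invented for illustration and are not drawn from the Certification Model itself.

```python
# Toy reasonability test: compute loss ratios by line of business and
# flag any that fall outside an expected band. All figures hypothetical.
EXPECTED = {            # (low, high) loss ratio bands, illustrative only
    "AUTO": (0.55, 0.80),
    "HOME": (0.45, 0.75),
}

book = {
    "AUTO": {"earned_premium": 1_000_000, "incurred_losses": 620_000},
    "HOME": {"earned_premium": 400_000, "incurred_losses": 360_000},
}

for line, amounts in book.items():
    loss_ratio = amounts["incurred_losses"] / amounts["earned_premium"]
    low, high = EXPECTED[line]
    if not low <= loss_ratio <= high:
        print(f"{line}: loss ratio {loss_ratio:.2f} outside [{low}, {high}]")
# HOME: loss ratio 0.90 outside [0.45, 0.75]
```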

According to the CAS's "White Paper on Data Quality", what are the four key tests or checks that should be considered in a review of the reasonableness of the data?

1) Distributional edit review 2) Consistency checks 3) Statistical tests 4) Industry comparisons

What are the 11 basic key skills and knowledge of a data manager, as identified in "Insurance Data Management - Defining The Profession"?

1) Extensive knowledge of insurance company operations 2) Understand the business needs for data 3) Skilled in the application of data definition and data quality audit standards 4) Comprehend and follow logical procedures in data base organization 5) Skilled in leading data modeling and enterprise modeling sessions 6) Understand theory and technology 7) Display initiative and creativity in strategic data planning 8) Skilled in presenting clear alternatives and solutions 9) Ensure constructive participation from colleagues 10) Handle diverse assignments 11) Ability to balance the ideal and the practical

In "Insurance Data Management - Defining The Profession", there are four data management functional areas identified in which core data managers are involved. What are they?

1) External data reporting 2) Internal data coordination 3) Information systems development 4) Data administration

What are the four specific questions identified in the paper "Planning for Insurance Data Quality" that are used to ascertain information needs for reserving?

1) How are the data to be aggregated by report period and evaluation date? 2) What are the detailed groupings? 3) What calculations are to be done? 4) What is the degree of standardization, and are there ad hoc reports?

The Data Quality Certification Model lists two tools or means that can be used to ascertain the completeness of data. What are they?

1) Identification of the significant discrepancies in the reconciliation results and monitoring of the corrective actions 2) Documentation that compares statistical data with financial data and explains the differences
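
As an illustration of the second tool, a toy reconciliation in Python might compare statistical premium to annual statement premium by line of business and flag discrepancies for explanation; the figures, line codes, and threshold below are hypothetical.

```python
# Compare statistical-plan premium to financial (annual statement)
# premium by line; flag differences above a tolerance for explanation.
statistical = {"AUTO": 1_000_000, "HOME": 400_000}   # stat-plan premium
financial   = {"AUTO": 1_002_500, "HOME": 400_000}   # annual statement premium

THRESHOLD = 0.001  # flag differences above 0.1% of the financial amount

for line in financial:
    diff = statistical.get(line, 0) - financial[line]
    if abs(diff) > THRESHOLD * financial[line]:
        print(f"{line}: statistical minus financial = {diff:+,} -- document and explain")
# AUTO: statistical minus financial = -2,500 -- document and explain
```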

According to the CAS's "White Paper on Data Quality", what are the five benefits of recognizing that data is an asset?

1) Improved business opportunities 2) Greater fraud detection 3) Enhanced underwriting review 4) Greater evaluation of loss control factors and risk management procedures 5) Greater ability to use the data in actuarial analyses

List and briefly describe the five critical success factors of an insurance data management function, as identified in "Insurance Data Management - Defining The Profession".

1) Perception - the ability to anticipate and understand business conditions as they relate to reporting data 2) Communication skills - creating new ideas, listening, writing, speaking, encoding, and understanding 3) Staffing development - training, education, goal-setting, and instilling a stakeholder attitude 4) Technological awareness - having up-to-date knowledge and access to new information 5) Ability to control data - developing procedures and tests for data integrity

Describe the six basic components of the Statistical Data Monitoring System (SDMS) identified in the CAS's "White Paper on Data Quality" and explain why they are a good model for addressing the reliability of data and the disclosure of any data quality issues.

1) Process description and review of control procedures - provides an understanding of the process and a basis for researching the causes of errors and taking corrective action 2) Detailed data verification via sampling tests - provides validity, accuracy, and completeness measures, with follow-up on the cause and correction of problems 3) Summary data verification via reasonability reviews - provides consistency and reasonability reviews of summary data and therefore provides confidence in the overall integrity of the data 4) Financial reconciliation - provides for the completeness and financial integrity of the data 5) Annual review and certification - provides confidence that an ongoing process to ensure the reliability of the data is being carried out 6) Review and evaluation by state examiners - provides an independent review by outside auditors and a basis for improvement of the audit process

What are the eight elements or characteristics of successful audits identified in the CAS's "White Paper on Data Quality"?

1) Proper planning 2) Measure results using standards 3) Statistically-sound sampling techniques 4) Check source to end product and end product back to source 5) Verify data according to their intended use and definition 6) Audit the data preparation and data entry processes 7) Determine whether the company's entire process detects errors adequately and corrects them properly 8) Provide adequate documentation of results and recommendations for improvements (if any) and follow-up implementation review

Briefly describe the kinds of data and information needed for each of the six major insurance functions as listed in the paper "Guidance Regarding Management Data and Information".

1) Ratemaking - premium and exposure on a written or earned basis; loss and claim information in the same business categories as premiums, showing historical loss development patterns; and expense information; all by calendar/accident year, report year, or policy year. 2) Reserving - there are premium reserves and loss reserves. Data for premium reserves vary for each type of reserve. For losses, the dimensions of accident, report, or policy periods with their historical development are needed, by various business data groupings and showing various counts and money amounts. 3) Underwriting/marketing - distribution of the current book of business and trends; underwriting results by distribution system; transaction types, use of modifications, and changes in premium. 4) Claims - claim counts by type of transaction, open claims, and closed claims by various business data groups, reported weekly, monthly, quarterly, year-to-date, and for the latest twelve months, and showing lag times. 5) Financial analysis/investments - cash flow analysis requires all current cash items and liabilities, with trends; operating results analysis requires all financial activities, such as the mix of current investments, premium income, loss and loss expense, dividends, taxes, and other expenses. 6) Financial reporting - information to meet the financial reporting obligations of statutory reporting, trade associations and bureaus, shareholder reporting, and income tax reporting, such as direct and net calendar period premiums, losses, expenses, and investments.

According to the Actuarial Standard of Practice No. 23 on Data Quality, what are the six disclosures that should be included in the actuary's report?

1) Source(s) of the data 2) Materiality of any potential biases due to imperfect data 3) Adjustments or modifications made because of imperfect data 4) Extent of reliance on data supplied by others 5) Limitation on use of work product because of insufficient review of data 6) Any unresolved concern about the data that may have a material effect on the work product

According to the Actuarial Standard of Practice No. 23 on Data Quality, what are the six considerations the actuary should use in selecting the data for an actuarial analysis?

1) The data elements that are desired and possible alternative data elements 2) Appropriateness and currency of the data 3) Reasonableness and comprehensiveness of the data 4) Limitations of the data and any modifications or assumptions needed in order to use the data 5) Cost and feasibility of alternatives 6) Sampling methods

What are the six data design concepts identified in the paper "Guidance Regarding Management Data and Information"?

1) The ideal repository of data is a single central database 2) The database should contain all data elements needed for internal and external users 3) A data dictionary helps to ensure consistency 4) The data design should feature a balance of low redundancy, fast processing, flexible access, and low storage costs, and summarized or segmented data should be updated automatically from the central source 5) Databases should be flexible and organized to facilitate ad hoc report requests and direct user access 6) The retention period needs to accommodate meaningful analysis and legal and regulatory requirements

What are the eight basic questions used in ascertaining insurance information needs identified in the paper "Planning for Insurance Data Quality"?

1) What are the basic types of data and information access needs? 2) Who are the current and potential users of the information? 3) What process should be used to select the data and method of collection? 4) What are the timeliness considerations? 5) How accurate do the data need to be? 6) How complete do the data need to be? 7) Are all the needed data and information available and when? 8) What are the access considerations?

What are the six specific questions identified in the paper "Planning for Insurance Data Quality" that are used to ascertain information needs for ratemaking?

1) What is the level of detail required, and is there variation by line of insurance? 2) Is the basis for aggregation Accident Year, Policy Year, or Calendar Year? 3) What supplementary data is required for adjusting premiums, exposure, losses and/or claims? 4) What expense information is needed? 5) What calculations and inferred fields need to be made? 6) What is the process from receipt of data to final product?

What are the four specific questions identified in the paper "Planning for Insurance Data Quality" that are used to ascertain information needs for underwriting and marketing?

1) What type of statistical information and level of detail are necessary to accept or reject the risk? 2) What non-insurance information (e.g., credit reports, motor vehicle reports) is necessary? 3) What information is required to assist in determining the underwriting management philosophy of the company? 4) What unique requirements does the marketing area have?

Redman states that although data underlies almost everything we do, we still don't think much about data or data quality, and so poor data quality can be insidious. He lists three reasons for this - what are they?

1.) Although virtually every activity the Information Age organization undertakes requires data, the primary results are not data. 2.) Data are invisible - you don't really touch data per se. 3.) Individuals may recognize that poor data hinder their work, but few are concerned enough to worry about how the next person will be impacted by the data he or she creates.

Data consumers have certain expectations about websites. According to Redman, these expectations can be separated into six categories. What are they?

1.) Privacy 2.) Content 3.) Quality of Values 4.) Presentation 5.) Improvement 6.) Commitment

Redman identified 5 reasons why building data quality into a data warehouse is so challenging. What are they?

1.) There are different customers for data warehouses than for operational systems. 2.) The underlying decision-making chains may be more poorly understood than are operational processes, further complicating the understanding of customer needs. 3.) In operational systems, new data tends to be more important than historical data - in data warehouses, historical data is important too. 4.) Many warehouses draw data from numerous operational sources, which can be difficult to standardize. 5.) Data must be cleaned up both upon entry into the data warehouse and day-in and day-out, which can be a daunting challenge.

Distributional Edit Review (DER)

A Distributional Edit Review (DER) compares one set of data to a prior quarter's or year's data for consistency; that is, it is a review of summary data, by key field, relative to a profile of that data based on prior experience.
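
A toy version of such a review might look like the following Python sketch; the key field, records, and 5% tolerance are hypothetical, and a production edit would use the profiles and criteria of the reporting organization.

```python
def distribution(records, key):
    """Share of records by value of the key field."""
    total = len(records)
    counts = {}
    for r in records:
        counts[r[key]] = counts.get(r[key], 0) + 1
    return {value: n / total for value, n in counts.items()}

def distributional_edit(current, prior, key, tolerance=0.05):
    """Flag key-field values whose share moved more than the tolerance."""
    cur, pri = distribution(current, key), distribution(prior, key)
    return {
        value: (pri.get(value, 0.0), cur.get(value, 0.0))
        for value in set(cur) | set(pri)
        if abs(cur.get(value, 0.0) - pri.get(value, 0.0)) > tolerance
    }

prior = [{"class": "A"}] * 60 + [{"class": "B"}] * 40    # prior-period profile
current = [{"class": "A"}] * 45 + [{"class": "B"}] * 55  # current submission
print(distributional_edit(current, prior, "class"))
# {'A': (0.6, 0.45), 'B': (0.4, 0.55)} -- both shares shifted by 0.15
```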

Data Dictionary

A data dictionary is a resource that should contain all the reported data elements, defined according to the requirements of the internal and external users of the system.

Accuracy

A data element is accurate if it is a true reflection of the source records of the organization (the source records are the first recorded evidence retained by the company).

Data Element

A data element is an item of information, such as date of birth or risk classification.

Validity

A data element is valid if its value is one of the allowable ones.
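
A validity edit is simple to express in code. The Python sketch below is a minimal illustration; the field names and code tables are hypothetical, not taken from any statistical plan.

```python
# Minimal validity edit: check each coded field against its set of
# allowable values. Field names and code tables are hypothetical.
ALLOWABLE_VALUES = {
    "state_code": {"01", "02", "03"},            # hypothetical state codes
    "line_of_business": {"AUTO", "HOME", "WC"},  # hypothetical LOB codes
}

def validity_errors(record):
    """Return the fields whose values are not among the allowable ones."""
    return [
        field
        for field, allowed in ALLOWABLE_VALUES.items()
        if record.get(field) not in allowed
    ]

record = {"state_code": "99", "line_of_business": "AUTO"}
print(validity_errors(record))  # ['state_code'] -- '99' is not allowable
```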

Completeness

A transaction is complete if it contains the necessary data for the business needs of the transaction, is processed through the necessary portions of the systems, and is processed once and only once.
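
The two parts of this definition that lend themselves to automated edits (required data present; each transaction processed once and only once) can be sketched as follows; the required fields and sample transactions are hypothetical.

```python
# Illustrative completeness checks mirroring the definition above.
from collections import Counter

REQUIRED_FIELDS = {"policy_id", "transaction_id", "premium"}  # hypothetical

transactions = [
    {"policy_id": "P1", "transaction_id": "T1", "premium": 500.0},
    {"policy_id": "P2", "transaction_id": "T2"},                    # missing premium
    {"policy_id": "P1", "transaction_id": "T1", "premium": 500.0},  # duplicate of T1
]

# (1) Does each transaction contain the data the business needs?
missing = [t["transaction_id"] for t in transactions if not REQUIRED_FIELDS <= t.keys()]

# (2) Is each transaction processed once and only once?
counts = Counter(t["transaction_id"] for t in transactions)
duplicates = [tid for tid, n in counts.items() if n > 1]

print(missing)     # ['T2'] -- lacks a premium amount
print(duplicates)  # ['T1'] -- appears more than once
```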

According to Redman, which of these describes the goal of getting to a common definition of customer? A. Almost impossible B. Necessary to the business C. Easy, but only if done the right way D. Most important to companies pursuing the product leadership strategy

A. Almost impossible

For which of the major insurance functions are premium, exposure, and loss information showing historical loss development patterns MOST needed? A. Ratemaking B. Reserving C. Underwriting/marketing D. Claims processing

A. Ratemaking

What is the origin of quality cost measurement and the necessity of continuous improvement? A. Manufacturing industries B. Service industries C. Financial Industries D. The quality gurus

A. Manufacturing industries

Absolute Accuracy

Absolute accuracy means that the data are 100% correct - every data element on each and every transaction record is properly and accurately coded.

Explain the concepts of absolute accuracy, effective accuracy, and relative accuracy as described in the CAS's "White Paper on Data Quality".

Absolute accuracy means that the data are 100% correct - every data element on each and every transaction record is properly and accurately coded. Effective accuracy means that there are some imperfections in the data but they are generally usable in most analyses. There are two categories or types of effective accuracy: the erroneous data is at a level or in a data element that does not impact the analysis at hand, or the imperfect data does not have a material impact on the results of the analysis. Relative accuracy means that data may be coded inaccurately as to its definition but is reported consistently over time.

Information

According to English, information is data in context. It is a function of data and its definition and presentation.

Data

According to Redman, data consist of data models (which define what the data is about) and data values (which are assigned to attributes in the data model). According to English, data is the representation of facts about things.

Audits

Audits are statistically sound checks of data that verify data from source to end product and from end product back to source; that verify data according to their intended use and definition; that check the data preparation and data entry processes; that determine whether the error detection and correction process is adequate; and provide adequate documentation of the results with recommendations for improvement and follow-up implementation review.

According to English, what is the estimated business cost of nonquality data? A. 10% to 25% of IT budget B. 10% to 25% of revenue C. 3% to 5% of revenue D. 8% to 10% of profits

B. 10% to 25% of revenue

According to the Data Quality Certification Model for Insurance Data Management, the results of functional audits are a principal tool in ascertaining which of the following? A. Validity B. Accuracy C. Reasonability D. Completeness

B. Accuracy

Which of these summarizes Redman's statement about poor data quality? A. It is impossible to cost B. It is a contributing cause to the organization's most important business problems C. It can be seen and reported on by everyone in the organization D. Most companies affected by poor data quality have quantified the impact, but believe that fixing it is too expensive.

B. It is a contributing cause to the organization's most important business problems.

Which of the following is NOT listed in the reading "Planning for Insurance Data Quality" as one of the three types of documentation that need to be maintained in support of data quality? A. Data dictionary B. Regressive tests C. Systems development D. Data quality program

B. Regressive tests

What is the scope of the Data Quality Certification Model for Insurance Data Management? A. All insurance policy information B. Statistical data C. Materially significant data, such as data used in strategic planning D. Data used in reports or products provided to customers

B. Statistical data

Who is responsible for developing the standards to be used in the Data Quality Certification Model commentary? A. Business managers B. Users of data C. Corporate data managers D. Internal auditor

B. Users of data

Which of the following is NOT one of the four latest industry information trends listed in the reading "Planning for Insurance Data Quality"? A. Internet access and on-line availability of information B. Data warehouse approach to data design C. Increased use of cloud-based computing D. Non-insurance information as part of the decision-making process

C. Increased use of cloud-based computing

According to "Insurance Data Management - Defining The Profession", which of the following tasks is included in the external data reporting function? A. Charting the relationship between data needs and resources B. Preparing system specifications C. Prepare basic cost analyses D. Working with systems developers to maximize use of technology

C. Preparing basic cost analyses

Which of the following is identified in the reading "Planning for Insurance Data Quality" as a data quality recommendation for internal data collection mechanisms? A. Establish a committee of the business and systems groups involved in managing the data collection systems to coordinate data requirements B. Have all data collection systems use the central data dictionary for editing and to display the meaning of codes C. Standards and guidelines for data elements should be strictly followed wherever possible D. Combine all like data entry processes to centralize the management and control of data collection mechanisms

C. Standards and guidelines for data elements should be strictly followed wherever possible

Why are CEOs aware of the impact of data quality?

CEOs are aware of the impact of data quality because they know that so much is at stake for the business and its customers.

CEO

Chief Executive Officer

CFO

Chief Financial Officer

CIO

Chief Information Officer

Customer Data

Customer data is data that directly impacts the customer, data that customers see, or data about customers.

According to Redman, which of these ensures that metadata is well-defined, kept up-to-date, and made easily available to all? A. Data Flow Diagrams B. Data Documentation Standards C. Data Dictionary D. Data Resource Chain

D. Data Resource Chain

What is inherent information quality?

Data Accuracy. It is the degree to which data accurately reflects the real-world object that the data represents.

What are the components of information quality as defined by English?

1) Data definition quality 2) Data content quality 3) Data presentation quality

Data Quality

Data Quality is fitness for use (after Juran) - data are of high quality if they are fit for their intended uses in operations, decision making and planning.

Data Resource Data

Data Resource Data is data that describes the data resource, such as definition of terms, the source of data, and how to access the data. Data resource data is also called "metadata".

Appropriate Data

Data are appropriate if they contain the information needed for the analysis, are homogeneous so as to allow evaluation, and are consistent with the purpose of the study.

Comprehensive Data

Data are comprehensive if the necessary records and data elements to do a proper analysis are available.

Reasonable Data

Data are reasonable if they are consistent with prior data or other information.

Data Content Quality

Data content quality is the degree to which data "values" accurately represent the characteristics of the real-world entity or fact and meet the needs of the information customers to perform their jobs effectively.

Data Definition

Data definition is the specification of data - the definition, domain value set, and business rules that govern data.

Data Definition Quality

Data definition quality is the degree to which the data "definition" accurately describes the real world objects and meets the needs of the customers to understand the data.

Data is reusable

Data is the only resource that is completely reusable, so redundancy should not be needed.

Data Presentation Quality

Data presentation quality is a characteristic and measure of data access by and presentation to business personnel for their use in performing their work.

Data Quality Control

Data quality control is the aspect of data collection that ensures that data which are being captured, processed, and reported are accurate, complete, and collected in a cost-effective manner.

According to English, why have information quality issues not been addressed?

Data quality is not a sexy topic, and management has either deemed the costs of the status quo and the current level of low-quality data to be acceptable, normal costs of doing business, or is unaware of the real costs of nonquality data.

According to the Actuarial Standard of Practice No. 23 on Data Quality, what is the definition of comprehensive data?

Data that contains all data elements or records needed for the analysis.

According to English, what is the cause of data warehouse failures?

Data warehousing projects fail for many reasons, all of which can be traced to a single cause: nonquality.

Detailed Data Verification

Detailed data verification is the verification of every data element on a random sample of premium and claim transactions, and identification of cause and correction of any errors found.
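
A minimal Python sketch of this kind of verification follows; the record layout, sample size, and source lookup are hypothetical stand-ins for a company's actual source records.

```python
# Draw a random sample of reported transactions and compare every
# data element against the corresponding source record.
import random

def verify_sample(reported, source_by_id, sample_size, seed=0):
    """Return (transaction_id, field, reported, source) for each mismatch."""
    rng = random.Random(seed)
    sample = rng.sample(reported, min(sample_size, len(reported)))
    errors = []
    for rec in sample:
        src = source_by_id[rec["transaction_id"]]
        for field, value in rec.items():
            if src.get(field) != value:
                errors.append((rec["transaction_id"], field, value, src.get(field)))
    return errors

reported = [{"transaction_id": "T1", "premium": 500.0},
            {"transaction_id": "T2", "premium": 910.0}]
source = {"T1": {"transaction_id": "T1", "premium": 500.0},
          "T2": {"transaction_id": "T2", "premium": 900.0}}

print(verify_sample(reported, source, sample_size=2))
# [('T2', 'premium', 910.0, 900.0)] -- a coding error to trace and correct
```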

Effective Accuracy

Effective accuracy means that there are some imperfections in the data but they are generally usable in most analyses. There are two categories or types of effective accuracy: the erroneous data is at a level or in a data element that does not impact the analysis at hand, or the imperfect data does not have a material impact on the results of the analysis.

What is the primary difference between the English and Redman definitions of data?

English uses the term "fact" in defining data while Redman prefers not to use the word "fact". Redman defines data in terms of data models and data values, which would be data in context and more similar to what English calls information.

Insurance Data Management Association (IDMA)

IDMA is a non-profit professional association of insurance data managers founded in 1984 and dedicated to increasing professionalism through education and other activities. It offers a curriculum, professional certification, seminars and forums, and special reports.

It is said that leading companies, consciously or not, pursue one of three strategies - customer intimacy, product leadership, or price leadership. From a data quality perspective, why is it important to determine which of these applies to your company?

Identifying the strategy being pursued by your company is important because the most important data are those required for executing the most important strategies, and that's where data quality efforts should be concentrated.

Over-billing can lead to a host of customer-related problems. According to Redman, what other problem may over-billing indicate?

If an organization over-bills, it is probably under-billing as well, by the same amount or more, and under-billing is seldom reported by customers.

According to the Data Quality Certification Model for Insurance Data Management, what are the primary considerations for the data manager in determining the materiality of data elements?

In determining materiality, the data manager considers the intended use of the data, and the importance of the data element to that use.

How does English distinguish between data and information?

Information is data in context. While data is a "fact" about a "thing", information consists of data that is defined and presented in such a way that the "fact" about a "thing" becomes understandable.

According to English, how does information quality improvement reduce business costs?

Information quality improvements reduce business costs by eliminating costly scrap and rework caused by defective data.

Information Quality

Information quality is consistently meeting knowledge worker and end-customer expectations.

Inherent Information Quality

Inherent information quality is data accuracy or correct data, meaning that the data accurately reflects the real-world object that the data represents.

According to Redman, what financial results can quality improvement projects show?

Many successful quality improvement projects report, somewhat informally, cost reductions of 66% to 75%.

Metadata

Metadata is a common term for data resource data.

What are the options used by actuaries in dealing with the quality of the data they are using and what is recommended by the Actuarial Standard of Practice No. 23 on Data Quality?

Persons or organizations responsible for generating, collecting, or publishing data may apply different standards of quality assurance, ranging from straightforward compilation of figures to extensive verification. Actuaries, in turn, deal with the question of the quality of data underlying their work products in a variety of ways, ranging from reliance on the data supplied by others without any checking to a complete and independent verification of the data. The standard does not recommend that an actuary audit data.

Pragmatic Information Quality

Pragmatic information quality is the degree of usefulness and value that data has to support the enterprise processes.

What are Internet users most concerned about?

Privacy

Quality

Quality is consistently meeting customers' expectations.

Reasonability Review

Reasonability review consists of summary checks, edits, and tests designed to determine the reasonableness of the data; that is, do the data make sense?

Relative Accuracy

Relative accuracy means that the data may be coded inaccurately as to their definition but are reported consistently over time.

Statistical Data

Statistical data is data derived from an insurance company's business transactions, containing information about insurance coverages and the associated premium and loss experience for those coverages.

Summary Data Verification

Summary data verification is a review of the reasonability of essential data elements in summarized data, and the investigation and correction of the most questionable errors.

Systems Development Documentation

Systems development documentation consists of the requirements, specifications, capabilities, database design, data model, program development, security, and access requirements for a system.

Actuarial Standard of Practice No. 23

The Actuarial Standard of Practice No. 23 is a standard established by the Actuarial Standards Board in order to give guidance to the actuary in selecting the data that underlies the actuarial work product; reviewing the data for appropriateness, reasonableness and comprehensiveness; and making appropriate disclosures.

Data Quality Certification Model Commentary

The Data Quality Certification Model commentary is a commentary on the quality of data, developed by the data manager. It includes disclosure of the performance results of checks for the validity, accuracy, reasonability, and completeness of data. The commentary also includes lists of performance reports and monitoring tools, analysis of significant data problems, a plan of action to correct data problems, an assessment of the materiality of data elements, and a certification that the commentary is true, accurate, and complete.

Data Quality Certification Model

The Data Quality Certification Model is a framework for use in attesting to the quality of an organization's data, and provides guidelines for the data manager to use in controlling, monitoring and measuring the validity, accuracy, reasonability and completeness of data.

Statistical Data Monitoring System (SDMS)

The Statistical Data Monitoring System (SDMS) is a system of procedures designed to control the quality of data submitted to and processed by statistical agents. The objective is to assure the reliability of the data collection process for statistical data.

What is the common theme of the English and Redman definitions of data quality?

The common theme of the English and Redman definitions of data quality is "meeting customer expectations".

Data Administration

The data administration function charts the relationships between data needs and resources, and maintains key data tools - it includes data modeling and data dictionary development and maintenance.

External Data Reporting

The external data reporting function maintains statistical plans, interprets data calls, finds out what data are available and where and how to collect them, analyzes data quality, and prepares basic cost analyses.

Information Systems Development

The information systems development function maximizes use of technology to support organization goals, understands the needs of the business community, and works as a team with data coordinators.

Internal Data Coordination

The internal data coordination function analyzes internal data needs, maintains internal statistical plans, defines data elements, describes expectations of systems, verifies calculated data, ensures data quality, and monitors for accuracy and security.

What is the three-fold purpose of the Actuarial Standard of Practice No. 23 on Data Quality?

The purpose of Actuarial Standard of Practice No. 23 is to give guidance to the actuary in: 1) Selecting data that underlie the actuarial work product; 2) Reviewing data for appropriateness, reasonableness, and comprehensiveness; and 3) Making appropriate disclosures.

What is the purpose of the Data Quality Certification Model for Insurance Data Management?

The purpose of the Data Quality Certification Model is to provide: a. A framework for use in attesting to the quality of an organization's data. It is not intended to be a manual of detailed procedures. b. Guidelines for the data manager to use in controlling, monitoring, and measuring the validity, accuracy, reasonability, and completeness of data.

According to English, what is the purpose for improving information quality?

To improve customer satisfaction and stakeholder satisfaction by increasing the efficiency and effectiveness of the business processes.

What is the difference between transactional statistical plans and summarized statistical plans?

Transactional statistical plans collect data on a 'unit transaction' basis. This means that one or more premium and/or loss records are generated each time a policy is written or a loss occurs. Data are generally required on a quarterly reporting basis according to a schedule of 45-90 days after the close of a quarter. Summarized statistical plans require less detailed coding on records and generally require reporting on an annual basis. Summarized plans allow unit transaction records containing the same coding information to be combined and reported as one summarized record. Coding is required only to a minimum level of detail.
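
The following Python sketch illustrates how unit transaction records sharing the same coding can be combined into one summarized record; the coding fields and amounts are hypothetical.

```python
# Combine unit transaction records with identical coding into a single
# summarized record, as a summarized statistical plan allows.
from collections import defaultdict

unit_records = [
    {"state": "01", "class": "A", "premium": 100.0},
    {"state": "01", "class": "A", "premium": 250.0},
    {"state": "01", "class": "B", "premium": 400.0},
]

summarized = defaultdict(float)
for rec in unit_records:
    summarized[(rec["state"], rec["class"])] += rec["premium"]

for (state, cls), premium in summarized.items():
    print(state, cls, premium)
# 01 A 350.0  -- two unit records combined into one summarized record
# 01 B 400.0
```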

What is Redman's recommendation for developing a case for data quality?

You can develop a case for data quality by demonstrating how improvements in data quality will lead to competitive advantage.

Insurance Data Managers

Insurance data managers are a group of people within insurance organizations whose primary day-to-day function is to provide business managers with the information they need to accomplish the goals and objectives of the organization. They must 1.) understand the needs of their customers; 2.) see their task from the customer's perspective; and 3.) have cross-functional expertise and training.

Why are CFOs concerned about data quality?

CFOs are concerned about data quality because all their knowledge of company finance comes from the data.

According to Redman, why should everyone be concerned about data quality?

Because the only people who need not worry about data quality are those who neither create nor use data, and no one participating in any modern economy can make that claim.

For what data is the CIO responsible?

The CIO is responsible for data resource data and data warehouse data.

Why is the quality of Internet data so important?

Quality Internet data is important for gaining and sustaining competitive advantage.

According to Redman, what is the primary reason for billing issues?

The primary reason for billing issues is that there are so many hand-offs in the end-to-end billing chain that are not properly managed.

Materiality of Data Elements

The materiality of data elements is the measure of their importance, based on the intended use of the data and their presence on individual transactions.

According to Redman, what is the most frequently-cited issue with the quality of customer data?

The high rate of returned direct mail.

