CS331 6B

________ duplicates data across databases. A) Data propagation B) Data duplication C) Redundant replication D) A replication server

A) Data propagation

Which of the following is a basic method for single field transformation? A) Table lookup B) Cross-linking entities C) Cross-linking attributes D) Field-to-field communication

A) Table lookup
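
A table lookup transforms a single field by replacing each coded value with the corresponding value from a lookup table. A minimal Python sketch (the state-code table here is made up for illustration):

    # Table lookup: replace a coded value with its full value.
    state_lookup = {"NY": "New York", "CA": "California", "TX": "Texas"}

    def transform_state(code):
        # Fall back to the raw code if it is not in the lookup table.
        return state_lookup.get(code, code)

    print(transform_state("NY"))  # New York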

Informational and operational data differ in all of the following ways EXCEPT: A) level of detail. B) normalization level. C) scope of data. D) data quality.

A) level of detail.

The major advantage of data propagation is: A) real-time cascading of data changes throughout the organization. B) duplication of non-redundant data. C) the ability to have trickle-feeds. D) none of the above.

A) real-time cascading of data changes throughout the organization.

Quality data can be defined as being: A) unique. B) inaccurate. C) historical. D) precise.

A) unique.

The methods to ensure the quality of data across various subject areas are called: A) Variable Data Management. B) Master Data Management. C) Joint Data Management. D) Managed Data Management.

B) Master Data Management.

Data quality problems can cascade when: A) data are not deleted properly. B) data are copied from legacy systems. C) there is redundant data storage and inconsistent metadata. D) there are data entry problems.

B) data are copied from legacy systems.

Conformance means that: A) data have been transformed. B) data are stored, exchanged or presented in a format that is specified by its metadata. C) data are stored in a way to expedite retrieval. D) none of the above.

B) data are stored, exchanged or presented in a format that is specified by its metadata.

All of the following are ways to consolidate data EXCEPT: A) application integration. B) data rollup and integration. C) business process integration. D) user interaction integration.

B) data rollup and integration.

Data governance can be defined as: A) a means to slow down the speed of data. B) high-level organizational groups and processes that oversee data stewardship. C) a government task force for defining data quality. D) none of the above.

B) high-level organizational groups and processes that oversee data stewardship.

A method of capturing only the changes that have occurred in the source data since the last capture is called ________ extract. A) static B) incremental C) partial D) update-driven

B) incremental
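
An incremental extract can be implemented by comparing each source row's last-modified timestamp against the time of the previous capture. A minimal sketch, assuming the rows carry a hypothetical modified_at field:

    from datetime import datetime

    rows = [
        {"id": 1, "modified_at": datetime(2024, 1, 5)},
        {"id": 2, "modified_at": datetime(2024, 2, 20)},
    ]
    last_capture = datetime(2024, 2, 1)

    # Keep only the rows that changed since the last capture.
    changed = [r for r in rows if r["modified_at"] > last_capture]
    print(changed)  # only row 2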

Event-driven propagation: A) provides a means to duplicate data for events. B) pushes data to duplicate sites as an event occurs. C) pulls duplicate data from redundant sites. D) none of the above.

B) pushes data to duplicate sites as an event occurs.
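
Event-driven propagation pushes each change to the duplicate sites at the moment the triggering event occurs, rather than waiting for a batch window. A toy sketch with plain dicts standing in for the sites (real systems would use triggers or replication middleware):

    primary = {}
    replicas = [{}, {}]  # duplicate sites

    def on_update(key, value):
        # Apply the change locally, then push it to every duplicate
        # site as part of handling the same event.
        primary[key] = value
        for site in replicas:
            site[key] = value

    on_update("cust_42", "active")
    print(replicas[0]["cust_42"])  # active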

Data quality ROI stands for: A) return on investment. B) risk of incarceration. C) rough outline inclusion. D) none of the above.

B) risk of incarceration.

One simple task of a data quality audit is to: A) interview all users. B) statistically profile all files. C) load all data into a data warehouse. D) establish quality metrics.

B) statistically profile all files.

External data sources present problems for data quality because: A) data are not always available. B) there is a lack of control over data quality. C) there are poor data capture controls. D) data are unformatted.

B) there is a lack of control over data quality.

One way to improve the data capture process is to: A) allow all data to be entered manually. B) provide little or no training to data entry operators. C) check entered data immediately for quality against data in the database. D) not use any automatic data entry routines.

C) check entered data immediately for quality against data in the database.
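
Checking entered data immediately against data already in the database stops errors at the point of capture instead of letting them propagate downstream. A minimal sketch; the set below stands in for a real database lookup:

    existing_customer_ids = {"C001", "C002", "C003"}

    def validate_order_entry(customer_id, quantity):
        # Reject bad entries at capture time.
        if customer_id not in existing_customer_ids:
            raise ValueError(f"Unknown customer: {customer_id}")
        if quantity <= 0:
            raise ValueError("Quantity must be positive")

    validate_order_entry("C001", 2)   # passes
    # validate_order_entry("C999", 2) would raise ValueError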

In the ________ approach, one consolidated record is maintained, and all applications draw on that one actual "golden" record. A) persistent B) identity registry C) federated D) integration hub

A) persistent
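
Under the persistent approach, the consolidated "golden" record is stored once and every application reads and writes through it. A toy sketch with a dict as the master store (names are made up for illustration):

    # Single consolidated store; applications keep no local copies.
    golden_records = {"C001": {"name": "Acme Corp", "status": "active"}}

    def get_customer(customer_id):
        # Every application draws on the same actual record.
        return golden_records[customer_id]

    def update_customer(customer_id, **changes):
        # Changes are made once, in the golden record itself.
        golden_records[customer_id].update(changes)

    update_customer("C001", status="inactive")
    print(get_customer("C001"))  # reflects the change everywhere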

The best place to improve data entry across all applications is: A) in the users. B) in the level of organizational commitment. C) in the database definitions. D) in the data entry operators.

C) in the database definitions.

Data quality is important for all of the following reasons EXCEPT: A) it minimizes project delay. B) it aids in making timely business decisions. C) it provides a stream of profit. D) it helps to expand the customer base.

C) it provides a stream of profit.

Data federation is a technique which: A) creates an integrated database from several separate databases. B) creates a distributed database. C) provides a virtual view of integrated data without actually creating one centralized database. D) provides a real-time update of shared data.

C) provides a virtual view of integrated data without actually creating one centralized database.
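
Data federation assembles the integrated view at query time by consulting the separate sources on demand; nothing is copied into a central database. A minimal sketch over two in-memory "databases" (the names and fields are illustrative):

    crm_db = {"C001": {"name": "Acme"}}
    billing_db = {"C001": {"balance": 250.0}}

    def federated_customer_view(customer_id):
        # Build the virtual integrated record on the fly.
        record = {"id": customer_id}
        record.update(crm_db.get(customer_id, {}))
        record.update(billing_db.get(customer_id, {}))
        return record

    print(federated_customer_view("C001"))
    # {'id': 'C001', 'name': 'Acme', 'balance': 250.0}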

An approach to filling a data warehouse that employs bulk rewriting of the target data periodically is called: A) dump mode. B) overwrite mode. C) refresh mode. D) update mode.

C) refresh mode.
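
Refresh mode rewrites the target in bulk: the warehouse table is emptied and reloaded from the source each cycle, unlike update mode, which applies only the changes. A minimal sketch with lists standing in for tables:

    source_table = [("P1", 10), ("P2", 25), ("P3", 7)]
    warehouse_table = [("P1", 9)]  # stale contents from the last cycle

    def refresh(target, source):
        # Bulk rewrite: discard old target rows, reload from source.
        target.clear()
        target.extend(source)

    refresh(warehouse_table, source_table)
    print(warehouse_table)  # all three current rows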

Which of the following are key steps in a data quality program? A) Conduct a data quality audit. B) Apply TQM principles and practices. C) Estimate return on investment. D) All of the above.

D) All of the above.

Which type of index is commonly used in data warehousing environments? A) Join index B) Bit-mapped index C) Secondary index D) Both A and B

D) Both A and B
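
A bit-mapped index keeps one bit vector per distinct value of a low-cardinality column; bit N is set in the vector for row N's value, so AND/OR filters become cheap bitwise operations. A toy sketch using Python ints as bit vectors:

    rows = ["red", "blue", "red", "green", "blue"]

    # One bit vector per distinct column value.
    bitmaps = {}
    for i, value in enumerate(rows):
        bitmaps[value] = bitmaps.get(value, 0) | (1 << i)

    # Rows where color is red OR blue: bitwise OR of two vectors.
    mask = bitmaps["red"] | bitmaps["blue"]
    print([i for i in range(len(rows)) if mask & (1 << i)])  # [0, 1, 2, 4]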

All of the following are popular architectures for Master Data Management EXCEPT: A) Identity Registry. B) Integration Hub. C) Persistent Object. D) Normalization.

D) Normalization.

TQM stands for: A) Thomas Quinn Mann, a famous data quality innovator. B) Total Quality Manipulation. C) Transforming Quality Management. D) Total Quality Management.

D) Total Quality Management.

The process of transforming data from a detailed to a summary level is called: A) extracting. B) updating. C) joining. D) aggregating.

D) aggregating.
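
Aggregation rolls detailed rows up to a summary level, for example total sales per product. A minimal sketch:

    from collections import defaultdict

    detail = [("P1", 100), ("P2", 40), ("P1", 60)]  # product, amount

    # Sum detailed amounts up to one summary row per product.
    summary = defaultdict(int)
    for product, amount in detail:
        summary[product] += amount

    print(dict(summary))  # {'P1': 160, 'P2': 40}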

Data may be loaded from the staging area into the warehouse using: A) SQL commands (INSERT/UPDATE). B) special load utilities. C) custom-written routines. D) all of the above.

D) all of the above.

High-quality data are data that are: A) accurate. B) consistent. C) available in a timely fashion. D) all of the above.

D) all of the above.

Loading data into a data warehouse involves: A) appending new rows to the tables in the warehouse. B) updating existing rows with new data. C) purging data that have become obsolete or were incorrectly loaded. D) all of the above.

D) all of the above.

A characteristic of reconciled data that means the data reflect an enterprise-wide view is: A) detailed. B) historical. C) normalized. D) comprehensive.

D) comprehensive.

All of the following are tasks of data cleansing EXCEPT: A) decoding data to make them understandable for data warehousing applications. B) adding time stamps to distinguish values for the same attribute over time. C) generating primary keys for each row of a table. D) creating foreign keys.

D) creating foreign keys.

A technique using artificial intelligence to upgrade the quality of raw data is called: A) dumping. B) data reconciliation. C) completion backwards updates. D) data scrubbing.

D) data scrubbing.
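
Full data-scrubbing tools use AI and pattern-matching heuristics; the core idea can be sketched with a simple normalization rule. The phone format below is just an illustrative assumption:

    import re

    def scrub_phone(raw):
        # Strip everything but digits, then reformat a plausible
        # 10-digit number; otherwise flag the value for review.
        digits = re.sub(r"\D", "", raw)
        if len(digits) == 10:
            return f"({digits[:3]}) {digits[3:6]}-{digits[6:]}"
        return None  # could not scrub automatically

    print(scrub_phone("555.123.4567"))  # (555) 123-4567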

The process of combining data from various sources into a single table or view is called: A) extracting. B) updating. C) selecting. D) joining.

D) joining.
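
Joining matches rows from different sources on a common key to produce one combined table or view. A minimal sketch:

    customers = {"C001": "Acme", "C002": "Globex"}
    orders = [("O1", "C001", 99.0), ("O2", "C002", 15.5)]

    # Match each order to its customer via the shared customer ID.
    joined = [
        (order_id, cust_id, customers[cust_id], total)
        for order_id, cust_id, total in orders
    ]
    print(joined)  # one row per order, enriched with the customer name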

One characteristic of quality data which pertains to the expectation for the time between when data are expected and when they are available for use is: A) currency. B) consistency. C) referential integrity. D) timeliness.

D) timeliness.

A data governance committee is always made up of high-ranking government officials. T/F

F

A data stewardship program does not help to involve the organization in data quality. T/F

F

Data that arrive via XML and B2B channels are always guaranteed to be accurate. T/F

F

Dirty data saves work for information systems projects. T/F

F

Quality data are not essential for well-run organizations. T/F

F

Quality data do not have to be unique. T/F

F

Retention refers to the amount of data that is not purged periodically from tables. T/F

F

There are six major steps to ETL. T/F

F

Total quality management (TQM) focuses on defect correction rather than defect prevention. T/F

F

A data quality audit helps an organization understand the extent and nature of data quality problems. T/F

T

A data steward is a person assigned the responsibility of ensuring the organizational applications properly support the organization's enterprise goals for data quality. T/F

T

Completeness means that all data that are needed are present. T/F

T

Data quality is essential for SOX and Basel II compliance. T/F

T

ETL is short for Extract, Transform, Load. T/F

T

The uncontrolled proliferation of spreadsheets, databases and repositories leads to data quality problems. T/F

T

