ITD 256 Chapter 10 Self Check

F

"T/F - A data stewardship program does not help to involve the organization in data quality.

F

"T/F Data transformation is not an important part of the data reconciliation process.

Aggregation

"The process of transforming data from a detailed to a summary level is called ________.

data steward

A ________ is a person assigned the responsibility of ensuring that organizational applications properly support the organization's enterprise goals of data quality.

static

A method of capturing data in a snapshot at a point in time is called a ________ extract.

data scrubbing

A technique using pattern recognition to upgrade the quality of raw data is called ________.
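
A small sketch of the idea (the phone-number rule is an assumption chosen for illustration): a regular-expression pattern recognizes inconsistently formatted raw values and rewrites them in one standard form.

    import re

    PHONE = re.compile(r"\(?(\d{3})\)?[\s.-]*(\d{3})[\s.-]*(\d{4})")

    def scrub_phone(raw):
        """Return the phone number in one standard format, or None if it is unrecognizable."""
        match = PHONE.search(raw)
        if not match:
            return None
        area, prefix, line = match.groups()
        return f"({area}) {prefix}-{line}"

    print(scrub_phone("703.555.1212"))   # (703) 555-1212
    print(scrub_phone("(703) 5551212"))  # (703) 555-1212
    print(scrub_phone("no phone here"))  # None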

field-level

A(n) ________ function converts data from a given format in a source record to a different format in a target record.
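
Roughly, a field-level function rewrites a single field from the source format into the target format. In this sketch the field name and the two date formats are assumptions, not taken from the text:

    from datetime import datetime

    def reformat_order_date(source_value):
        """Field-level transformation: MM/DD/YYYY in the source record -> YYYY-MM-DD in the target record."""
        return datetime.strptime(source_value, "%m/%d/%Y").strftime("%Y-%m-%d")

    source_record = {"order_id": 42, "order_date": "07/04/2023"}
    target_record = {
        "order_id": source_record["order_id"],
        "order_date": reformat_order_date(source_record["order_date"]),
    }
    print(target_record)  # {'order_id': 42, 'order_date': '2023-07-04'}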

audit

A(n) ________ will thoroughly review all process controls on data entry and maintenance.

Update Mode

An approach in which only changes in the source data are written to the data warehouse is called ________.
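
A toy sketch of update mode, with plain dictionaries standing in for the source system and the warehouse table (both invented for illustration): only rows that are new or changed since the last load get written.

    warehouse = {1: "Alice", 2: "Bob"}              # target rows keyed by id
    source = {1: "Alice", 2: "Robert", 3: "Carol"}  # current source data

    # Update mode: write only the rows that changed or appeared since the last load.
    changes = {key: value for key, value in source.items() if warehouse.get(key) != value}
    warehouse.update(changes)

    print(changes)    # {2: 'Robert', 3: 'Carol'}
    print(warehouse)  # {1: 'Alice', 2: 'Robert', 3: 'Carol'}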

value; value

Completeness means that all data that must have a ________ does have a ________.
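
One way to picture the completeness rule, sketched with an invented customer record (the required fields are assumptions): every field that must have a value is checked for one.

    REQUIRED_FIELDS = ["customer_id", "name", "email"]

    def is_complete(record):
        """Completeness: every required field actually has a value."""
        return all(record.get(field) not in (None, "") for field in REQUIRED_FIELDS)

    print(is_complete({"customer_id": 7, "name": "Ana", "email": "ana@example.com"}))  # True
    print(is_complete({"customer_id": 8, "name": "Ben", "email": ""}))                 # False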

metadata

Conformance refers to whether the data are stored, exchanged, or presented in the format specified by their ________.

data transformation

Converting data from the format of its source to the format of its destination is called ________.

real time

Data propagation duplicates data across databases, usually with no ________ delay.

data capture processes

Improving ________ is a fundamental step in data quality improvement.

persistent

In the ________ approach, one consolidated record is maintained from which all applications draw data.

Modeling

Sound data ________ is a central ingredient of a data quality program.

F

T/F - A data governance committee is always made up of high-ranking government officials.

T

T/F - A data quality audit helps an organization understand the extent and nature of data quality problems.

T

T/F - A data steward is a person assigned the responsibility of ensuring that organizational applications properly support the organization's enterprise goals of data quality.

F

T/F After extract, transform, and load are done on the data, the data warehouse is never fully normalized.

T

T/F Bit-mapped indexing is often used in a data warehouse environment.
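
Roughly how a bitmap index works, sketched with Python integers serving as bit vectors (the region column and its values are invented): one bitmap per distinct value, where bit i is set if row i holds that value, so queries become fast bitwise operations.

    rows = ["east", "west", "east", "south", "west"]  # a low-cardinality column, one value per row

    # Build one bitmap per distinct value.
    bitmaps = {}
    for i, value in enumerate(rows):
        bitmaps[value] = bitmaps.get(value, 0) | (1 << i)

    # A query such as region = 'east' OR region = 'west' is a single bitwise OR.
    mask = bitmaps["east"] | bitmaps["west"]
    matching_rows = [i for i in range(len(rows)) if mask & (1 << i)]
    print(matching_rows)  # [0, 1, 2, 4]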

T

T/F Completeness means that all data that are needed are present.

T

T/F Data quality is essential for SOX and Basel II compliance.

T

T/F Data reconciliation occurs in two stages, an initial load and subsequent updates.

T

T/F Data scrubbing is a technique using pattern recognition and other artificial intelligence techniques to upgrade the quality of raw data before transforming and moving the data to the data warehouse.

F

T/F Data that arrive via XML and B2B channels are always guaranteed to be accurate.

F

T/F Dirty data saves work for information systems projects.

F

T/F Generally, records in a customer file never become obsolete.

T

T/F Joining is often complicated by problems such as errors in source data.

T

T/F Loading data into the warehouse typically means appending new rows to tables in the warehouse as well as updating existing rows with new data.

F

T/F Master data management refers to the disciplines, technologies, and methods used to ensure the currency, meaning, and quality of data within one subject area.

T

T/F One of the biggest challenges of the extraction process is managing changes in the source system.

F

T/F Quality data are not essential for well-run organizations.

F

T/F Quality data does not have to be unique.

T

T/F Refresh mode is an approach to filling the data warehouse that employs bulk rewriting of the target data at periodic intervals.
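
For contrast with the update-mode sketch earlier (same invented dictionaries): refresh mode periodically discards the target data and rewrites it in bulk from the source.

    warehouse = {1: "Alice", 2: "Bob"}              # stale target data
    source = {1: "Alice", 2: "Robert", 3: "Carol"}  # current source data

    # Refresh mode: bulk rewrite of the target at a periodic interval.
    warehouse = dict(source)
    print(warehouse)  # {1: 'Alice', 2: 'Robert', 3: 'Carol'}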

F

T/F Retention refers to the amount of data that is not purged periodically from tables.

F

T/F Static extract is a method of capturing only the changes that have occurred in the source data since the last capture.
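
A quick sketch of the distinction behind this question, using an invented change-timestamp column: a static extract snapshots all of the source data at a point in time, while an incremental extract captures only what changed since the last capture.

    source_rows = [
        {"id": 1, "changed_at": "2023-01-05"},
        {"id": 2, "changed_at": "2023-03-20"},
        {"id": 3, "changed_at": "2023-04-01"},
    ]

    def static_extract(rows):
        """Static extract: a snapshot of all source data at one point in time."""
        return list(rows)

    def incremental_extract(rows, last_capture):
        """Incremental extract: only the rows changed since the last capture."""
        return [row for row in rows if row["changed_at"] > last_capture]

    print(len(static_extract(source_rows)))                # 3
    print(incremental_extract(source_rows, "2023-03-01"))  # rows 2 and 3 only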

T

T/F The data reconciliation process is responsible for transforming operational data to reconciled data.

T

T/F The major advantage of the data propagation approach to data integration is the near real-time cascading of data changes throughout the organization.

F

T/F The process of transforming data from detailed to summary levels is called normalization.

T

T/F The uncontrolled proliferation of spreadsheets, databases and repositories leads to data quality problems.

F

T/F There are six major steps to ETL.

F

T/F Total quality management (TQM) focuses on defect correction rather than defect prevention.

F

T/F Update mode is used to create a data warehouse.

T

T/F User interaction integration is achieved by creating fewer user interfaces.

T

T/F ETL is short for Extract, Transform, Load.
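
A minimal end-to-end sketch of the three steps in Python (the CSV source, the cleanup rules, and the SQLite target are all assumptions made for illustration):

    import csv, io, sqlite3

    raw_csv = "id,name,amount\n1, alice ,10\n2,BOB,not_a_number\n"

    # Extract: pull records out of the source.
    extracted = list(csv.DictReader(io.StringIO(raw_csv)))

    # Transform: clean the name field and convert amounts, rejecting rows that fail.
    transformed = []
    for row in extracted:
        try:
            transformed.append((int(row["id"]), row["name"].strip().title(), float(row["amount"])))
        except ValueError:
            continue  # dirty row rejected

    # Load: append the reconciled rows to the target table.
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE sales (id INTEGER, name TEXT, amount REAL)")
    db.executemany("INSERT INTO sales VALUES (?, ?, ?)", transformed)
    print(db.execute("SELECT * FROM sales").fetchall())  # [(1, 'Alice', 10.0)]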

Selection

The process of partitioning data according to predefined criteria is called ________.
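
Selection in miniature (the order records and the criterion are invented): the data are partitioned according to a predefined rule, here the region field.

    orders = [
        {"id": 1, "region": "east", "amount": 120},
        {"id": 2, "region": "west", "amount": 75},
        {"id": 3, "region": "east", "amount": 30},
    ]

    # Selection: partition the rows by a predefined criterion.
    east_orders = [o for o in orders if o["region"] == "east"]
    other_orders = [o for o in orders if o["region"] != "east"]
    print(len(east_orders), len(other_orders))  # 2 1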

user interfaces

User interaction integration is achieved by creating fewer ________ that feed different systems.

Dirty Data

________ can cause delays and extra work on information systems projects.

Application integration

________ is achieved by coordinating the flow of event information between business applications.

Data federation

________ provides a virtual view of integrated data without actually bringing the data into one physical database.
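
Loosely sketched in Python (the two "databases" here are just in-memory dictionaries invented for the example): a federated query builds its answer by reaching into each source at request time, rather than copying everything into one physical store.

    # Two separate sources that are never merged into a single physical database.
    crm_db = {"C1": {"name": "Ana"}, "C2": {"name": "Ben"}}
    billing_db = {"C1": {"balance": 250.0}, "C2": {"balance": 0.0}}

    def federated_customer_view(customer_id):
        """Virtual view: combine data from both sources at query time."""
        return {"id": customer_id,
                **crm_db.get(customer_id, {}),
                **billing_db.get(customer_id, {})}

    print(federated_customer_view("C1"))  # {'id': 'C1', 'name': 'Ana', 'balance': 250.0}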

F

T/F - Data are moved to the staging area before extraction takes place.

F

T/F Data federation consolidates all data into one database.

