Data Resource Management Exam 2 Chapter 10


Which of the following are key steps in a data quality program? -Apply TQM principles and practices. -Conduct a data quality audit. -Estimate return on investment. -All of the above.

All of the above.

The methods to ensure the quality of data across various subject areas are called: -Master Data Management. -Managed Data Management. -Joint Data Management. -Variable Data Management.

Master Data Management.

All of the following are popular architectures for Master Data Management EXCEPT: -Persistent Object. -Integration Hub. -Identity Registry. -Normalization.

Normalization.

The process of transforming data from a detailed to a summary level is called: -updating. -joining. -extracting. -aggregating.

aggregating.
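Aggregation can be sketched in a few lines. This is a minimal illustration using Python's built-in sqlite3 module; the table and column names (sales_detail, region, amount) are made up for the example, not from the source.

```python
import sqlite3

# Illustrative detail-level table; real warehouse staging tables vary.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales_detail (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales_detail VALUES (?, ?)",
    [("East", 100.0), ("East", 250.0), ("West", 75.0)],
)

# Aggregate: transform detail rows into one summary row per region.
summary = conn.execute(
    "SELECT region, SUM(amount) FROM sales_detail GROUP BY region ORDER BY region"
).fetchall()
# summary is [('East', 350.0), ('West', 75.0)]
```

The GROUP BY clause is what moves the data from a detailed to a summary level.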

Data may be loaded from the staging area into the warehouse by following: -special load utilities. -custom-written routines. -SQL Commands (Insert/Update). -all of the above.

all of the above.

High-quality data are data that are: -available in a timely fashion. -accurate. -consistent. -all of the above.

all of the above.

Loading data into a data warehouse involves: -purging data that have become obsolete or were incorrectly loaded. -appending new rows to the tables in the warehouse. -updating existing rows with new data. -all of the above.

all of the above.
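All three loading operations (append, update, purge) map directly onto SQL statements. A minimal sketch with Python's sqlite3, using a hypothetical dim_product table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE dim_product (sku TEXT PRIMARY KEY, price REAL, obsolete INTEGER)"
)

# Append new rows to the warehouse table.
conn.execute("INSERT INTO dim_product VALUES ('A1', 9.99, 0)")
conn.execute("INSERT INTO dim_product VALUES ('B2', 4.50, 0)")

# Update an existing row with new data.
conn.execute("UPDATE dim_product SET price = 10.49 WHERE sku = 'A1'")

# Purge rows that are obsolete or were incorrectly loaded.
conn.execute("UPDATE dim_product SET obsolete = 1 WHERE sku = 'B2'")
conn.execute("DELETE FROM dim_product WHERE obsolete = 1")

rows = conn.execute("SELECT sku, price FROM dim_product").fetchall()
# rows is [('A1', 10.49)]
```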

One way to improve the data capture process is to: -provide little or no training to data entry operators. -check entered data immediately for quality against data in the database. -not use any automatic data entry routines. -allow all data to be entered manually.

check entered data immediately for quality against data in the database.
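An immediate quality check at capture time can be as simple as validating each field against reference data before the record is stored. A sketch, assuming a hypothetical set of valid state codes standing in for data already in the database:

```python
# Hypothetical reference data standing in for values already in the database.
valid_state_codes = {"CA", "NY", "TX"}

def capture_record(customer_name: str, state: str) -> dict:
    """Check entered data for quality immediately, before storing it."""
    if not customer_name.strip():
        raise ValueError("customer_name is required")
    if state not in valid_state_codes:
        raise ValueError(f"unknown state code: {state!r}")
    return {"customer_name": customer_name.strip(), "state": state}

record = capture_record("Acme Corp", "CA")
# record is {'customer_name': 'Acme Corp', 'state': 'CA'}
```

Rejecting a bad value at entry is far cheaper than cleansing it downstream.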

A characteristic of reconciled data that means the data reflect an enterprise-wide view is: -comprehensive. -historical. -normalized. -detailed.

comprehensive.

All of the following are tasks of data cleansing EXCEPT: -generating primary keys for each row of a table. -creating foreign keys. -adding time stamps to distinguish values for the same attribute over time. -decoding data to make them understandable for data warehousing applications.

creating foreign keys.
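The three correct cleansing tasks named above (generating keys, adding time stamps, decoding coded values) can be sketched together. The code table and field names here are illustrative assumptions:

```python
from datetime import datetime, timezone
from itertools import count

# Hypothetical code table used to decode cryptic source values.
gender_codes = {"M": "Male", "F": "Female", "U": "Unknown"}
key_seq = count(1)

def cleanse(row: dict) -> dict:
    """Cleansing tasks: generate a primary key, time-stamp the row,
    and decode coded values into an understandable form."""
    return {
        "row_key": next(key_seq),                 # generated primary key
        "loaded_at": datetime.now(timezone.utc),  # time stamp to distinguish versions
        "gender": gender_codes.get(row["gender"], "Unknown"),  # decoded value
        "name": row["name"].strip().title(),      # standardized formatting
    }

clean = cleanse({"name": "  jane doe ", "gender": "F"})
# clean["gender"] is 'Female'; clean["name"] is 'Jane Doe'
```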

Data quality problems can cascade when: -data are not deleted properly. -data are copied from legacy systems. -there are data entry problems. -there is redundant data storage and inconsistent metadata.

data are copied from legacy systems.

Conformance means that: -data have been transformed. -data are stored in a way to expedite retrieval. -data are stored, exchanged, or presented in a format that is specified by their metadata. -none of the above.

data are stored, exchanged, or presented in a format that is specified by their metadata.

The best place to improve data entry across all applications is: -in the database definitions. -in the data entry operators. -in the users. -in the level of organizational commitment.

in the database definitions.

A method of capturing only the changes that have occurred in the source data since the last capture is called ________ extract. -incremental -partial -static -update-driven

incremental
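An incremental extract can be expressed as a filter on a last-modified timestamp. A minimal sketch, assuming each source row carries a `modified` field (an assumption for illustration):

```python
from datetime import datetime

# Hypothetical source rows, each with a last-modified timestamp.
source_rows = [
    {"id": 1, "modified": datetime(2024, 1, 5)},
    {"id": 2, "modified": datetime(2024, 2, 20)},
    {"id": 3, "modified": datetime(2024, 3, 1)},
]

def incremental_extract(rows, last_capture):
    """Capture only rows changed since the previous extract."""
    return [r for r in rows if r["modified"] > last_capture]

changed = incremental_extract(source_rows, datetime(2024, 2, 1))
# changed contains the rows with ids 2 and 3
```

A static extract, by contrast, would copy every row regardless of when it last changed.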

Data quality is important for all of the following reasons EXCEPT: -it minimizes project delay. -it helps to expand the customer base. -it aids in making timely business decisions. -it provides a stream of profit.

it provides a stream of profit.

The process of combining data from various sources into a single table or view is called: -joining. -extracting. -updating. -selecting.

joining.
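Joining two sources into a single view looks like this in SQL; a minimal sketch via Python's sqlite3, with illustrative customers/orders tables:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (cust_id INTEGER, name TEXT)")
conn.execute("CREATE TABLE orders (order_id INTEGER, cust_id INTEGER, total REAL)")
conn.execute("INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex')")
conn.execute("INSERT INTO orders VALUES (10, 1, 99.0), (11, 2, 45.0)")

# Join: combine the two sources into a single view.
conn.execute("""
    CREATE VIEW customer_orders AS
    SELECT c.name, o.order_id, o.total
    FROM customers c JOIN orders o ON c.cust_id = o.cust_id
""")
joined = conn.execute(
    "SELECT name, total FROM customer_orders ORDER BY name"
).fetchall()
# joined is [('Acme', 99.0), ('Globex', 45.0)]
```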

In the ________ approach, one consolidated record is maintained, and all applications draw on that one actual "golden" record. -integration hub -federated -identity registry -persistent

persistent

Data federation is a technique which: -provides a virtual view of integrated data without actually creating one centralized database. -creates an integrated database from several separate databases. -creates a distributed database. -provides a real-time update of shared data.

provides a virtual view of integrated data without actually creating one centralized database.
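The federation idea can be sketched by querying two independent databases on demand and merging the results, so no centralized copy is ever built. The two in-memory databases here stand in for separate source systems:

```python
import sqlite3

# Two independent source databases; no centralized database is created.
east = sqlite3.connect(":memory:")
west = sqlite3.connect(":memory:")
for db, rows in ((east, [("E1", 100.0)]), (west, [("W1", 60.0)])):
    db.execute("CREATE TABLE sales (sale_id TEXT, amount REAL)")
    db.executemany("INSERT INTO sales VALUES (?, ?)", rows)

def federated_sales():
    """Virtual integrated view: query each source at request time and merge."""
    result = []
    for db in (east, west):
        result.extend(db.execute("SELECT sale_id, amount FROM sales").fetchall())
    return result

all_sales = federated_sales()
# all_sales is [('E1', 100.0), ('W1', 60.0)]
```

Because the sources are queried at request time, the "integrated" data exist only for the duration of the query.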

The major advantage of data propagation is: -the ability to have trickle-feeds. -real-time cascading of data changes throughout the organization. -duplication of non-redundant data. -none of the above.

real-time cascading of data changes throughout the organization.

External data sources present problems for data quality because: -there is a lack of control over data quality. -there are poor data capture controls. -data are unformatted. -data are not always available.

there is a lack of control over data quality.

