Acct 503 Exam 1 Study Guide


what are the three non-financial perspectives of the balanced scorecard

1. Customer. This objective defines how the organization should appear to its customers if it is to accomplish its vision. 2. Internal business process. This objective specifies the processes the organization must excel at to satisfy its shareholders and customers. 3. Learning and growth. This objective indicates how an organization can improve its ability to change and improve in order to achieve its vision.

what are the five broad evolutionary categories of computerized decision support

1. Decision Support systems 2. Enterprise Information systems 3. Business Intelligence 4. Analytics 5. Big Data

what are the three types of business reporting and what do each report

1. Metric Management report- business performance is managed through outcome-oriented metrics. For external groups, these are service-level agreements. For internal management, they are key performance indicators (KPIs). Typically, there are enterprise-wide agreed targets to be tracked against over a period of time. 2. dashboard-type report- different performance indicators on one page, like a dashboard in a car 3. balanced scorecard-type reports- a method developed by Kaplan and Norton that attempts to present an integrated view of success in an organization. In addition to financial performance, balanced scorecard-type reports also include customer, business process, and learning and growth perspectives

what are the 10 different types of KPIs? what does each represent

1. Strategy. KPIs embody a strategic objective. 2. Targets. KPIs measure performance against specific targets. Targets are defined in strategy, planning, or budget sessions and can take different forms (e.g., achievement targets, reduction targets, absolute targets). 3. Ranges. Targets have performance ranges (e.g., above, on, or below target). 4. Encodings. Ranges are encoded in software, enabling the visual display of performance (e.g., green, yellow, red). Encodings can be based on percentages or more complex rules. 5. Time frames. Targets are assigned time frames by which they must be accomplished. A time frame is often divided into smaller intervals to provide performance mileposts. 6. Benchmarks. Targets are measured against a baseline or benchmark. The previous year's results often serve as a benchmark, but arbitrary numbers or external benchmarks may also be used. 7. Customer performance. Metrics for customer satisfaction, speed and accuracy of issue resolution, and customer retention. 8. Service performance. Metrics for service-call resolution rates, service renewal rates, service level agreements, delivery performance, and return rates. 9. Sales operations. New pipeline accounts, sales meetings secured, conversion of inquiries to leads, and average call closure time. 10. Sales plan/forecast. Metrics for price-to-purchase accuracy, purchase order-to-fulfillment ratio, quantity earned, forecast-to-plan ratio, and total closed contracts.
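
To make items 2-4 (targets, ranges, encodings) concrete, here is a minimal Python sketch; the KPI names, target values, and the 5% threshold are hypothetical illustrations, not from the text:

```python
# Hypothetical sketch: encode a KPI's performance against its target into ranges.
def encode_kpi(actual, target, on_target_band=0.05):
    """Return 'green', 'yellow', or 'red' based on the share of target achieved."""
    pct = actual / target
    if pct >= 1.0:                      # at or above target
        return "green"
    if pct >= 1.0 - on_target_band:     # within 5% below target
        return "yellow"
    return "red"                        # more than 5% below target

kpis = {"customer_retention_rate": (0.88, 0.90),   # (actual, target) - made-up numbers
        "service_call_resolution": (0.97, 0.95)}

for name, (actual, target) in kpis.items():
    print(name, encode_kpi(actual, target))
```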

what are the four BSC perspectives

1. customer 2. financial 3. internal business processes 4. learning and growth

what are the three elements of developing regression models

1. data assessment- examine scatter plots and correlations, 2. model fitting- transform data and estimate parameters, and 3. model assessment- test assumptions and assess model fit
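
A brief sketch of these three steps in Python (pandas/statsmodels on synthetic data; the variable names are invented):

```python
# Illustrative walk-through of data assessment, model fitting, and model assessment.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
df = pd.DataFrame({"x": rng.uniform(0, 10, 100)})
df["y"] = 3 + 2 * df["x"] + rng.normal(0, 1, 100)

# 1. Data assessment: scatter plot and correlation
print(df.corr())                       # correlation matrix
# df.plot.scatter(x="x", y="y")        # visual check for a linear pattern

# 2. Model fitting: transform data if needed and estimate parameters
model = sm.OLS(df["y"], sm.add_constant(df["x"])).fit()

# 3. Model assessment: test assumptions and assess model fit
print(model.rsquared)                  # goodness of fit
print(model.summary())                 # coefficient tests and diagnostics
```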

what are the four steps of data preprocessing

1. data consolidation 2. data cleaning 3. data transformation 4. data reduction

understand data preprocessing tasks and potential methods

1. data consolidation- access and collect data, select and filter data, integrate and unify the data 2. data cleaning- handle missing values in the data, identify and reduce noise in the data, find and eliminate erroneous data 3. data transformation- normalize the data, discretize or aggregate the data, construct new attributes 4. data reduction- reduce number of attributes, reduce number of records, balance skewed data
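
An illustrative pandas sketch of the four steps; the file names and columns are hypothetical:

```python
# Hedged example of consolidation, cleaning, transformation, and reduction.
import pandas as pd

# 1. Data consolidation: access, select, and integrate data
orders = pd.read_csv("orders.csv")            # hypothetical source files
customers = pd.read_csv("customers.csv")
data = orders.merge(customers, on="customer_id")

# 2. Data cleaning: handle missing values, reduce noise, drop erroneous rows
data["amount"] = data["amount"].fillna(data["amount"].median())
data = data[data["amount"] >= 0]              # remove impossible values

# 3. Data transformation: normalize, discretize, construct new attributes
data["amount_z"] = (data["amount"] - data["amount"].mean()) / data["amount"].std()
data["amount_bin"] = pd.qcut(data["amount"], q=4, labels=False)

# 4. Data reduction: fewer attributes, fewer records (rebalance skewed data as needed)
data = data.drop(columns=["amount"]).sample(frac=0.5, random_state=42)
```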

what are the 11 different components of the analytics ecosystem

1. data generation infrastructure providers 2. analytics-focused software providers 3. data service providers 4. middleware providers 5. data warehouse providers 6. data management infrastructure providers 7. application developers: industry specific or general 8. academic institutions or certification agencies 9. analytics industry analysts and influencers 10. regulators and policy makers 11. analytics user organizations

what are the 10 characteristics that define the readiness levels of data

1. data source reliability 2. data content accuracy 3. data accessibility 4. data security and privacy 5. data richness 6. data consistency 7. data currency/timeliness 8. data granularity 9. data validity 10. data relevancy

how is each defined

1. data source reliability- the originality and appropriateness of the storage medium where the data is obtained—answering the question of "Do we have the right confidence and belief in this data source?" 2. data content accuracy- means that data are correct and are a good match for the analytics problem—answering the question of "Do we have the right data for the job?" 3. data accessibility- means that the data are easily and readily obtainable—answering the question of "Can we easily get to the data when we need to?" 4. data security and privacy- means that the data is secured to only allow those people who have the authority and the need to access it and to prevent anyone else from reaching it. 5. data richness- means that all the required data elements are included in the data set. In essence, richness (or comprehensiveness) means that the available variables portray a rich enough dimensionality of the underlying subject matter for an accurate and worthy analytics study 6. data consistency- means that the data are accurately collected and combined/merged 7. data currency/timeliness- means that the data should be up-to-date (or as recent/new as it needs to be) for a given analytics model 8. data granularity- requires that the variables and data values be defined at the lowest (or as low as required) level of detail for the intended use of the data 9. data validity- a term used to describe a match/mismatch between the actual and expected data values of a given variable 10. data relevancy - means that the variables in the data set are all relevant to the study being conducted

what are the 6 major components of the data warehousing process

1. data sources 2. data extraction and transformation 3. data loading 4. comprehensive database 5. metadata 6. middleware tools

what are the 5 alternative data warehousing architectures

1. independent data marts 2. data mart bus architecture 3. hub and spoke architecture 4. centralized data warehouse 5. federated data warehouse

how does each alternative approach data warehousing

1. independent data marts- The DMs are developed to operate independently of each other to serve the needs of individual organizational units 2. data mart bus architecture- the individual marts are linked to each other via some kind of middleware. Because the data are linked among the individual marts, there is a better chance of maintaining data consistency across the enterprise 3. hub and spoke architecture- Here the attention is focused on building a scalable and maintainable infrastructure (often developed in an iterative way, subject area by subject area) that includes a centralized data warehouse and several dependent DMs (each for an organizational unit) 4. centralized data warehouse- a gigantic EDW that serves the needs of all organizational units. This centralized approach provides users with access to all data in the data warehouse instead of limiting them to DMs. In addition, it reduces the amount of data the technical team has to transfer or change, therefore simplifying data management and administration 5. federated data warehouse- a concession to the natural forces that undermine the best plans for developing a perfect system. It uses all possible means to integrate analytical resources from multiple sources to meet changing needs or business conditions. Essentially, the federated approach involves integrating disparate systems

what are the five assumptions of linear regression

1. linearity- This assumption states that the relationship between the response variable and the explanatory variables is linear. 2. independence- (of errors). This assumption states that the errors of the response variable are uncorrelated with each other. 3. normality- (of errors). This assumption states that the errors of the response variable are normally distributed. 4. constant variance- (of errors). This assumption, also called homoscedasticity, states that the response variables have the same variance in their error, regardless of the values of the explanatory variables. 5. multicollinearity- This assumption states that the explanatory variables are not correlated (i.e., they do not replicate the same information but provide different perspectives of the information needed for the model).
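
A rough sketch of how some of these assumptions might be checked in Python (statsmodels/scipy on synthetic data; the thresholds in the comments are common rules of thumb, not from the text):

```python
# Illustrative assumption checks for a fitted OLS model.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from scipy import stats
from statsmodels.stats.stattools import durbin_watson
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(1)
X = pd.DataFrame({"x1": rng.normal(size=200), "x2": rng.normal(size=200)})
y = 1 + 2 * X["x1"] - 3 * X["x2"] + rng.normal(size=200)

Xc = sm.add_constant(X)
model = sm.OLS(y, Xc).fit()
resid = model.resid

# Linearity / constant variance: inspect residuals vs. fitted values (visual check)
# pd.DataFrame({"fitted": model.fittedvalues, "resid": resid}).plot.scatter(x="fitted", y="resid")

# Independence of errors: Durbin-Watson near 2 suggests little autocorrelation
print(durbin_watson(resid))

# Normality of errors: Shapiro-Wilk test on the residuals
print(stats.shapiro(resid))

# Multicollinearity: variance inflation factors (values above ~5-10 are a warning sign)
for i, col in enumerate(X.columns, start=1):   # skip the constant at index 0
    print(col, variance_inflation_factor(Xc.values, i))
```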

what are the 5 commonly used OLAP operations

1. slice 2. dice 3. Drill down/up 4. roll up 5. pivot
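
Rough pandas analogies of these operations on a toy data cube (the table and values are invented for illustration):

```python
# Slice, dice, roll up, drill down, and pivot approximated with pandas.
import pandas as pd

sales = pd.DataFrame({
    "year":    [2023, 2023, 2024, 2024, 2024],
    "region":  ["East", "West", "East", "West", "East"],
    "product": ["A", "A", "B", "B", "A"],
    "amount":  [100, 150, 200, 120, 90],
})

# Slice: fix one dimension (e.g., year == 2024)
slice_2024 = sales[sales["year"] == 2024]

# Dice: fix values on two or more dimensions
dice = sales[(sales["year"] == 2024) & (sales["region"] == "East")]

# Roll up / drill down: aggregate to a coarser level, or expand to a finer one
rollup = sales.groupby("year")["amount"].sum()
drilldown = sales.groupby(["year", "region", "product"])["amount"].sum()

# Pivot: rotate dimensions into rows and columns
pivot = sales.pivot_table(values="amount", index="region",
                          columns="year", aggfunc="sum")
print(pivot)
```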

what are the four major components of BI

A BI system has four major components: 1. a DW, with its source data; 2. business analytics, a collection of tools for manipulating, mining, and analyzing the data in the DW; 3. BPM (business performance management) for monitoring and analyzing performance; and 4. a user interface (e.g., a dashboard).

business report

Any communication artifact prepared with the specific intention of conveying information in a digestible form to whoever needs it, whenever and wherever they may need it.

what is Big Data Analytics

Big Data is data that cannot be stored in a single storage unit. Big Data typically refers to data that comes in many different forms: structured, unstructured, in a stream, and so forth. There are two aspects to managing data on this scale: storing and processing. Storing: the data is stored in chunks on different machines connected by a network, with a copy or two of each chunk kept in different locations on the network, both logically and physically (Hadoop Distributed File System [HDFS]). Processing: computation is pushed to the data, instead of pushing data to a computing node (Hadoop MapReduce).
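
A tiny single-machine illustration of the map/shuffle/reduce idea; a real Hadoop cluster would distribute these steps across the machines that hold the data chunks:

```python
# Word count expressed as map, shuffle, and reduce steps (illustration only).
from collections import defaultdict

documents = ["big data needs big storage", "push computation to the data"]

# Map: emit (key, value) pairs from each input record
mapped = [(word, 1) for doc in documents for word in doc.split()]

# Shuffle: group values by key
groups = defaultdict(list)
for word, count in mapped:
    groups[word].append(count)

# Reduce: combine the values for each key
word_counts = {word: sum(counts) for word, counts in groups.items()}
print(word_counts)   # e.g., {'big': 2, 'data': 2, ...}
```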

what are the two main categories of structured data and the two subcategories below each

Categorical or numeric. Categorical data can be subdivided into nominal or ordinal data (examples of categorical variables include race, sex, age group, and educational level; for instance, the variable marital status can be generally categorized as (1) single, (2) married, and (3) divorced), whereas numeric data can be subdivided into interval or ratio data. Nominal data can be represented with binomial values having two possible values (e.g., yes/no, true/false, good/bad), or multinomial values having three or more possible values (e.g., brown/green/blue, white/black/Latino/Asian, single/married/divorced).
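
A short pandas sketch of nominal versus ordinal (categorical) and numeric columns; the example values are hypothetical:

```python
# Representing the data taxonomy with pandas dtypes.
import pandas as pd

# Nominal: categories with no inherent order (e.g., marital status)
marital = pd.Categorical(["single", "married", "divorced", "married"])

# Ordinal: categories with a meaningful order (e.g., education level)
education = pd.Categorical(["HS", "BS", "MS", "BS"],
                           categories=["HS", "BS", "MS", "PhD"], ordered=True)

# Numeric (interval/ratio) data stays as plain numeric columns
df = pd.DataFrame({"marital": marital, "education": education,
                   "age": [25, 31, 42, 28]})
print(df.dtypes)
```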

what is correlation versus regression

Correlation makes no a priori assumption of whether one variable is dependent on the other(s) and is not concerned with the relationship between variables; instead it gives an estimate on the degree of association between the variables. On the other hand, regression attempts to describe the dependence of a response variable on one (or more) explanatory variables where it implicitly assumes that there is a one-way causal effect from the explanatory variable(s) to the response variable, regardless of whether the path of effect is direct or indirect. Also, although correlation is interested in the low-level relationships between two variables, regression is concerned with the relationships between all explanatory variables and the response variable
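
A quick numeric illustration of the distinction using scipy on synthetic data:

```python
# Correlation is a symmetric measure of association; regression models a dependence.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
x = rng.uniform(0, 10, 50)
y = 2 * x + rng.normal(0, 2, 50)

# Correlation: degree of association, no dependent/independent roles assumed
r, p_value = stats.pearsonr(x, y)
print("correlation r:", r)

# Regression: models y as a function of x (explanatory -> response)
result = stats.linregress(x, y)
print("slope:", result.slope, "intercept:", result.intercept)
```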

how does data management change based on data technology-related changes?

Data technology evolved from domain experts manually executing and documenting interviews and surveys, to rule-based expert systems, to ERP (enterprise resource planning) systems, to dashboards and visualizations, to data warehousing for on-time reporting, and finally to big data, changing the way data is managed along the way.

what are the two primary questions of each analytics category

Descriptive - 1. what happened? 2. what is happening? Predictive - 1. what will happen? 2. why will it happen? Prescriptive - 1. what should I do? 2. why should I do it?

how do ERPs play into computerized decision support? Do they continue to be valuable?

ERP - Enterprise Resource Planning. With ERP, all the data from every corner of the enterprise is collected and integrated into a consistent schema so that every part of the organization has access to the single version of the truth when and where needed.

What is the EEE Approach

Exposure, Experience, and Exploration. A method of teaching that the book uses to describe analytics.

what is simple versus multiple regression

If the regression equation is built between one response variable and one explanatory variable, then it is called simple regression. For instance, the regression equation built to predict/explain the relationship between the height of a person (explanatory variable) and the weight of a person (response variable) is a good example of simple regression. Multiple regression is the extension of simple regression, where there is more than one explanatory variable. For instance, in the previous example, if we were to include not only the height of the person but also other personal characteristics (e.g., BMI, gender, ethnicity) to predict the weight of a person, then we would be performing multiple regression analysis.
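
A sketch of the height/weight example with the statsmodels formula API; the data values and the bmi/gender columns are made-up stand-ins:

```python
# Simple regression (one explanatory variable) vs. multiple regression (several).
import pandas as pd
import statsmodels.formula.api as smf

people = pd.DataFrame({
    "height": [160, 172, 181, 168, 190],
    "bmi":    [22.1, 24.3, 26.0, 21.5, 27.8],
    "gender": ["F", "M", "M", "F", "M"],
    "weight": [57, 72, 85, 61, 100],
})

# Simple regression: weight explained by height alone
simple = smf.ols("weight ~ height", data=people).fit()

# Multiple regression: weight explained by several characteristics
multiple = smf.ols("weight ~ height + bmi + C(gender)", data=people).fit()

print(simple.params)
print(multiple.params)
```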

what does balance mean as it relates to the BSC

In BSC, the term balance arises because the combined set of measures is supposed to encompass indicators that are Financial and nonfinancial, Leading and lagging, Internal and external, Quantitative and qualitative, and Short term and long term

what are the differences between an OLTP and an OLAP

OLTP (online transaction processing system) is a term used for a transaction system that is primarily responsible for capturing and storing data related to day-to-day business functions such as ERP, CRM, SCM, POS, and so forth. An OLTP system addresses a critical business need, automating daily business transactions, and running real-time reports and routine analysis. But these systems are not designed for ad hoc analysis and complex queries that deal with a number of data items. OLAP, on the other hand, is designed to address this need by providing ad hoc analysis of organizational data much more effectively and efficiently. OLAP and OLTP rely heavily on each other: OLAP uses the data captured by OLTP, and OLTP automates the business processes that are managed by decisions supported by OLAP
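
A toy contrast using an in-memory SQLite database (the table and data are invented); the point is the workload pattern, not the technology:

```python
# OLTP-style transactional writes versus OLAP-style analytical aggregation.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER, region TEXT, amount REAL)")

# OLTP-style work: small, frequent transactional writes from daily operations
conn.execute("INSERT INTO sales VALUES (1, 'East', 100.0)")
conn.execute("INSERT INTO sales VALUES (2, 'West', 250.0)")
conn.commit()

# OLAP-style work: ad hoc analytical queries that aggregate across many rows
for row in conn.execute("SELECT region, SUM(amount) FROM sales GROUP BY region"):
    print(row)
```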

what are the roles of each group in the analytics ecosystem

The outer six petals can be broadly termed as the technology providers. Their primary revenue comes from providing technology, solutions, and training to analytics user organizations so they can employ these technologies in the most effective and efficient manner. The inner petals can be generally defined as the analytics accelerators. The accelerators work with both technology providers and users. Finally, the core of the ecosystem comprises the analytics user organizations. This is the most important component, as every analytics industry cluster is driven by the user organizations

what is the difference between transaction processing and analytic processing

Transaction processing systems handle a company's routine ongoing business. In contrast, a DW is typically a distinct system that provides storage for data that will be used for analysis. The intent of that analysis is to give management the ability to scour data for information about the business, and it can be used to provide tactical or operational decision support.

what is the DMAIC performance model

a closed-loop business improvement model, and it encompasses the steps of defining, measuring, analyzing, improving, and controlling a process. The steps can be described as follows: 1. Define. Define the goals, objectives, and boundaries of the improvement activity. At the top level, the goals are the strategic objectives of the company. At lower levels—department or project levels—the goals are focused on specific operational processes. 2. Measure. Measure the existing system. Establish quantitative measures that will yield statistically valid data. The data can be used to monitor progress toward the goals defined in the previous step. 3. Analyze. Analyze the system to identify ways to eliminate the gap between the current performance of the system or process and the desired goal. 4. Improve. Initiate actions to eliminate the gap by finding ways to do things better, cheaper, or faster. Use project management and other planning tools to implement the new approach. 5. Control. Institutionalize the improved system by modifying compensation and incentive systems, policies, procedures, manufacturing resource planning, budgets, operation instructions, or other management systems.

what is Six Sigma

a performance management methodology aimed at reducing the number of defects in a business process to as close to zero defects per million opportunities (DPMO) as possible. Six Sigma provides the means to measure and monitor key processes related to a company's profitability and to accelerate improvement in overall business performance. Because of its focus on business processes, Six Sigma also provides a straightforward way to address performance problems after they are identified or detected.
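
For reference, the standard DPMO calculation mentioned above can be sketched in a few lines (the counts below are made up):

```python
# Defects per million opportunities (DPMO) with hypothetical process counts.
defects = 12
units = 5000
opportunities_per_unit = 4

dpmo = defects / (units * opportunities_per_unit) * 1_000_000
print(round(dpmo))   # 600 defects per million opportunities in this example
```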

what does enterprise application integration mean

a vehicle for pushing data from source systems into the data warehouse. It involves integrating application functionality and is focused on sharing functionality (rather than data) across systems, thereby enabling flexibility and reuse.

what does enterprise information integration mean

an evolving tool space that promises real-time data integration from a variety of sources, such as relational databases, Web services, and multidimensional databases. It is a mechanism for pulling data from source systems to satisfy a request for information

what are the main elements of changing business environments and evolving needs for decision support and analytics

big data, accurate data, and the need to make the right decisions faster

what is BPM? What is a performance measurement system as it relates to BPM?

Business performance management (BPM) refers to the business processes, methodologies, metrics, and technologies used by enterprises to measure, monitor, and manage business performance. It encompasses three key components (Colbert, 2009): 1. A set of integrated, closed-loop management and analytic processes (supported by technology) that addresses financial as well as operational activities 2. Tools for businesses to define strategic goals and then measure and manage performance against those goals 3. A core set of processes, including financial and operational planning, consolidation and reporting, modeling, analysis, and monitoring of key performance indicators (KPIs), linked to organizational strategy

what are the three broad sources of data from which business analytics comes

business processes, Internet/social media, and machines/Internet of Things

what are the 6 elements of the data taxonomy

categorical (nominal or ordinal) and numeric (interval or ratio)

what does it mean to have data "analytics ready"

data has to have a certain data structure in place with key fields/variables with properly normalized values. Furthermore, there must be an organization-wide agreed-on definition for common variables and subject matters (sometimes also called master data management), such as how you define a customer (what characteristics of customers are used to produce a holistic enough representation to analytics) and where in the business process the customer-related information is captured, validated, stored, and updated

what has to match related to data and its usability

data has to match with (have the coverage of the specifics for) the task for which it is intended to be used. Even for a specific task, the relevant data on hand needs to comply with the quality and quantity requirements. Essentially, data has to be analytics ready

what are the three categories of business analytics

descriptive, predictive, and prescriptive

what is the ETL process

extraction (i.e., reading data from one or more databases), transformation (i.e., converting the extracted data from its previous form into the form in which it needs to be so that it can be placed into a data warehouse or simply another database), and load (i.e., putting the data into the data warehouse).
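
A minimal ETL sketch with pandas and a SQLite "warehouse"; the source file and column names are hypothetical:

```python
# Extract, transform, and load in a few lines (illustration only).
import sqlite3
import pandas as pd

# Extract: read data from one or more sources
raw = pd.read_csv("daily_sales.csv")               # hypothetical source file

# Transform: convert the data into the form the warehouse expects
raw["order_date"] = pd.to_datetime(raw["order_date"])
clean = raw.dropna(subset=["amount"]).copy()
clean["amount_usd"] = clean["amount"].round(2)

# Load: put the transformed data into the warehouse table
warehouse = sqlite3.connect("warehouse.db")
clean.to_sql("sales_fact", warehouse, if_exists="append", index=False)
```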

how does group communication and collaboration change based on BI, Analytics, and Data Science

Group communication and collaboration can occur even if members are in different locations. Similarly, the data that is managed can be stored in-house or outside, and new systems make it easy to search, store, and transmit data quickly, cheaply, and securely. With more data and analysis technologies, more alternatives can be evaluated, forecasts may be improved, risk analysis can be performed quickly, and the views of experts can be collected quickly and at low cost.

what is real-time BI

instant, on-demand access to dispersed information to close the gap between the operational data and strategic objectives

what is an OLAP

online analytical processing (OLAP)- a database infrastructure that was always online and contained all the information from the OLTP systems, including historical data, but reorganized and structured in such a way that it was fast and efficient for querying, analysis, and decision support.

what is OLTP

online transaction processing (OLTP) systems handle a company's routine ongoing business.

what are the sub categories for unstructured data

textual, multimedia, and XML/JSON

what is the summarized difference between BSC & Six Sigma

the main difference is that BSC is focused on improving overall strategy, whereas Six Sigma is focused on improving processes.

what is dimensional reduction or variable selection as it relates to data preparation

to reduce the number of variables down to a manageable size; that is, to reduce the dimensions in the data into a more manageable and most relevant subset.
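
A short scikit-learn sketch contrasting variable selection with dimensionality reduction on synthetic data:

```python
# Variable selection keeps a subset of the original variables;
# dimensionality reduction projects them onto fewer derived components.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.feature_selection import SelectKBest, f_regression

rng = np.random.default_rng(3)
X = rng.normal(size=(100, 20))            # 20 original variables
y = X[:, 0] * 2 + rng.normal(size=100)

# Variable selection: keep the k most relevant original variables
selector = SelectKBest(score_func=f_regression, k=5).fit(X, y)
X_selected = selector.transform(X)        # shape (100, 5)

# Dimensionality reduction: project onto a smaller set of derived components
X_reduced = PCA(n_components=5).fit_transform(X)   # shape (100, 5)
print(X_selected.shape, X_reduced.shape)
```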

