CMA Chapter 16


What are the advantages of cloud computing? What are the primary categories of cloud services?

Cloud computing ("the cloud") is a popular term relating to on-demand access to resources that are accessed on the Internet and shared by others. Advantages of using cloud computing include fast access to software, a reduced need for investment in IT infrastructure, and the ability to use "pay as you go" services. There are three primary categories of cloud services: 1. Infrastructure-as-a-Service (IaaS) 2. Platform-as-a-Service (PaaS) PaaS is a form of cloud computing wherein a third-party provides both hardware and software tools. These tools are typically for application development and you build your own cloud. 3. Software-as-a-Service (SaaS) SaaS is a software distribution model in which customers can subscribe to web-based applications. An example is a cloud storage service like Dropbox, which enables customers to store, share, and synchronize files across devices. Benefits of SaaS include: A. Users do not have to buy, maintain, support, or update as much computer hardware or software. SaaS products are maintained and updated by the third-party host. B. Users can subscribe for only the amount of time the service is needed.

Delphi Approach

The Delphi approach solicits opinions from experts, summarizes the opinions, and feeds the summaries back to the experts (without revealing participants to each other). The process is repeated until the opinions converge on an optimal solution.

What is the final step that is needed before placing a system in live operation?

User acceptance testing is the final step before placing the system in live operation. 1. IT must demonstrate to the users who submitted the original request that the system performs the desired functions. 2. Once the users are satisfied with the new system, they acknowledge formal acceptance and implementation begins.

What-If Analysis

What-if analysis is the process of determining the effects on a model's outcomes of changes in scenarios or assumptions. Goal seeking occurs when the decision maker has a specific outcome in mind and needs to determine how it can be achieved.
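As an illustration, here is a minimal goal-seeking sketch in Python. The profit model, price, costs, and target are made-up assumptions, not from the text; it searches by bisection for the sales volume that hits a target profit.

```python
def profit(units, price=50.0, variable_cost=30.0, fixed_costs=100_000.0):
    # Hypothetical model: contribution margin per unit times volume, less fixed costs.
    return units * (price - variable_cost) - fixed_costs

def goal_seek(target, lo=0.0, hi=1_000_000.0, tol=0.01):
    # Narrow the interval until profit(units) is close enough to the target.
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if profit(mid) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# How many units must be sold to earn a $40,000 profit?
print(round(goal_seek(40_000)))  # -> 7000
```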

Four strategies to convert to the new system

1. Parallel Operation. The old and new systems both run at full capacity for a given period. (Safest strategy)
2. Direct Changeover Conversion. The old system is shut down and the new one takes over. (Least expensive method)
3. Pilot Conversion. The new system is converted one department or division at a time.
4. Phased Conversion. One function of the new system at a time is placed into operation, e.g., AR is installed, then AP, then cash management, etc.

Steps of Data Mining

1. The first step in data mining is finding an anomaly, such as an outlier, cluster, unexpected change, or a deviation from what was expected. This involves identifying unusual data records, which might be interesting, or perhaps data errors that could require further investigation.
2. The next step is to find relationships between variables and then cluster (group) those relationships in some manner. Clustering is not always easy because the group structure is not known in advance.
3. Once the data have been clustered, the next step is to generalize the relationships so that the demonstrated structure applies to new data as well as the original database.
4. Regression analysis, including both linear regression and multiple regression, attempts to find a quantitative function or equation that models the data with the least error, that is, to estimate the relationships among the data or datasets.
5. The final step in data mining provides a representation of the data set, including visualizations and reports.
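A small sketch of the first step (anomaly detection) under simple assumptions: flag any value more than two standard deviations from the mean. The sales figures and the two-standard-deviation rule are illustrative choices, not a prescribed method.

```python
import numpy as np

# Hypothetical daily sales amounts; one value was keyed in incorrectly.
sales = np.array([210, 225, 198, 240, 232, 215, 228, 2050, 220, 205], dtype=float)

# Flag values more than 2 standard deviations from the mean as possible anomalies.
z_scores = (sales - sales.mean()) / sales.std()
outliers = sales[np.abs(z_scores) > 2]
print("possible anomalies:", outliers)  # -> [2050.]
```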

Limitations of Big Data (and Other Data Analytics)

1. User-level data results are incomplete. Typically the data a company has are restricted to its own customers or people who visited its website. The data are representative only of its target market and customers; thus, untapped markets could exist.
2. Explaining why the analysis results are what they are is difficult.
3. Data are subject to useless information (noise). A single incorrect or useless variable (e.g., randomly entered information) can corrupt the results.
4. User-level data results require interpretation prior to use. Collected data are generally in text format, and someone needs to convert them into visualizations. Visualizations assist with identifying trends and correlations that might not be caught from the text data alone.

Phases of a traditional SDLC

A. Initiation, Feasibility, and Planning. The SDLC begins by recognizing there is a need for a new system.
B. Requirements Analysis. What technology will the new system require? How will the new system affect operations?
C. System Conceptual Design
D. Building and Development
E. Testing and Quality Control

AI

AI is computer software designed to perceive, reason, and understand. Historically, computer software works through a series of if/then conditions in which every operation has exactly two possible outcomes (yes/no, on/off, true/false, one/zero). AI attempts to imitate human decision making. The advantages of AI in a business environment are that AI systems:
1. Can work 24 hours a day
2. Will not become ill, die, or be hired away
3. Are extremely fast processors of data, especially if numerous rules (procedures) must be evaluated

Big Data

Big data is any amount of data that has the potential to be mined for information to predict future outcomes.
Structured data refers to data that are highly organized into predefined groupings and are typically maintained in relational databases. The data are predefined such that each item falls into a specific anticipated data type (e.g., string, float, integer, date, boolean) that can easily be sorted and searched by computer programs. For example, sales data are mostly structured.
Semi-structured data refers to data that are not as highly organized as structured data but still have some identifying information that can be used for organization by computer programs. With certain processes, semi-structured data can be stored in relational databases and handled in the same way as structured data. For example, XML and XBRL data can be converted and stored in relational databases for analysis.
Unstructured data refers to information that has little or no predefined organizational structure. This lack of organization makes such data much more difficult for computer programs to search, sort, and analyze. For example, audio, video, and images are data types that are difficult for computer programs to analyze.
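To illustrate the semi-structured case, here is a minimal sketch using Python's standard xml.etree.ElementTree module. The sales document and its fields are invented for the example; it shows XML records being converted into structured rows that could be loaded into a relational table.

```python
import xml.etree.ElementTree as ET

# Hypothetical semi-structured XML sales data.
xml_doc = """<sales>
  <sale><date>2024-01-15</date><amount>199.99</amount><region>East</region></sale>
  <sale><date>2024-01-16</date><amount>84.50</amount><region>West</region></sale>
</sales>"""

# Convert each <sale> element into a structured row with typed fields.
rows = []
for sale in ET.fromstring(xml_doc).findall("sale"):
    rows.append({
        "date": sale.findtext("date"),
        "amount": float(sale.findtext("amount")),
        "region": sale.findtext("region"),
    })

print(rows)
```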

Business Intelligence (BI)

BI uses advanced software to analyze large amounts of data about a business and transforms them into information useful to upper management about the present status of the company, along with recommendations. It presents the data as graphs, pie charts, etc. These displays are sometimes grouped together in what is called a digital dashboard.

Blockchain

Background: In the late 15th century, the double-entry system and the general ledger were revolutionary concepts that changed commerce. The invention of the double-entry system allowed third parties to track the finances of a business. The firm maintains a private ledger of its transactions while a trusted independent party (i.e., the auditor) confirms the ledger's accuracy.
A blockchain is an innovative technology that has the potential to change accounting. A blockchain is a type of digital database (or ledger) that provides proof of who owns what at any moment in time because each transaction has been added to the ledger. It is a distributed ledger that is encrypted, public, and shared widely for anyone to view. Every transaction makes the ledger grow larger. Data are not centralized; therefore, there is no central administrator. The term "blockchain" derives from the nature of the ledger: it contains the history of every transaction and forms a chain. When a change of ownership occurs, a new "block" is linked to the original chain to form a new chain. Cryptocurrency uses a blockchain to track who owns what portion of bitcoins and when ownership changes.
A key element of a blockchain is a consensus mechanism, a cryptographic process that takes control of the ledger away from any one party and allows it to be examined and maintained by multiple independent entities; no centralized organization controls the chain. This is similar to Google Docs: users can edit a document at the same time, the most updated version is always available, and all users must agree to any changes made. In accounting, a company's ledger can be secretly edited and the changes may not be obvious. Blockchain technology would make such editing immediately obvious because third parties would hold a ledger that differs from the company's.
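A rough, simplified sketch of the chaining idea (not how any real blockchain or consensus mechanism is implemented), using Python's hashlib: each block stores the previous block's hash, so editing an earlier block breaks every later link.

```python
import hashlib
import json

def block_hash(block):
    # Hash the block's contents, including the previous block's hash,
    # so any edit to an earlier block invalidates every later block.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

chain = []
previous_hash = "0" * 64  # placeholder for the first ("genesis") block

for txn in [{"from": "A", "to": "B", "amount": 10},
            {"from": "B", "to": "C", "amount": 4}]:
    block = {"txn": txn, "prev_hash": previous_hash}
    previous_hash = block_hash(block)
    chain.append({**block, "hash": previous_hash})

# Verification: recompute each hash and compare it to the stored value.
for i, block in enumerate(chain):
    body = {"txn": block["txn"], "prev_hash": block["prev_hash"]}
    assert block_hash(body) == block["hash"], f"Block {i} was tampered with"
```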

What does the IT Steering Committee do?

Because IT pervades every aspect of operations in a modern organization, the IT steering committee must study each request for a new process and either approve or deny it. When a business process has been approved, a project team is created (including a representative of the department that requested it and IT personnel), and the team designs and builds the software.

Data Analytics: What are the 5 stages of Data Analytics?

Data analytics uses qualitative and quantitative methods to retrieve data from data sources and analyze them, whereas BI uses software that performs the analysis for you. There are 5 stages of data analytics:
1. Define Questions. Create clearly defined goals.
2. Obtain Relevant Data.
3. Clean Data.
4. Analyze Data. Analyze the data to determine whether they are the exact data needed. Data analytic methods include: Descriptive analysis is the most basic and most used method; it concentrates on reporting actual results. Diagnostic analysis provides insight into the reason certain results occurred. Predictive analysis involves applying assumptions to data and predicting future results. Prescriptive analysis concentrates on what an organization needs to do in order for the predicted future results to actually occur; it tells the company what it needs to do to get where it wants. Anomaly detection is used to identify unusual patterns or deviations from the norm or expected results. Network analysis consists of analyzing network data and statistics to find patterns. Text analysis involves the use of text mining and natural language algorithms to find patterns in unstructured text.
5. Communicate Results to appropriate management.

Key Terminologies

Data Management. Data need to be high-quality and complete before they can be reliably analyzed. Thus, businesses need to (1) establish processes that build and maintain standards for high-quality data and (2) establish a master data management program.
Hadoop is open-source, Java-based software that stores large amounts of data and runs applications on clusters of commodity hardware.
In-Memory Analytics analyzes data from system memory instead of secondary storage.
Predictive Analytics uses a combination of data, statistical algorithms, and machine-learning techniques (such as expert systems) to identify the probability of future outcomes based on historical data.
Text Mining analyzes text data from the Web, comment fields, books, and other text-based sources through the use of machine learning or natural language processing technology. It can be used to identify new topics and term relationships.

Data Mining

Data mining is the search for unexpected relationships (anomalies) among data. It combines information technology with statistics to analyze data from different perspectives and summarize them into useful information. It involves continuous review and rethinking of the data and may take many trials and errors to find an unexpected relationship that is useful. The classic example of the use of data mining was the discovery by convenience stores that diapers and beer often appeared on the same sales transaction in the late evening. Another example is the process Netflix uses to make decisions about which new shows to acquire and produce. Data scientists at Netflix gather information on trends in what and how consumers are watching, then draw conclusions from those trends on what consumers are most likely to watch in the future.

Exploratory Data Analysis (EDA)

EDA is different from traditional methods. EDA encourages the data themselves to reveal their underlying structure rather than prematurely imposing a particular model or method. It is often the first step in data analysis. The main role of EDA is to explore open-mindedly in order to gain new insight. EDA often uses visual and graphical tools such as histograms and scatter plots.
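A minimal sketch of those two EDA tools using matplotlib and NumPy; the advertising and sales figures are randomly generated stand-ins for real data.

```python
import matplotlib.pyplot as plt
import numpy as np

# Hypothetical data: ad spend and the resulting sales, 60 observations.
rng = np.random.default_rng(0)
ad_spend = rng.uniform(1_000, 10_000, size=60)
sales = 5 * ad_spend + rng.normal(0, 4_000, size=60)

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
ax1.hist(sales, bins=15)        # histogram: distribution of sales
ax1.set_title("Histogram of sales")
ax2.scatter(ad_spend, sales)    # scatter plot: relationship between two variables
ax2.set_title("Ad spend vs. sales")
plt.tight_layout()
plt.show()
```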

Limitations of Visualizations and Best Practices for Visualization Tools

Limitations:
1. Visualizations can be misleading. The person who creates an image can influence the viewer's interpretation of that image.
2. Visualization tools can be manipulated to present a biased picture.
Best Practices:
1. Utilize the full axis by ensuring the axis begins at zero and does not skip values.
2. Do not overload the design with too much information.
3. Limit the number of colors used to ensure differences among data points stand out. Avoid using variant shades of the same color (e.g., use red, not light red, soft red, medium red, etc.).
4. Ensure the design does not require the user to interpret relationships among data.
5. Ask others for feedback on the design before finalization.

Program Change Control

Over the life of an application, users are constantly asking for changes. The process of managing these changes is referred to as systems maintenance, and the relevant controls are called program change controls. To make a change to the system, the programmer should have a test environment, called a "sandbox," in which to write and test new code. Once the change is approved and accepted by users, it is moved into the live system. * Only in emergencies should the live system be updated directly without first being tested and passing through these phases.

STUDY THE VISUALIZATION CHARTS

PAGE 665

Regression Analysis: Simple Regression and Multiple Regression

Regression analysis is the process of deriving a linear equation that describes the relationship between two variables. Simple regression is used for one independent variable; multiple regression is used for more than one.
Simple regression: y = a + bx, where y = the dependent variable, a = the y-intercept, b = the slope of the regression line, and x = the independent variable.
R² (the coefficient of determination) is the coefficient of correlation squared. It measures the fit between the independent variable and the dependent variable. Example: if a car dealership determines that car sales are a function of income with a coefficient of correlation of 0.8, then R² = 0.64, which means 64% of the variation in new car sales can be explained by changes in income.
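A short sketch of fitting a simple regression and computing R² with NumPy; the income and car-sales figures are hypothetical and chosen only to illustrate the calculation.

```python
import numpy as np

# Hypothetical data: regional income (x, in $000s) and car sales (y, in units).
x = np.array([52, 61, 70, 75, 83, 90], dtype=float)
y = np.array([118, 130, 152, 160, 175, 190], dtype=float)

b, a = np.polyfit(x, y, 1)      # slope (b) and y-intercept (a) of y = a + bx
r = np.corrcoef(x, y)[0, 1]     # coefficient of correlation
r_squared = r ** 2              # coefficient of determination

print(f"y = {a:.2f} + {b:.2f}x, r = {r:.3f}, R^2 = {r_squared:.3f}")
```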

Sensitivity Analysis

Sensitivity analysis uses trial and error to determine the effects of changes in variables or assumptions on final results. It is useful in deciding whether expending additional resources to obtain better forecasts is justified. The trial-and-error method inherent in sensitivity analysis is greatly facilitated by the use of computer software. A major use of sensitivity analysis is in capital budgeting: small changes in interest rates or payoffs can make a significant difference in the profitability of a project. Sensitivity analysis is limited due to:
1. The use of assumptions instead of facts
2. The consideration of variables individually as opposed to all together
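A minimal capital-budgeting sketch of sensitivity analysis in Python, assuming a hypothetical project (a $100,000 outlay followed by $30,000 annual inflows for 5 years) and varying only the discount rate.

```python
# Hypothetical project: $100,000 outlay today, $30,000 inflow per year for 5 years.
cash_flows = [-100_000] + [30_000] * 5

def npv(rate, flows):
    # Discount each cash flow back to time zero and sum.
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(flows))

# Vary the discount rate assumption and observe how NPV responds.
for rate in (0.06, 0.08, 0.10, 0.12, 0.14):
    print(f"rate {rate:.0%}: NPV = {npv(rate, cash_flows):,.0f}")
```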

Simulation Monte Carlo Simulation

Simulation: This method is a sophisticated refinement of probability theory and sensitivity analysis. The computer is used to generate many examples of results based on various assumptions. Project simulation is frequently expensive; unless a project is exceptionally large and expensive, full-scale simulation is usually not worthwhile.
Monte Carlo Simulation: This method often is used in simulation to generate the individual values for a random variable. The performance of a quantitative model under uncertainty may be investigated by randomly selecting values for each of the variables in the model (based on the probability distribution of each variable) and then calculating the value of the solution. Performing this process many times produces the distribution of results from the model.
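Extending the same hypothetical project from the sensitivity example, a minimal Monte Carlo sketch in Python: each trial draws the uncertain inputs from assumed probability distributions and recomputes NPV, and the collected results approximate the distribution of outcomes. The distributions and parameters are illustrative assumptions.

```python
import random

def npv(rate, flows):
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(flows))

results = []
for _ in range(10_000):
    # Draw each uncertain input from its assumed probability distribution.
    annual_inflow = random.gauss(30_000, 5_000)   # mean $30,000, std dev $5,000
    rate = random.uniform(0.08, 0.12)             # discount rate between 8% and 12%
    flows = [-100_000] + [annual_inflow] * 5
    results.append(npv(rate, flows))

results.sort()
print("mean NPV:", round(sum(results) / len(results)))
print("5th-95th percentile:", round(results[500]), "to", round(results[9_500]))
```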

Smart Contracts

Smart contracts are computerized transaction protocols that execute the terms of a contract. A smart contract is a collection of code and data that is deployed using cryptographically signed transactions on a blockchain network. The general objectives of smart contract design are to satisfy common contractual conditions (such as payment terms, liens, etc.), minimize exceptions (both malicious and accidental), and minimize the need for trusted intermediaries (e.g., lawyers).

What are the various methods used to test systems? What are the 4 levels of tests?

Static Testing: Examines the program's code and documentation through reviews, walkthroughs, and inspections but does not require the program to be executed.
Dynamic Testing: Involves executing programmed code with a given set of test cases.
White-Box Testing: Tests the internal components of a program as opposed to the functionality exposed to the end user.
Black-Box Testing: Focuses on how well the software works. It examines the functionality without any knowledge of the code (as if it were a black box).
Gray-Box Testing: Goes further than black-box testing. A gray-box evaluation involves having knowledge of the program's internal components in order to design tests while performing those tests at the user or black-box level.
Four Levels of Tests:
1. Unit Testing: Verifies (A) the functionality of a specific section of code and (B) the means of handling data passed between multiple units.
2. Integration Testing: Works to expose errors in the interfaces between components.
3. System Testing: Also called end-to-end testing; tests a completely integrated system to verify that the system meets its requirements.
4. Acceptance Testing: Conducted to determine whether the system meets the organization's needs and is ready for release.
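As a small illustration of unit testing, here is a sketch using Python's built-in unittest module; the apply_sales_tax function and its 7% rate are invented for the example.

```python
import unittest

def apply_sales_tax(amount, rate=0.07):
    # Function under test: add sales tax and round to cents.
    return round(amount * (1 + rate), 2)

class ApplySalesTaxTest(unittest.TestCase):
    def test_standard_rate(self):
        self.assertEqual(apply_sales_tax(100.00), 107.00)

    def test_zero_amount(self):
        self.assertEqual(apply_sales_tax(0.00), 0.00)

if __name__ == "__main__":
    unittest.main()
```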

5 Steps of the System Development lifecycle (SDLC)

The SDLC approach is the traditional methodology applied to the development of large systems. A major advantage of the life-cycle approach is the enhanced management and control of the development process. The SDLC consists of 5 steps:
1. Systems strategy requires understanding the organization's needs.
2. Project initiation is the process by which systems proposals are assessed.
3. In-house development is generally chosen for unique information needs.
4. Commercial packages are generally chosen for common needs rather than developing a new system from scratch.
5. Maintenance and support involves ensuring the system accommodates changing user needs.
Portions of the phases can overlap. Note that the feedback gathered during the maintenance of a system provides information for developing the next generation of systems, hence the name life cycle.

Aspects of Regression Analysis

The linear relationship established for x and y is valid only across the relevant range, the range from the smallest to the largest measures in the data set. The user must identify the relevant range and ensure that projections lie within it. Regression analysis assumes that past relationships are a basis for valid projections. Regression does not determine causality. Although x and y move together, the apparent relationship may be caused by some other factor. For example, car wash sales volume and sunny weather are strongly correlated, but car wash sales do not cause sunny weather. The goodness-of-fit test assists with determining whether the sample is representative of the population (validates assumptions). The confidence level is the percentage of times that a sample is expected to be representative of the population; i.e., a confidence level of 95% should result in representative samples 95% of the time.
Advantages of regression analysis include the following: It uses data efficiently. Good results can be obtained with relatively small sets of data. The theory associated with linear regression is well understood and allows for construction of different types of easily interpretable statistical intervals for predictions, calibrations, and optimizations.
The disadvantages or limitations of regression analysis include the following: Outputs of regression can lie outside the relevant range. The shapes that linear models can assume over long ranges are limited. Sometimes the extrapolation properties will be poor. The regression line may be ultrasensitive to outliers.

Standard Error

The standard error measures how well the linear equation represents the data. It is based on the vertical distances between the data points in a scatter diagram and the regression line. The closer the data points are to the regression line, the lower the standard error.
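A short sketch of computing the standard error of the estimate with NumPy, using invented data points; it divides by n − 2 because two parameters (the intercept and the slope) are estimated in simple regression.

```python
import numpy as np

# Hypothetical observations (x) and outcomes (y).
x = np.array([1, 2, 3, 4, 5, 6], dtype=float)
y = np.array([2.1, 4.3, 5.9, 8.2, 9.8, 12.1], dtype=float)

b, a = np.polyfit(x, y, 1)     # fitted line: y_hat = a + b*x
y_hat = a + b * x
residuals = y - y_hat          # vertical distances from the regression line

# Standard error of the estimate, with n - 2 degrees of freedom.
se = np.sqrt(np.sum(residuals ** 2) / (len(x) - 2))
print(f"standard error of the estimate: {se:.3f}")
```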

Types of AI

There are several types of AI:
Neural networks are a collection of processing elements working together to process information much like the human brain, including learning from previous situations and generalizing concepts.
Case-based reasoning systems use a process similar to that used by humans to learn from previous, similar experiences.
Rule-based expert systems function on the basis of set rules to arrive at an answer. These rules cannot be changed by the system itself; they must be changed by an outside source (i.e., the computer programmer).
Intelligent agents are programs that apply a built-in or learned knowledge base to execute a specific, repetitive, and predictable task, for example, showing a computer user how to perform a task or searching websites for particular financial information.
An expert system is an interactive system that attempts to imitate the reasoning of a human expert in a given field. It is useful for addressing unstructured problems when there is a local shortage of human experts.
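A toy sketch of a rule-based system in Python; the credit-approval rules and thresholds are invented for illustration. The rules sit in a data structure outside the evaluation logic, so changing behavior means editing the rules, not the program flow.

```python
# Hypothetical credit-approval rules; a real expert system would hold many more.
rules = [
    (lambda a: a["credit_score"] < 580,    "deny"),
    (lambda a: a["debt_to_income"] > 0.45, "deny"),
    (lambda a: a["credit_score"] >= 720,   "approve"),
    (lambda a: True,                       "manual review"),  # default rule
]

def evaluate(applicant):
    # Fire the first rule whose condition matches the applicant's facts.
    for condition, outcome in rules:
        if condition(applicant):
            return outcome

print(evaluate({"credit_score": 650, "debt_to_income": 0.30}))  # -> "manual review"
```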

Business Process Reengineering

This involves a complete rethinking of how business functions are performed in order to provide value to customers. Think of radical innovation instead of mere improvements.

Robotic Process Automation (RPA)

This is a form of machine learning technology that enables a computer to acquire knowledge and mimic the actions of the people using it to perform a task. It can be used to take over repetitive tasks.

Time Series Analysis

Time series analysis (also called trend analysis) is the process of projecting future trends based on past experience. It is a regression model in which the independent variable is time. A seasonal pattern often exists when a time series is influenced by seasonal factors. A cyclical pattern exists when data show rises and falls that are not of a fixed or seasonal pattern; the duration of these cyclical fluctuations is usually a couple of years. Cyclical behavior is often confused with seasonal behavior. However, if the fluctuations are not of a fixed period, they are cyclical; if the period is unchanging and associated with some aspect of the calendar, the pattern is seasonal.
Limitations of Time Series Analysis:
1. Components or elements that play a role in projections may become less reliable over time.
2. The conclusions resulting from the analysis could be misleading. For example, higher sales of snow jackets could be the result of an increase in population or could be increasing at a decreasing rate.
3. The predictability of applicable components or elements in the past is assumed to remain unchanged for the future.
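A minimal trend-projection sketch with NumPy: time is the independent variable, and the fitted line is extended four quarters ahead. The quarterly sales figures are invented for the example.

```python
import numpy as np

# Hypothetical quarterly sales (units) for the past 8 quarters.
sales = np.array([120, 126, 133, 138, 145, 151, 158, 165], dtype=float)
t = np.arange(1, len(sales) + 1)          # time is the independent variable

slope, intercept = np.polyfit(t, sales, 1)  # fit the trend line: sales = intercept + slope*t

# Project the trend for the next 4 quarters.
future_t = np.arange(len(sales) + 1, len(sales) + 5)
forecast = intercept + slope * future_t
print(np.round(forecast, 1))
```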

Big Data 4 V's

Volume. Describes the large amount of data captured over time.
Variety. Refers to the different forms data can take (structured, semi-structured, and unstructured).
Velocity. Refers to the speed at which big data must be analyzed.
Veracity. Refers to the trustworthiness of the data.
"The 5th V" that more and more businesses are adding for big data is Value.

