Corp App Final

Enterprise Resource Planning (ERP)

-A suite of applications (called modules), a database, and a set of inherent processes that consolidate business operations onto a single, consistent computing platform
-Enterprise - the whole company: integrated and unified on a common platform
-Resource - broader than just cash and inventory
-Planning - lays the foundation not only for evaluating past operations but for guiding future decisions

ORDER-TO-CASH process and its steps

--Processing an order: 1. Enter a sales order 2. Book the sales order 3. Pick release the sales order 4. Ship confirm the sales order 5. Close the sales order
--Processing cost of goods sold: 1. Defer cost of goods 2. Recognize cost of goods 3. Create journal entries
--Processing an invoice: 1. Import invoice 2. Print 3. Revenue recognition 4. Create journal entries
--Processing cash receipts: 1. Enter a cash receipt 2. Apply the cash receipt 3. Create journal entries

On-Line Analytical Processing (OLAP)

-A set of graphical tools that provides users with multidimensional views of their data and allows analysis of the data using simple windowing techniques
-Data is viewed in cubes that have axes, dimensions, measures, slices, and levels
1) Cube refers to both the underlying semantic structure used to interpret the data and a particular materialization of data in that structure
2) OLAP operations (see the sketch below)
-Cube slicing - produce a 2-D view of the data
-Drill-down - go from a summary view to a more detailed view
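A minimal sketch of the two OLAP operations using pandas; the DataFrame, column names, and values are hypothetical, not from the source.

    import pandas as pd

    sales = pd.DataFrame({
        "year":    [2023, 2023, 2024, 2024, 2024, 2024],
        "region":  ["East", "West", "East", "East", "West", "West"],
        "product": ["A", "A", "A", "B", "A", "B"],
        "amount":  [100, 150, 120, 80, 170, 90],
    })

    # Cube slicing: fix one dimension (year) to get a 2-D view of the rest.
    view = sales[sales["year"] == 2024].pivot_table(
        index="region", columns="product", values="amount", aggfunc="sum")
    print(view)

    # Drill-down: from a summary (by region) to more detail (region and product).
    print(sales.groupby("region")["amount"].sum())
    print(sales.groupby(["region", "product"])["amount"].sum())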

The Focus and Goal of NoSQL

-high scalability -high availability -eventual consistency -Goal is to accommodate huge transaction rates and fast selects.

Three tier architecture

-A special type of client/server architecture consisting of three well-defined and separate processes, each running on a different platform:
1. The user interface (GUI), which runs on the user's computer (the client).
2. The functional modules that actually process data. This middle tier runs on a server and is often called the application server.
3. A database management system (DBMS) that stores the data required by the middle tier. This tier runs on a second server called the database server.
-Client tier handles the GUI - the Windows client, web browsers, Office integrations/add-ins, and other applications.
-Application tier contains the business logic of the ERP - it executes the code.
-Database tier manages the SQL Server database.

Data Warehouse

-A subject-oriented, integrated, time-variant, non-updatable collection of data used in support of management decision-making processes -Not necessarily a physical location -Data warehousing is very expensive

Controls focused on during an Application Audit

-Access Controls -Authorization Controls -Input controls -Processing controls -Output controls -Other controls

List of Enterprise Data Assets

-Application databases/data stores -Traditional data warehouses and data marts -Newer big data environments -Mission-critical spreadsheets and personal databases -Interim data stores and extracted files that facilitate data moving across the organization

Application Level Audit Trail

-The application works with the database to record information about a transaction as the application interacts with the database.
-A sys admin can enable it at the table level; it creates detailed records of changes, additions, and deletions at the database level. As records are changed, a database trigger replicates the changes to a mirror table (giving a trail of who, what, and when).
-The audit trail is stored in a different table than the production data (i.e., the mirror table of all the changes).

Traditional Enterprise Data Environment

-Applications are designed around functions/departments (e.g., sales, purchasing, inventory, and finance) and NOT business processes (e.g., purchase-to-pay, order-to-cash)
-Functions are separate and evolve independently in their own systems
-Functions may be manual, use a single application, or use many applications
-Data sharing happens via interfaces where data is typically summarized (e.g., totals or balances only)
-Cross-function processes involve delays, additional costs, lack of detail, data redundancy, and the need for reconciliation
-Business controls are heavily dependent on manual processes

Internal controls questionnaire (ICQ)

-Assists enterprises with defining the scope of the assurance engagement and can be used during the exploration period of an audit. -A Starting point -Typically involves identifying if the control is in place or not (evaluation comes later)

External Auditing

-The audit trail is built by mining information external to the database, e.g., mining information in redo logs and network traffic.
-Redo logs are used by the database to allow rolling back of transactions and recovery of the database in the event of a hardware or software failure.
-The redo logs are a record of all changes in the database and are therefore an ideal audit trail, as every insert, update, and delete to the database is captured.

Qualities of ERP Processing

-Authorizations can occur at the application level, not just the database level
-Significant rise in users who have access - increased access for field personnel, vendors, and customers
-Requires a well-thought-out definition of security access capabilities
-System (application), network, and database access security are all required
-Transactions are stored in common database(s)
-Modules are configured to automatically create entries in the database for each other
-Databases can be accessed by any module
-System modules (applications) are transparent to users
-Highly dependent on system-based controls
-Traditional batch controls and audit trails are not available
-Need data validation before acceptance of data

Data Classification Challenges

-Awareness and importance of classification are lacking -Classification schemes/terms are overly complicated -Classification schemes are subjective, and often outdated and unrealistic -Global workforce adds complexity -Roles and responsibilities are unclear -Data classification has different meanings within the organization (ex. security might classify by sensitivity while a manager might think it means classifying by data/format type)

Transaction Authentication and Integrity (Application Control Objective 6)

-Before passing transaction data between internal applications and business or operational functions (inside or outside the enterprise), check it for proper addressing, authenticity of origin, and integrity of content.
-Maintain authenticity and integrity during transmission or transport.
Controls:
-Exchange of transaction data follows communication standards for authentication and security
-Transaction output is tagged for identification and verification
-Transaction input is analyzed to determine authenticity of origin and content integrity

Implementation Strategies

-Big Bang - all at once -Franchising Strategy - location install -Slam Dunk - install for key processes -Parallel - old and new at same time -Phased - process/location stepped turnover -SaaS - cloud-based, pay as you need

NoSQL Databases and Their Qualities

-Broad class of database management systems designed to handle very large datasets, very quickly -Do not use SQL as their primary query language -Do not have fixed schema -Do not use join operations -May not have ACID (atomicity, consistency, isolation, durability) -Scale horizontally -Referred to as "Structured storages" -Some NoSQL systems are entirely non-relational, others simply avoid selected relational functionality such as fixed table schemas and join operations. (Example: a NoSQL database might organize data into objects, key/value pairs or tuples (rows) instead of tables)

ERP System Characteristics

-Business process focused -Modular, integrated software program/platform that serves the entire organization -Relational DBMS allows information to be shared across modules -Codified business management practice

Fact Tables

-Centralized table that contains the measures (facts) of interest in the data mart/warehouse -Consists of measures and foreign keys -Measures may be raw facts or aggregations -Typically numeric and additive -Contents vary with the granularity of the data mart/warehouse

Object Oriented Programming Terminology

-Class - blueprint or prototype from which objects (called instances) are created
-Object - a specific instance of a class; objects have data, behavior (methods), and identity (properties)
-Property - an attribute to which you assign a value
-Method/Function - an action an object can perform (i.e., instructions for some type of action)
-Parameters/Arguments - variables passed into/out of a method
-Variable - container for a single data value
-Array - container for a collection of data (usually of the same type)
-Declare (verb) - create an instance of something by giving it a name and defining its type
-Assign (verb) - give a value to something
(See the sketch below for the terms in context.)
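A short sketch tying the terms together; the Invoice class and its members are hypothetical examples, not from the source.

    class Invoice:                          # Class: blueprint for creating instances
        def __init__(self, number, amount):
            self.number = number            # Property: attribute with an assigned value
            self.amount = amount

        def apply_discount(self, rate):     # Method: an action; 'rate' is a parameter
            self.amount = self.amount * (1 - rate)
            return self.amount              # Returns a value to the caller

    inv = Invoice("INV-001", 500.0)         # Declare/instantiate an object (an instance)
    total = inv.apply_discount(0.10)        # 0.10 is the argument; result assigned to a variable
    invoices = [inv, Invoice("INV-002", 250.0)]  # Array-like container (a Python list)
    print(total, len(invoices))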

Input Controls for Coding (input validation)

-Conditional Logic (Decision) - If...Then
-RegEx (i.e., regular expressions)
-Error Handling
-User Prompt (response to bad input)
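A minimal validation sketch showing all four techniques; the SSN field and pattern are hypothetical examples.

    import re

    SSN_PATTERN = re.compile(r"^\d{3}-\d{2}-\d{4}$")    # RegEx for ###-##-####

    def read_ssn():
        while True:
            value = input("Enter SSN (###-##-####): ")
            if SSN_PATTERN.match(value):                # Conditional logic + RegEx
                return value
            print("Invalid format, please try again.")  # User prompt on bad input

    def read_amount():
        try:                                            # Error handling
            return float(input("Enter amount: "))
        except ValueError:
            print("Not a number.")
            return None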

Application Controls

-Controls that apply to the processing of specific computer applications and are part of the computer programs used in the accounting system. -Seek to provide reasonable assurance of achievement of management's objectives relative to a given application. -Found in the specific functional requirements for the solution, business rules for information processing and supporting manual procedures. Ex) Input, Output, and Processing Controls

The Goals to reach for your Enterprise Data

-Creating a single, all-encompassing information asset environment that embraces all types of data, including big data, and draws from critical applications and data sources both internal and external to the organization
-Transitioning to a process-driven organization (instead of application-driven): data-driven decision-making and fewer barriers between functional departments

Deploying Humans & Technology for data classification

-Data classification requires the integration of manual processes involving employees as well as tools for automation and enforcement.
-Human intervention provides context
-Technology tools enable efficiency/policy enforcement
-Automated data classification tools find and identify sensitive data:
--Generally look for data that can be matched deterministically (e.g., credit card numbers, SSNs)
--Could use fuzzy logic, syntactic analysis, and other techniques for less-structured information
--Security tools parse data to identify sensitive data that matches patterns or policies
-Labels are attached to information once it is identified

Source Data Collection and Entry (Application Control Objective 2)

-Data input is performed in a timely manner by authorized and qualified staff
-Correction/resubmission of data is performed without compromising original transactions
-Original source documents are retained as required
Controls:
-Pre-numbered source documents (i.e., identify missing or out-of-sequence values)
-Identified personnel (i.e., input, edit, authorize, accept, reject, override)
-Error messages at origin, logged
-Timely review, follow-up, and correction of errors
-Source document retention standards/safety

Dimensional Models

-Data model design for data warehouses
-A variant of the relational model
-Made up of tables with attributes
-Relationships defined by keys and foreign keys
-Organized for understandability and ease of reporting rather than update
-Queried and maintained by SQL or special-purpose management tools
-Consists of "Fact" and "Dimension" tables (see the toy star schema below)
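A toy star schema in SQLite to make the structure concrete; the table and column names are hypothetical.

    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
    CREATE TABLE fact_sales (
        product_key INTEGER REFERENCES dim_product(product_key),  -- foreign key
        date_key    TEXT,
        amount      REAL                                          -- measure: numeric, additive
    );
    INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware'), (2, 'Gadget', 'Software');
    INSERT INTO fact_sales VALUES (1, '2024-01', 100.0), (2, '2024-01', 75.0), (1, '2024-02', 40.0);
    """)

    # Group the fact table's measures by a dimension attribute.
    query = """SELECT d.category, SUM(f.amount)
               FROM fact_sales f JOIN dim_product d ON f.product_key = d.product_key
               GROUP BY d.category"""
    for row in con.execute(query):
        print(row)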

ERP Application Audit (Scope)

-Design of controls -Evaluation of control effectiveness -Compliance with regulatory requirements -Identification of issues requiring management attention

Map Reduce

-Distributed processing framework, built over a distributed file structure, that allows massive parallel computing (not a database -> not NoSQL)
-Focus is on performing data operations on parallel hardware (horizontal scalability)
-Spread processing across many servers, with each server performing an independent process
-Used to process big data
-Manipulation steps: Input, Splitting, Mapping, Shuffling/Grouping, Reducing (see the word-count sketch below)
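A word-count sketch of the phases in plain Python; in a real cluster each phase runs in parallel across many servers, and the input lines here are hypothetical.

    from collections import defaultdict

    lines = ["the quick fox", "the lazy dog", "the fox"]   # input, already split

    # Map: emit (key, value) pairs from each input record.
    mapped = [(word, 1) for line in lines for word in line.split()]

    # Shuffle/Group: bring all values for the same key together.
    groups = defaultdict(list)
    for key, value in mapped:
        groups[key].append(value)

    # Reduce: collapse each key's values to a single result.
    counts = {key: sum(values) for key, values in groups.items()}
    print(counts)   # {'the': 3, 'quick': 1, 'fox': 2, 'lazy': 1, 'dog': 1}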

Document Stores

-Document databases pair each key with a complex data structure known as a document.
-Store and retrieve documents, similar to an electronic filing cabinet.
-Documents are often organized using metadata and tags.
-Document stores bridge the gap between key-value and relational databases.
-Useful for content management and user activity streams
-Most common with JSON (JavaScript Object Notation) data, the most widely adopted format
-Semi-structured; considered by some the new XML (see the miniature example below)
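The document-store idea in miniature using the standard json module; the keys and documents are hypothetical.

    import json

    store = {}   # key -> JSON document; no shared schema across documents
    store["user:1"] = json.dumps({"name": "Ana", "tags": ["admin"], "logins": 42})
    store["user:2"] = json.dumps({"name": "Ben", "email": "ben@example.com"})  # different fields

    doc = json.loads(store["user:1"])
    print(doc["name"], doc["tags"])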

ERP Implementation

-ERP implementation is very complex -Application Modules are similar to "building blocks" §Processing is done in many locations §Modules can be linked together to form end-to-end processing §Modules are configurable to suit business needs (e.g., transaction codes, reports, audit trails, monitoring features, etc.) -ERP may have hundreds of elements to configure across multiple modules Example: Oracle ERP Application Suite has 2300 Applications, 2500 tables, 3800 reports, and 11000 data dictionary items

Source Data Accuracy, Completeness and Authenticity Checks (Application Control Objective 3)

-Ensure that transactions are accurate, complete, and valid
-Input data as close to the point of origination as possible
Controls:
-Transaction data verified at origin, interactively
-Accuracy controls: sequence, limit, range, validity, reasonableness, table look-ups, existence, key verification, check digit, completeness, duplicate, time/timing
-Identified personnel, SOD: input, modify, authorize data and transactions
-Identify and report failed transactions
-Monitoring and correction of failed transactions

Characteristics of Enterprise Data

-Enterprise data is messy: inconsistent data and a fragmented data landscape
-What leads to data inconsistencies?
-Multiple data stores
-Inconsistent data content/data definitions
-Conflicting business rules
-Diverse, non-integrated technologies
-Timing issues
-Ad hoc data sharing (no single data-sharing interface)
-Shadow IT (e.g., unsanctioned devices/applications)

ETL Process

-Extract (capture) -Scrub/Data cleaning -Transform -Load and Index

Data Classification Standards

-FIPS 199 - Standards for Security Categorization of Federal Information and Information Systems
-ISO 27001 - Section A.7.2.1 of the standard mandates the creation and maintenance of data classification guidelines
-NIST SP 800-53 - Security and Privacy Controls for Federal Information Systems and Organizations
-NIST SP 800-60 Volume I - Guide for Mapping Types of Information and Information Systems to Security Categories
-NIST SP 800-60 Volume II - Appendices to the Guide for Mapping Types of Information and Information Systems to Security Categories

Change Management in an ERP

-Focus on controls over updates and changes to the ERP in production.
-Can be a major weak point in controls when not handled effectively.
-Auditors should ask:
§Is there a formal change-request process, with documented, authorized policies and related control forms and approvals?
§Are all change requests and related activity logged for tracking purposes?
§Does management authorize all changes with informed consent?
§Does security administration follow up on changes to permissions immediately?
-Best practices (things to look for):
§Documentation of detailed testing of all changes, including a test plan, results, and related approvals to proceed.
§The change management policy should include a definition of levels of change.
§Minor changes might reasonably not require the same level of testing and documentation as major changes, but no change is "too small" to follow the fully documented test process.
§Complete backups are performed before implementation.
§A back-out plan is developed as a normal aspect of major change.
§User involvement and notification about major changes prior to implementation.

Graph Store

-Graph stores are used to store information about networks of data, such as social connections.
-Differs from the relational DB model in that complex relationships and direct linkages are allowed
-Used less often than other models
-Use of this model is driven by the type of data it can handle rather than the volume of data
-Each node knows its adjacent nodes; the cost of one hop remains the same regardless of how many nodes are connected
-Optimized for "connections"
-Directly stores relationships vs. connecting via PK/FK (see the adjacency-list sketch below)
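An adjacency-list sketch of the idea, with hypothetical social data: each node holds direct references to its neighbors, so one hop is a constant-cost lookup rather than a PK/FK join.

    graph = {
        "Ana":  {"Ben", "Cruz"},
        "Ben":  {"Ana"},
        "Cruz": {"Ana", "Ben", "Dee"},
        "Dee":  {"Cruz"},
    }

    def friends_of_friends(person):
        # Two hops: neighbors of neighbors, minus direct friends and the person.
        direct = graph[person]
        return {fof for friend in direct for fof in graph[friend]} - direct - {person}

    print(graph["Ana"])               # one hop: {'Ben', 'Cruz'}
    print(friends_of_friends("Ana"))  # two hops: {'Dee'}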

Session Hijacking (Script Injection)

-In a session hijacking attack, the hacker steals the user's session token and uses it to access the user's account.
-There are several ways an attacker can stage a session hijacking attack, such as infecting the user's device with malware that monitors and steals session data.
-Another method is a cross-site scripting attack, in which the attacker injects a programming script into a webpage the user frequently visits and forces the user's browser to send the session cookie data to the attacker's server.
-In related script-injection attacks, the hacker overwrites a sub/function return pointer (which tells the computer where to return once the function/sub is completed); the attacker can set that value to point to an address of his/her choosing.

Application level Logical Security

-Independent of, but in accordance with, organizational logical security
-Procedural controls to test for:
1) Documented user ID administration process
2) Use of access profiles
3) Documented application security configuration
-Limit on the number of allowable log-on attempts
-Password expiration and reuse rules
-Minimum password length
-Password complexity
4) Security monitoring procedures
5) Processes for reporting security violations

Data Mining and Machine Learning

-Machine learning (ML) is a type of artificial intelligence (AI) that allows software applications to become more accurate at predicting outcomes without being explicitly programmed to do so. Machine learning algorithms use historical data (usually a lot) as input to predict new output values.
-Data mining is a process used by companies to turn raw data into useful information. By using software to look for patterns in large batches of data, businesses can learn more about their customers to develop more effective marketing strategies, increase sales, and decrease costs.

Processing Integrity and Validity (Application Control Objective 4)

-Maintain the integrity and validity of data throughout the processing cycle
-Detection of erroneous transactions should not disrupt the processing of valid transactions
Controls:
-Authorization to initiate transactions
-Processing is complete and accurate using automated controls
-Sequence, duplication, record counts, referential integrity, control totals, range checks
-Transactions have a unique and sequential identifier
-Time, date, user identification
-Process changes reviewed immediately

ERP Modules

-Module is the term for a separate unit of software - a "building block".
-Processing is done in many locations
-Modules can be linked together to form end-to-end processing
-ERP modules work on the same ideas as application logic
-The goal with ERP modules is to make the functionality of a module portable, so that it can be used in a variety of systems, and interoperable, meaning it can interact with modules of other systems.
-ERP ICQs often note setup/configuration review
§Modules are configurable to suit business needs (e.g., transaction codes, reports, audit trails, monitoring features, etc.)
§ERP may have hundreds of elements to configure across multiple modules

ACID vs BASE principles

-There is no one-to-one mapping between the two consistency models.
-BASE properties are much looser than ACID guarantees.
-BASE values availability but does not guarantee consistency of replicated data at write time.
-Data will become consistent in the future, either at read time or eventually; it is only guaranteed consistent for past snapshots that have already been processed.

Audit Concerns with not using ACID principles and using BASE instead

-Not being ACID compliant leads to sample criticisms:
-Data is less secure against errors and loss
-Data is not immediately persisted (i.e., persisted data is data that stays around after the application has terminated)
-Weaker controls
-Weaker audit trail
-Less "precision" of data
-(On the other side, relational systems can be slow and can get messy)

Understanding Code (Basics)

-Objects, Variables, Methods/Functions, etc. must be declared before they can be used -Some functions/subs are created by the class, others can be created by the programmer -Variables are empty (NULL) until a value is assigned to them -Each block of code can usually (depending on scope) only use variables declared within the block -You can pass data values between blocks of code using parameters -A function can return a value, but the value must be stored somewhere on return.
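The points above in a few lines of Python; the function and variable names are hypothetical.

    def add_tax(amount, rate):        # parameters pass data into the block
        total = amount * (1 + rate)   # 'total' is local to this block (scope)
        return total                  # the returned value must be captured by the caller

    price = None                      # declared, but empty (None) until a value is assigned
    price = 100.0                     # assignment gives it a value
    result = add_tax(price, 0.07)     # returned value stored on return
    print(result)                     # 107.0
    # print(total)                    # would fail: 'total' exists only inside add_tax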

Output Review, Reconciliation and Error Handling (Application Control Objective 5)

-Procedures and associated responsibilities ensure that the necessary control information is provided and used to enable verification, detection, and correction of the accuracy of output.
Controls:
-Physical inventory kept of sensitive output
-Control total matching
-Validation of completeness before subsequent transactions
-Review of final output by business owners
-Audit trail of recipients of sensitive output and its control/possession

Reconciliation Process

-Receive bank statement
-Load and verify bank statement
-Perform reconciliation (i.e., compare to the books)
-Review results
-Create journal entries and post to general ledger
Example:
-Enter the detailed bank information manually or electronically.
-Reconcile this information with your system transactions manually or automatically.
-While reconciling, you can create miscellaneous transactions for bank-originated entries and manually enter payments and receipts.
-You can manually or automatically clear payments, receipts, open interface, and miscellaneous transactions before reconciliation to maintain current cash account balances.
-After reconciliation, review the Bank Statement Detail, Bank Statement Summary, and Bank Statement by Number reports.
-You can post the accounting entries to your general ledger. You can reconcile your general ledger cash account balance to your bank account balance in Cash Management by printing the GL Reconciliation Report.

Two tier architecture

-Refers to client/server architectures in which the user interface runs on the client and the database is stored on the server. -Most of the application-especially the user interface-runs on the desktop (the client). -The primary function of servers is to access databases. -The function of the client machines is to host the application user interface and conduct a preliminary validation of the data, before it is submitted to the database server for processing and storage. -The actual application logic can run on either the client or the server.

Traditional Operational Databases Characteristics

-Relational DBMS -Organized Structure -ACID -SQL standardized

Types of Databases

-Relational databases - predefined tables (rows (tuples) and columns (attributes))
-Network - no parent-child constraints (a record can have multiple parents)
-Hierarchical - parent-child constraints (each record has a single parent)
-Object oriented

Authorization

-Rules that specify who can access what, as well as the privileges that come with that access (Least Privilege principle) (Read/Write/Execute)
-Access Control Lists (tables): list users, groups of users, machines, and apps; the types of access allowed; and how they are maintained (new, change, remove)

Role Based Security

-Security concept that permissions are not granted directly to users but to security roles.
•Role - group of duties required for a job function
•Duty - group of privileges to perform a task
•Privilege - permissions for individual objects
•Permission - basic access restriction to data and features (e.g., tables, fields)
(See the miniature sketch below.)
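A miniature of the role -> duty -> privilege -> permission chain; the names and mappings are hypothetical.

    PRIVILEGES = {                       # privilege -> permissions on individual objects
        "read_invoice": {("invoice_table", "read")},
        "post_invoice": {("invoice_table", "write")},
    }
    DUTIES = {                           # duty -> privileges needed to perform a task
        "invoice_entry": {"read_invoice", "post_invoice"},
    }
    ROLES = {                            # role -> duties required for a job function
        "ap_clerk": {"invoice_entry"},
    }
    user_roles = {"jsmith": {"ap_clerk"}}   # users hold roles, never raw permissions

    def permissions_for(user):
        return {perm
                for role in user_roles.get(user, set())
                for duty in ROLES[role]
                for priv in DUTIES[duty]
                for perm in PRIVILEGES[priv]}

    print(permissions_for("jsmith"))   # {('invoice_table', 'read'), ('invoice_table', 'write')}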

Key-Value Stores

-Simplest NoSQL databases; have no schema.
-Every single item in the database is stored as an attribute name (or "key") together with its value.
-Large, distributed hashmap data structures that store and retrieve values organized by key identifiers.
-Data is not related, or only minimally so.
-You hash the key and find the value (see below).
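The mechanism in a few lines: a Python dict is exactly this kind of hashmap (here single-node and in-memory; the keys and values are hypothetical).

    kv = {}
    kv["session:9f2a"] = {"user": "jsmith", "expires": "2024-01-31"}   # put: hash key, store value
    print(kv["session:9f2a"])                                          # get: hash key, find value
    print(kv.get("session:missing"))                                   # None - no relations to traverse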

Applications

-Software that runs on the operating system, providing various types of functionality
-Differs from operating system software in that applications focus on a specific functionality (ex. payroll, email, order entry, word processing)

Source Data Preparation and Authorization (Application Control Objective 1)

-Source documents are prepared by authorized and qualified personnel
-Data entry follows established procedures, SOD
-Good input form design
-Errors and irregularities are detected, reported, and corrected
Controls:
-Completeness controls
-Predetermined input codes
-Default values
-Unique and sequential identifiers
-Log of submitted and returned documents
-Identification of document preparer

Column Stores

-Store columns of data together, instead of organizing in terms of rows.
-Row-oriented view and column-oriented view (flipped 90 degrees)
-Each row gives a complete view of one entity
-The column view indexes a particular aspect across the whole dataset
-Keys and columns with some values - no common schema ("schema lite")
-Columns can vary, and columns may not exist for all rows (so do not think of tables)
-Useful for massive amounts of data spread across distributed systems, e.g., Google, Facebook
-Think of it like storing each field according to an index, except the location is found based upon the data instead of the other way around (see the layout sketch below)
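The same (hypothetical) data in row-oriented and column-oriented layouts:

    rows = [                                  # row view: one complete entity per row
        {"id": 1, "name": "Ana", "city": "Lima"},
        {"id": 2, "name": "Ben"},             # "schema lite": 'city' missing here
    ]

    columns = {                               # column view: one aspect across the dataset
        "id":   [1, 2],
        "name": ["Ana", "Ben"],
        "city": ["Lima", None],
    }

    # Scanning one aspect of every entity touches a single column, not every row.
    print(columns["city"])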

Application Architecture and the three layers

-Supporting infrastructure that enables the execution and use of software applications 1) Presentation layer - provides the user interfaces and the look and feel of the application, receives inputs, provides outputs (GUI) 2) Business layer - applies business logic to user inputs and outputs. 3) Data layer - manages storage of application related and user data, typically in a database

Sources of information to help understand an Application

-Systems development documentation -Functional design specifications -Program change documentation -User manuals -Technical documentation

Dimension Tables

-Table associated with a fact table, provides additional information about the facts -Contain text or numeric attributes -Used to analyze and group measures in fact tables -Uses surrogate (generated) keys, not operational keys

Database Event Auditing

-Tracks activity at the event level (e.g., a user logging onto the application, key SQL statements, data access)
-Since auditing occurs at the database level, no application user session information is captured.
-Database event auditing is most useful for capturing accesses to the database and changes to structure and privileges, rather than auditing changes in individual tables.
-Database event auditing enables you to capture changes to the database that do not come through the application (i.e., users who log into the database directly).
-Examples of database events include error instances, login, log off, startup, shutdown. A log of these events captures the who and when, but not what was changed. (At the database level, i.e., it has nothing to do with the application.)

The Change in Data Analytics

-Traditional data analysis focused on past events and data. -Today, the creation of more data and new tools has switched that focus onto predicting future events (model and predict)

Types of Application Architectures

-Two tier architecture (Client and Server Model) -Three tier architecture (Client, middle and Server Model)

Considerations about Data when performing Data Discovery & Classification

-Type - Confidentiality of the data -Data integrity - low-quality data cannot be trusted -Data availability - requires resilient storage/networking environment

Database Trigger Auditing

-Unlike the application level audit trail, database trigger auditing uses database triggers to create the audit trail.
-Database level auditing can audit some transactions that the application cannot, since it captures changes made to the database through logins other than the application.
-Like database event auditing, database trigger auditing captures changes to the database that do not come through the application, but it differs in the activities to which it relates.
-Database triggers are code someone has written that is set to run just before or just after an INSERT, UPDATE, or DELETE SQL event occurs on a particular database table (see the sketch below).
-The user session gets stored in a table. Auditing the triggers means you are looking at these stored data about the session.
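A minimal trigger-based audit trail in SQLite; the tables and trigger are hypothetical, and a production DBMS would also capture the session user.

    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
    CREATE TABLE vendors (id INTEGER PRIMARY KEY, bank_account TEXT);
    CREATE TABLE vendors_audit (vendor_id INTEGER, old_account TEXT,
                                new_account TEXT, changed_at TEXT);
    -- Runs just after each UPDATE and copies old/new values to the audit table.
    CREATE TRIGGER trg_vendors_update AFTER UPDATE ON vendors
    BEGIN
        INSERT INTO vendors_audit
        VALUES (OLD.id, OLD.bank_account, NEW.bank_account, datetime('now'));
    END;
    INSERT INTO vendors VALUES (1, '111-222');
    UPDATE vendors SET bank_account = '999-888' WHERE id = 1;
    """)
    print(con.execute("SELECT * FROM vendors_audit").fetchall())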

The Use and Roles of a data warehouse

-Use a data warehouse to reduce the load on the day-to-day operational processing of the database (the data center)
-Store extracts from operational data and make those extracts available to users in a useful format
-Can improve the integrity of data, as it is cleansed and then reused in operations for decision making

Extensible Markup Language (XML)

-Used as a data interchange format
-Defines content and content structure
-Allows the creation of custom tags to describe data
-Used for documents containing structured information (see the example below)
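A small example of custom tags parsed with the Python standard library; the document structure is hypothetical.

    import xml.etree.ElementTree as ET

    doc = """
    <order id="1001">
        <customer>Acme Corp</customer>
        <line sku="W-1" qty="3" price="19.99"/>
    </order>
    """
    order = ET.fromstring(doc)
    print(order.get("id"))                 # attribute: 1001
    print(order.find("customer").text)     # element content: Acme Corp
    for line in order.findall("line"):
        print(line.get("sku"), line.get("qty"))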

ERP Audits

-Walkthroughs of business process help clarify interactions with sub-processes and supporting modules -Assess the risks and then the controls to mitigate those risks -ERP "review" is different from the "classic" approach (e.g., standalone systems) 1. Integrated process vs specific program controls/boundaries 2. More difficult to turn this type of objective into a clear-cut audit program. -Auditors need to understand ERP Process Flow: i.e. interactions and flow of information

Application Risks

-Weak security -Unauthorized access to data -Unauthorized remote access -Inaccurate information -Erroneous or Falsified Data Input -Misuse by Authorized End Users -Incomplete processing -Duplicate transactions -Untimely processing -Communications system failure -Improper output delivery -Output disclosure -Inadequate training -Inadequate support

Key Questions when Classifying Data

-Where does your sensitive and confidential data exist in the firm's infrastructure? -How can you categorize what is sensitive? -Who are the stakeholders in your organization to help identify sensitive data and develop the data security strategy?

Input Risks

-Writing exploitable code
-Using unsafe string function calls
-Assuming input data will be "friendly" and what is expected (and not controlling for the alternative)
-The application programming language can impact the potential for attack (SQL injection, session hijacking, buffer overflow, cross-site scripting)
-Attacks often exploit software/hardware vulnerabilities

Access Controls

-Authentication (identify users) -Authorization (grant privileges)

Types of User Support

1) Adequate end user support -Manuals -Online help -Subject matter experts 2) Training for end users -Understand application requirements to determine training needs -Initial/ongoing training -New and changed functionality 3) Help desk -Support staff should be capable of addressing the needs of users -Logs, knowledge base, escalation, respond, sign-off, document

COBIT Guidance Areas for Applications (4)

1) Application Control and Auditability -Implement business controls, where appropriate, into automated application controls such that processing is accurate, complete, timely, authorized and auditable. 2) Application Security and Availability -Address application security and availability requirements in response to identified risks and in line with the organization's data classification, information architecture, information security architecture and risk tolerance. 3) Configuration and Implementation of Acquired Application Software -Configure and implement acquired application software to meet business objectives. 4) Application Software Maintenance -Develop a strategy and plan for the maintenance of software applications.

ACID principle (Structured)

1) Atomic: All or nothing.
2) Consistency: Data is left in a consistent state, even after the transaction.
3) Isolation: Transactions are isolated from each other.
4) Durability: Once the transaction is committed, its state will be durable.
(Best suited to operational data; see the atomicity sketch below.)
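Atomicity made concrete with SQLite; the accounts and the in-code business rule are hypothetical.

    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance REAL)")
    con.executemany("INSERT INTO accounts VALUES (?, ?)",
                    [("checking", 500.0), ("savings", 100.0)])
    con.commit()

    try:
        con.execute("UPDATE accounts SET balance = balance - 600 WHERE name = 'checking'")
        (bal,) = con.execute("SELECT balance FROM accounts WHERE name = 'checking'").fetchone()
        if bal < 0:                   # business rule checked mid-transaction
            raise ValueError("insufficient funds")
        con.execute("UPDATE accounts SET balance = balance + 600 WHERE name = 'savings'")
        con.commit()                  # all or nothing: both updates commit together...
    except ValueError:
        con.rollback()                # ...or the partial update is undone

    print(con.execute("SELECT * FROM accounts").fetchall())   # balances unchanged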

BASE principles

1) Basically available: Nodes in a distributed environment can go down, but the whole system shouldn't be affected. 2) Soft State: The state of the system and its data changes over time. 3) Eventual Consistency: Given enough time, data will be consistent across the distributed system.

Areas of Application Security Controls Testing

1) Black box testing - also known as behavioral testing; a software testing method in which the internal structure/design/implementation of the item being tested is not known to the tester.
-Communication behavior - format and sequence, impersonation
-Fault-injection points - manual or batch
-Client behavior
-Interaction with other applications
-File manipulation/modification
-Cryptanalysis of sensitive data
-Application "fuzzing"
2) Related systems testing
-Application source code review
-Operating system
-Database management system

ERP Risks (2)

1) Business Risk - the risk of a business event:
•occurring at the wrong time or sequence,
•occurring without proper authorization,
•involving the wrong internal agent,
•involving the wrong external agent,
•involving the wrong resource,
•involving the wrong amount of resource, and/or
•occurring at the wrong location.
2) Information Risk
§Recording risks - include recording incomplete, inaccurate, or invalid data about a business event. Incomplete data results from not recording all of the relevant characteristics about a business event in the data stores. Inaccurate data arises from recording data that does not accurately represent the event. Invalid data refers to data recorded about a fabricated event.
§Maintaining risks - similar to recording risks, but instead of referring to the risks of recording event data, maintaining risks refer to the risks of initially recording, then keeping current, the data stored about organizational resources, agents, and locations.
§Reporting risks - involve data outputs. They include distributing data that are inaccurate, improperly classified, improperly summarized, provided to unauthorized parties, and/or not provided in a timely manner.

Centralized vs. Decentralized Enterprise data

1) Centralized systems -Data maintained in central locations -Data shared with and accessed by various departments/functions up and down process -More functionality (planning) -Pre-integrated modules 2) Decentralized enterprise data -Each task handled by department with own application -One database per application -Data maintained within department -No access outside department

General Objectives for Application Processing (5)

1) Completeness—The application processes all transactions, and the resulting information is complete. 2) Accuracy—All transactions are processed accurately and as intended, and the resulting information is accurate. 3) Validity—Only valid transactions are processed, and the resulting information is valid. 4) Authorization—Only appropriately authorized transactions have been processed. 5) Separation (aka Segregation) of duties—The application provides for and supports appropriate segregation of duties and responsibilities as defined by management.

Two Components of XML

1) Document Type Definition (DTD) - describes the content and structure of the XML document 2) Document data (the XML report)

Solutions to Enterprise Data

1) Enterprise application integration - interfaces among the applications themselves (turns decentralized enterprise data into centralized enterprise data)
2) Enterprise information integration
3) Data warehouses - leave applications alone and instead bring data into a separate data warehouse database (data marts)
4) ERP
(Note: ERP is only one piece of the landscape - there is no single "fix")

Authentication Considerations

1) Factors / Multi-factor: something you know, something you have, something you are (biometrics)
2) Login IDs: simple naming, default account IDs, ID sessions, timing, single sign-on
3) Passwords/Passphrases (policy): length, complexity, expiration, reuse

Examples of Input Controls

1) General Input Controls
§Authorization
§Batch control totals ($, items, docs, hash)
§Screen/form design
§Pre-printed input data
§Transaction logs
§Transmittal logs
§Cancellation of source documents/data
§User training
2) Input Validation (Application) Controls
-Do not trust user inputs
-Sanitize/constrain user inputs by: rejecting known bad data/characters; accepting only valid data; cleaning bad data
-Encrypt or mask user inputs
-Boundary checks and input manipulation controls (SQL injection, buffer overflows)

The process of Understanding an Application (9)

1) Identify purpose of the application and operating environment 2) Identify business owners and relevant execs 3) Determine source of application, support 4) Determine system infrastructure 5) Identify regulatory requirements for application 6) Identify interfaces to financial reporting systems 7) Identify interfaces to other business processing systems 8) Identify critical performance factors, key metrics 9) Perform system walk-through

Input, Process, Output

1) Input - Information supplied to the computer program -Examples: Lines of Code, User input (via keyboard, voice, mouse, touch, etc.), magnetic strips, RFID devices, Databases and files, etc. -Most programs are event driven - code is written to respond to events (file upload, submit button, form open, mouseover, etc.) 2) Process - programming logic (i.e., instructions or "algorithms" that occur in response to input) 3) Output - information provided by the computer program Examples: Screen, File, Paper, Database, Inputs to other systems

Recommendations/Suggestions for Data Classification

1) Keep labels simple. 2) Recognize two types of data classification projects: new data and legacy data. 3) Identify roles and responsibilities for data classification. 4) Start small. (a department/ segment at a time) 5) Document your business requirements for protecting data and current initiatives

Three Phases of Map Reduce

1) Map - get data from a large number of nodes
2) Shuffle (Group) - convert a list into sorted, grouped list data, aggregated on a smaller number of nodes
3) Reduce - convert the new list into a small number of atomic values via a reduce operator

Order-to-Cash Process (6)

1) Market goods or services and/or provide customer support
•Record, maintain and report data that facilitates and documents marketing efforts.
•Record and report data to facilitate and document providing quotes or bids for customers.
•Record, maintain and report data to facilitate and document customer support efforts.
2) Negotiate sales order with customer and check credit
•Record, maintain, and report data that facilitates and documents sales order negotiation.
3) Fulfill order activities (including select and inspect goods or services to be delivered; prepare goods or services for delivery; deliver goods or services; close sales order; prepare for sales invoice)
-Record, maintain and report data that facilitates and documents fulfillment (picking in the warehouse, packing, and shipping - the point of sale).
-Record, maintain and report data about invoicing (billing) the customer for the sale (and financial measurement of that sale).
4) Prepare and send invoices to customers; update COGS
5) Receive payment for goods or services and prepare bank reconciliations
•Record, maintain and report data about receipt of payment (and financial measurement of that receipt).
6) Accept customer returns of goods
•Record, maintain and report data to facilitate processing of return requests, return authorizations, receiving returns and issuing credits or refunds (and financial measurement of returns/refunds).

What type of data should be protected

1) Mission Critical Data (on average, 70% of the value of a public company is attributed to its critical data)
2) Toxic Data
-PII - Personally Identifiable Information
-PHI - Protected Health Information
-PCI - Payment Card Information
-IP - Intellectual Property
3) High Risk Data - any information that can, if lost, lead to:
-Significant contractual/legal liabilities
-Serious damage to organization image/reputation
-Legal/financial/business losses (i.e., PII, employee info, insurance info)

Procure-to-Pay Process (5)

1) Monitor resource levels and, if applicable, internally request goods or services (based on valid identification of need).
·Record, maintain and report data about inventory levels and needs for products or services.
·Record, maintain and report data to facilitate and document internal requests for goods or services.
2) Negotiate the order of goods or services with a vendor (send an RFQ to the vendor, receive a quote back, turn it into a PO).
·Record, maintain and report data to facilitate and document placing purchase orders with vendors for goods or services.
3) Receive and inspect goods or services (then store and/or maintain goods).
-Record, maintain and report data to facilitate and document the receipt of goods or services ordered (the point of purchase) and the storage/maintenance of goods.
-Record, maintain and report data about the liability that arises due to the procurement of goods or services (and financial measurement of that liability).
4) Pay for goods or services.
·Record, maintain and report data to facilitate and document payment for goods and services purchased (and financial measurement of that payment).
5) Return goods (or request refund for services received).
·Record, maintain and report data to facilitate and record activities associated with returning goods and receiving refunds or liability reductions (and financial measurement of the return and refund or amount due adjustment).

Two components of Graph Stores

1) Nodes - each node represents an entity (a person, place, thing, category or other piece of data) 2) Relationships - each relationship represents how two nodes are associated

Data Classification plays a role in...

1) Risk Management/Security
-Example: The category of data determines encryption, anonymization, and internal vs. cloud storage
2) Compliance and legal discovery
-Meeting legal requirements for retrieving data within a set time frame
3) Reducing duplicate information
-Cuts storage/backup costs
-Speeds up data searches

Data Storage approaches for Relational (1) and NoSQL databases (4)

1) SQL (ACID principles)
-Relational - systems that store and query tuples (rows) in relations (tables)
2) NoSQL (BASE principles)
-Column Store - systems that store data as columns rather than as rows
-Document Store - systems that store documents, providing index and simple query mechanisms
-Key-Value - systems that store values and an index to find them, based on a key
-Graph - systems that model data as graphs, where nodes can represent content modeled as document or key-value structures and arcs represent a relation between the data modeled by the nodes
(Note - only graph databases fully capitalize on data relationships.)

3 Components of Algorithms

1) Sequence (also known as Process) - steps in the algorithm executed in a specified order (usually top down)
2) Decision (also known as Selection) - block of code that tests a condition and provides an outcome (If...Else, Switch/Case)
3) Repetition (also known as Iteration or Looping) - block of code that is repeated either a certain number of times or until a condition is met (Do...While/Until, For...Next)
(See the example below.)
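All three blocks in one tiny routine; the grading task and threshold are hypothetical.

    scores = [88, 52, 95, 70]

    total = 0
    for score in scores:          # Repetition: loop until the list is exhausted
        if score >= 60:           # Decision: test a condition, choose an outcome
            status = "pass"
        else:
            status = "fail"
        print(score, status)
        total = total + score     # Sequence: steps run top-down on each pass

    print("average:", total / len(scores))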

Application Control Objectives (6)

1) Source Data Preparation and Authorization 2) Source Data Collection and Entry 3) Source Data Accuracy, Completeness and Authenticity Checks 4) Processing Integrity and Validity 5) Output Review, Reconciliation and Error Handling 6) Transaction Authentication and Integrity

Big Data Types (3)

1) Structured - the traditional database model assumes data will go into tables, so data is structured to fit that model
2) Unstructured - can be text or non-text: Snapchats, tweets, mail messages, PowerPoint presentations, Word documents, text messages, JPEG images, MP3 audio files, video files
3) Semi-structured - has information associated with it, such as metadata and tags, e.g., JSON (JavaScript Object Notation), data represented as graphs

2 Types of High-Risk Data Environments

1) Structured environments = Data stored in Ordered databases, most likely linked through a business application (i.e. Oracle DB) 2) Unstructured environments = Data saved outside of an arranged database, such as data stored on individual employee devices or thumb drives (i.e. excel files, word docs, text files)

Sources of Big Data (4)

1) Structured sources 2) Unstructured sources 3) Internal/external 4) Data not previously captured, stored, or analyzed. This data includes information garnered from social media, data from internet-enabled devices (including smartphones and tablets), machine data, video and voice recordings, and the continued preservation and logging of structured and unstructured data.

Key areas to focus on when Managing Enterprise Data

1) Technology -Data content (structure, semi and unstructured) -Data storage -Software tools -Programming languages 2) Processes - policies and procedures to manage technologies -Testing and quality assurance policies -Policies related to data retention -Techniques to govern data exchanges -Issues resolution policies 3) People - data users and technology/process managers -Data ownership and stewardship -Roles and responsibilities -Recruiting/Promotion of data/tech managers -Manage skills and training of people

Data Analytics Process

1) Understanding the data you have and the data you need to develop or acquire 2) Streamline, organize and integrate the right data 3) Building trustworthy data through good governance, security, and data cleansing processes 4) Build analytical models to evaluate the data. 5) Analyze the results of the model to find where value can be added or new insight is revealed

Big Data Characteristics (the 5 V's and 1 C)

1) Volume - more data: the amount of data being created is vast compared to traditional data sources; more of our own data (archive, junk, log files), plus free or public data and premium service data
2) Velocity - speed at which new data arrives; data is being generated extremely fast, and more streaming-data projects now allow the potential of near real-time analysis
3) Variety - differences in types/sources of data (structured, unstructured, and semi-structured)
4) Veracity - differences in data accuracy/quality; big data is sourced from many different places, so you need to test the veracity of the data: its truth, quality, and usefulness
5) Variability - differences in flows of data
6) Complexity - dealing with simultaneous sources, types, speeds, timing

ERP Processing Implications for Code

1) Layered Environment
-Code is stored at the OS level, the database level, and the middle tier.
-A change to any of the code (also referred to as objects) should be subject to a company's change management plan, regardless of how small or large the change.
2) Code changes also include changes to setups within the application that can be made through the application (user interface/forms). Many setups have the effect of changing code, because they cause the application to process/react differently based on the different configurations.
3) A sample list of 'system' changes that would need an audit trail:
•Changes to the database structure
•Addition, deletion, or change to database triggers
•Changes to programs, libraries, or scripts at the OS level
•Changes to objects or packages at the database level
•Changes to the setups or profile options at the application level

How the Database landscape is changing

1) Traditional DB storage
-Physical decisions - on premise or hosted
-Logical options - (relational) DBMS with OLTP and OLAP (using a data warehouse)
2) Updated DB storage landscape
-Physical decisions - on premise, cloud, or hybrid
-Logical options - (relational) DBMS, NoSQL, Hadoop, file system
-"Blob" stands for "Binary Large Object": a data type used for storing binary data in databases

Data Warehousing Process

1. Data is extracted on a periodic basis from source systems, such as ERP systems, and moved to a dedicated server that contains the data warehouse.
2. There it is cleaned, formatted, validated, reorganized, summarized, and supplemented with data from many other sources, then stored in the data warehouse (ETL).
3. The data warehouse is then used as the source of information for report generation, analysis, and outputs (ad-hoc queries, canned reports, and dashboards).
(We are not changing the data in the data warehouse; we are only adding more and more data.)

Factors for a Data Warehouse to be a Success (5)

1. High-level, executive support (buy-in) 2. Commitment of resources 3. Identified business value 4. Enterprise vision 5. Change management processes

The Process of Protecting Data

1. Know and Define Data (data discovery) 2. Dissect/Analyze Data (data classification) 3. Defend Data

5 ways to create an ERP Audit Trail

1. Standard Application Auditing 2. Application Level Audit Trail 3. Database Event Auditing 4. Database Trigger Auditing 5. External Auditing
-Not an all-or-nothing approach: use a layered approach instead

Buffer (Cache) Overflow Process

1. User enters name. Computer stores name.
2. Using the entered name, the computer looks up the encrypted password in a table and stores it for comparison.
3. User enters a LONG password. Computer stores the unencrypted password.
•The long password extends past the password boundary, so it overflows into the next stored word, which happens to be the encrypted password from the table.
•The user cleverly crafts the overflowed characters to match the encryption of the entered password.
4. Computer encrypts the entered password and stores the encrypted password.
5. Computer compares the encrypted password from the table (now overwritten) with the encryption of the password the user entered. If they match (which they are guaranteed to do in the attack), allow entry.
-Two errors are needed for the attack to succeed:
1. The encrypted password is fetched from the table before the user enters a password.
2. The encrypted password from the table is stored adjacent to the entered password.

Normal login process

1. User enters name. Computer stores name.
2. Using the entered name, the computer looks up the encrypted password in a table and stores it for comparison.
3. User enters password. Computer stores the unencrypted password.
4. Computer encrypts the entered password and stores the encrypted password.
5. Computer compares the encrypted password from the table with the encryption of the password the user entered. If they match, allow entry.

3-Tier Example

1. When a process is sent from a client, the job is directed to the process server with the lowest current processing load.
2. This frees the client sending the task and helps spread the overall processing load across these specialized servers.
3. This also helps achieve optimal performance for a database server such as SQL Server.

2-Tier Example

1.When you post the transaction, the client's logic performs a few validations, and the data is submitted to the database server where the true processing occurs. 2.The database server hosts most of the business logic to process the posting operation and store the data in the various tables affected by the posting operation. 3.The database server then submits the information required to generate your posting reports, per the request of the client application. Your reports are then printed and you are done.

Computer program

A set of detailed directions telling the computer exactly what to do, one step at a time.
-A type of algorithm = a procedure or formula for solving a problem, based on conducting a series of specified actions.

Cross-Site Scripting (XSS)

An attack that injects scripts into a Web application server to direct attacks at clients.

SQL injection attack

An attack that targets SQL servers by injecting SQL commands that the database then executes.

Traditional Operational Databases Weaknesses

-Impedance mismatch -Scalability issues -Clusters -Cost

Costs and Benefits of an ERP System

Benefits
-Information sharing
-Interacting processes
-Improved processes
-Improved access to data
-Standardization and comprehensiveness
-Flexibility (example: real-time access and batch processing of data)
-Modularity
-Efficiency (example: year-end closing in 2 days vs. 2-3 weeks)
-Best business practice
Costs
-Change in way of doing business
-Cost: $500K - $300M, 1-3 years
-Consulting
-Training
-Data conversion/analysis
-Loss of employees

Scrubbing Process of ETL contains

Cleansing process that uses pattern recognition and AI techniques to upgrade data quality
-Fixing errors: misspellings, erroneous dates, incorrect field usage, mismatched addresses, missing data, duplicate data, inconsistencies
-Other activities: decoding, reformatting, time stamping, conversion, key generation, merging, error detection/logging, locating missing data

Traditional Operational Databases Strengths

-Consistency -Persistence -Concurrency -Standardization -Application integration

General Controls

Controls present in the environment surrounding the information systems -Policies -Procedures -Infrastructure/support -Identity/access management -Physical controls -Environment controls -Disaster Recovery/Business Continuity -Change management -User support/training

Transformation process of ETL

Convert data from the format of the operational system to the format of the data warehouse
-Record-level activities:
Selection - data partitioning
Joining - data combining
Aggregation - data summarization
-Field-level activities:
Single-field - from one field to one field
Multi-field - from many fields to one, or from one field to many
Data partitioning - splitting one field out into multiple fields
Denormalization - the opposite of data partitioning: combining fields together
(See the pandas sketch below.)
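A sketch of these transformations in pandas; the source fields and values are hypothetical.

    import pandas as pd

    src = pd.DataFrame({
        "full_name": ["Ana Diaz", "Ben Cho", "Ana Diaz"],
        "region":    ["East", "West", "East"],
        "amount":    [100.0, 250.0, 75.0],
    })

    selected = src[src["amount"] > 80]                               # selection
    summary = src.groupby("region", as_index=False)["amount"].sum()  # aggregation

    # Single-field partitioning: one field split into many.
    src[["first_name", "last_name"]] = src["full_name"].str.split(" ", n=1, expand=True)

    # Denormalization: many fields combined into one.
    src["region_amount"] = src["region"] + ":" + src["amount"].astype(str)

    print(summary)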

Order-to-Cash Flow of Documents

Customer Request (RFQ)
Send Sales Quote
Sales Order
Distribution - Picking Slip, Bill of Lading, Packing Slip
Invoicing - Sales Invoice
Customer Payment - Remittance, Customer's Statement
Returns - Return Merchandise Authorization (RMA)

Input Data Validation Controls

Edit checks - validation logic residing in either database or application code
EX) -Completeness check -Validity check -Field check -Limit check -Valid sign -Reasonableness check -Default value -Closed-loop verification -Duplicate check -Sequence check

Data Classification Schemes

ISACA: "Establish a classification scheme that applies throughout the enterprise, based on the criticality and sensitivity (e.g., public, confidential, top secret) of enterprise data. This scheme should include details about data ownership; definition of appropriate security levels and protection controls; and a brief description of data retention and destruction requirements, criticality and sensitivity. It should be used as the basis for applying controls such as access controls, archiving or encryption."
-Classification schemes are expensive
-Classification scheme needs vary by organization/industry
-Best practices suggest using a simple, straightforward classification scheme
-Drivers for classifying data are both value-based and risk-based

By Value

Objects are passed to a method "by value", meaning that the current value of the actual parameter is copied to the formal parameter in the method header (i.e., the method doesn't change the caller's object).

Types of Extract (capture) Processes (2)

Obtaining a snapshot of a chosen subset of the source data for loading into the data warehouse
1) Static extract - capturing a snapshot of the source data at a point in time
2) Incremental extract - capturing changes that have occurred since the last static extract

Application License Management

Organizational processes to ensure management, control, and protection of application assets utilized within an organization EX) -Application "inventory management" - know what you have (helps avoid penalty payments) -Application license compliance - pay for what you have (rent vs. buy) -Application costs - pay for what you need -Application version management- maintenance, security, consistency -Ensure appropriate user training - learn about what you use

Enterprise Data Architecture

Organized, methodical collection of technologies and policies to manage data across organization, regardless of where it resides

Parameters vs Arguments

Parameters are the variables named in a function's definition, specifying what values can be passed in; Arguments are the actual values that are passed during the call of a function
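A short Java illustration; the applyRate function is hypothetical:

    public class ParamsVsArgs {
        // 'amount' and 'rate' are parameters: named placeholders in the definition.
        static double applyRate(double amount, double rate) { return amount * rate; }

        public static void main(String[] args) {
            // 200.0 and 0.5 are arguments: the actual values passed in the call.
            System.out.println(applyRate(200.0, 0.5)); // 100.0
        }
    }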

Load Process of ETL

Placing transformed data into the warehouse and creating indexes -Refresh mode: bulk rewriting of target data at periodic intervals -Update mode: only changes in source data are written to the data warehouse
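A toy Java sketch using an in-memory map as a stand-in warehouse (hypothetical keys and values) to contrast the two modes:

    import java.util.*;

    public class LoadModes {
        public static void main(String[] args) {
            Map<Integer, String> warehouse = new HashMap<>(Map.of(1, "old-A", 2, "old-B"));

            // Update mode: write only the changed source rows into the target.
            warehouse.putAll(Map.of(2, "new-B", 3, "new-C"));
            System.out.println(warehouse); // e.g. {1=old-A, 2=new-B, 3=new-C}

            // Refresh mode: bulk-rewrite the target from a full source snapshot.
            warehouse.clear();
            warehouse.putAll(Map.of(1, "A", 2, "B", 3, "C"));
            System.out.println(warehouse); // e.g. {1=A, 2=B, 3=C}
        }
    }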

Pros and Cons of Database Event Auditing

Pros 1.Part of standard database technology 2.Since this type of auditing is built-in functionality, it is easy to configure and there is minimal performance impact. Cons 1.Those with sufficient understanding of, and access to, the audit tables can alter the audit trail 2.Application user session information is not captured 3.The audit trail can be easily manipulated by the DBA.

Pros and Cons of Database Trigger Auditing

Pros 1.Can provide meaningful auditing of low-volume, high-risk database session activity on tables such as core application setups and application security tables 2.Part of standard application/database technology Cons 1.Many companies and DBAs have not enabled application auditing and are unfamiliar with its deployment. As such, strict best-practice project methodology needs to be followed. 2.Auditing high-volume transactional tables is impractical, if not impossible, due to the impact on performance caused by the database triggers. 3.Those with sufficient understanding of, and access to, the audit tables can alter the audit trail

Pros and Cons of Application Level Audit Trail

Pros 1.Can provide meaningful auditing of low-volume, high-risk tables such as core application setups and application security tables (like those contained in the Application Object Library) 2.Part of standard application/database technology 3.Can provide a much more detailed audit trail than Standard Application Auditing because all updates, inserts, and deletions can be recorded Cons 1.Many companies and DBAs have not enabled Application Level Auditing and are unfamiliar with its deployment 2.Auditing high-volume transactional tables is impractical, if not impossible, due to the impact on performance caused by the database triggers 3.Those with sufficient understanding of, and access to, the audit tables can alter the audit trail

Pros and Cons of External Auditing

Pros 1.Provides greatest ability to track activity for those that have database access 2.Depending on the architecture of the storage of the data, this audit trail typically cannot be altered by high-risk employees 3.Use of third party tools to extract such data can virtually eliminate risk of corruption of audit trail. Cons 1.Depending on the database version and abilities of staff, extracting the correct data via software like LogMiner can be challenging 2.Purchase of third party tool or use of consultants to develop may be required 3.Additional hardware may be required to support the auditing

Pros and Cons of Standard Application Auditing

Pros 1.Standard part of the application configuration 2.No performance impact Cons 1.Does not provide detail on what gets changed in most cases - just who and when a record is updated 2.Does not provide the level of detail required by an auditor 3.Does not track enough detail to be able to reconstruct activity in the case of fraud

Standard Application Auditing

Provides last update WHO and WHEN information. -Detailed tracking from the point of initial record creation to the last update is lost. -Database stores who/when the record was created and who/when the record was last updated (unless exceptions apply, such as HR, where date/time stamping is used). -Database doesn't store WHAT was changed on the record. -Application can store when a user logs in, what responsibilities or roles they use, and what forms they access (however, this auditing is not typically enabled by default).

Procure-to-Pay Document flow

Purchase Requisition (internal) → Request for Quote → Purchase Order → Product Receipt (Vendor Packing Slip) → Invoice Receipt (Approved Vendor Invoice) → Vendor Payment (Vendor Statement)

Application Audits can be a part of

Requires understanding of data flows; input, process, and output controls; and application functionality. 1) Range from simple to complex applications: Excel spreadsheets to ERP systems 2) May be in conjunction with a business process audit. Example: payroll applications in a payroll process audit 3) Various phases of the application's life cycle: -System development -Post-implementation -Recurring, regularly scheduled basis

Authentication Risks and Controls

Risks: -Credentials sniffed or stolen. -Credentials replayed, even if they are encrypted. -Users select poor passwords. -Third-party compromised or untrustworthy. Controls: -Use of hashed credentials, not credentials themselves. -Userid/password logging, behavior analysis -Encrypted transmission of sensitive information. -Time-stamped transmissions to prevent replay attacks. -Third-party assessment for trustworthiness and security. -Multi-factor authentication
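As one hedged illustration of "use of hashed credentials, not credentials themselves": a salted PBKDF2 hash using the standard Java crypto API (the iteration count and salt size here are illustrative choices, not mandated values):

    import javax.crypto.SecretKeyFactory;
    import javax.crypto.spec.PBEKeySpec;
    import java.security.SecureRandom;
    import java.util.Base64;

    public class HashedCredentials {
        // Store a salted, slow hash of the password instead of the password.
        static String hash(char[] password, byte[] salt) throws Exception {
            PBEKeySpec spec = new PBEKeySpec(password, salt, 210_000, 256);
            byte[] derived = SecretKeyFactory.getInstance("PBKDF2WithHmacSHA256")
                                             .generateSecret(spec).getEncoded();
            return Base64.getEncoder().encodeToString(derived);
        }

        public static void main(String[] args) throws Exception {
            byte[] salt = new byte[16];
            new SecureRandom().nextBytes(salt); // per-user random salt
            System.out.println(hash("s3cret".toCharArray(), salt));
        }
    }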

Authorization Risks and Controls

Risks: -Excessive access or rights to application -Excessive rights within applications -Gain of excessive access by increasing authorization level -Access or rights slow to update upon job change/termination -Proxies who are not controlled Controls: -Access control lists, ensure different user levels created -Job roles/job descriptions match to ACL -Privileges identified for each job role/description -Enforcement of privileges provided to each job role/description -Application hardening to remove possibility of bypassing authorization mechanisms to elevate user levels -Job change/termination policies and procedures
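A minimal Java sketch of an access-control-list check; the role and privilege names are hypothetical:

    import java.util.*;

    public class AccessControl {
        // Hypothetical mapping of job roles to application privileges.
        static final Map<String, Set<String>> ROLE_PRIVS = Map.of(
            "AP_CLERK",   Set.of("ENTER_INVOICE"),
            "AP_MANAGER", Set.of("ENTER_INVOICE", "APPROVE_PAYMENT"));

        // Enforce the privileges granted to each job role.
        static boolean authorized(String role, String privilege) {
            return ROLE_PRIVS.getOrDefault(role, Set.of()).contains(privilege);
        }

        public static void main(String[] args) {
            System.out.println(authorized("AP_CLERK", "APPROVE_PAYMENT"));   // false
            System.out.println(authorized("AP_MANAGER", "APPROVE_PAYMENT")); // true
        }
    }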

User management

Set up user accounts, assign security roles, assign permissions at the organizational level (restrict access at the organizational level), and assign relations (is the user an employee, external party, or contractor?)

CAP Theorem

States that it is impossible for a distributed computer system to simultaneously provide all three of the following guarantees (it can only pick 2) 1) Consistency -A distributed system is typically considered consistent if, after an update operation by some writer, all readers see that update in the shared data source 2) Availability -A system is designed and implemented in a way that allows it to continue operation 3) Partition tolerance -Partitions occur if two or more "islands" of network nodes arise which (temporarily or permanently) cannot connect to each other; partition tolerance also covers dynamic addition and removal of nodes

Integrating with Subledgers Process

Subledgers → GL_INTERFACE → Journal Import → Journals → Post → GL_Balances -Transferring information from Oracle subledgers -2-step process: Data is pushed into the GL_INTERFACE table from the subledger using a transfer program; Journal Import then pulls the information from the interface table to create valid, postable journal entries in General Ledger. -The Journal Import step is optional when initiating transfer programs from subledgers (e.g., Payables, Receivables); if it is not run then, Journal Import must be run separately in General Ledger, using the Import Journals window, in order to create postable journal entries. -Importing journal entries from non-Oracle applications: Use Journal Import to automatically import budget, actual, and encumbrance data from your non-Oracle applications. You can create a SQL*Loader script to load the data into the interface. You can also check funds and reserve them for your imported transactions. -Applications Desktop Integrator (ADI): You can enter data in a spreadsheet and upload it to General Ledger to prepare and analyze budgets and to prepare functional and foreign journal entries.

Granularity

The level of detail represented by the data -For example: -Yearly, monthly, daily, hourly -Country, region, state, county, zip code -Company-wide, division, department, group, individual
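A small Java sketch rolling the same hypothetical facts up from a finer grain (daily rows) to a coarser one (monthly totals):

    import java.util.*;
    import java.util.stream.*;

    public class Granularity {
        record Sale(String month, String day, double amount) {}

        public static void main(String[] args) {
            // Daily grain: one row per day (the most detailed level here).
            List<Sale> daily = List.of(new Sale("2024-01", "2024-01-05", 10),
                                       new Sale("2024-01", "2024-01-20", 15),
                                       new Sale("2024-02", "2024-02-03", 20));
            // Monthly grain: the same facts summarized one level up.
            Map<String, Double> monthly = daily.stream()
                .collect(Collectors.groupingBy(Sale::month,
                         TreeMap::new, Collectors.summingDouble(Sale::amount)));
            System.out.println(monthly); // {2024-01=25.0, 2024-02=20.0}
        }
    }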

Star Schema

The most commonly used and the simplest style of dimensional modeling
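Star schemas are normally declared in SQL DDL; to stay in one language, this hypothetical Java-record sketch only models the shape: a central fact table carrying numeric measures plus foreign keys to the surrounding dimension tables:

    public class StarSchema {
        // Dimension tables: descriptive attributes keyed by surrogate keys.
        record DimDate(int dateKey, String calendarDate) {}
        record DimProduct(int productKey, String name, String category) {}
        record DimStore(int storeKey, String city) {}

        // Fact table: one foreign key per dimension plus the measures.
        record FactSales(int dateKey, int productKey, int storeKey,
                         int unitsSold, double revenue) {}

        public static void main(String[] args) {
            System.out.println(new FactSales(20240105, 42, 7, 3, 29.97));
        }
    }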

Organizational Logical Security (IAM) consists of ...

The organization-wide standards for Identity and Access Management: -Identification and authentication of users/systems/devices -Authorization and access rights -Procedures for adding, changing, removing users/systems/devices -Procedures for adding, changing, removing rights (1.Management and process 2.Linking people/apps to identities, identities to rights/privileges 3.Security credentials, passwords, tokens, keys, biometrics 4.Logs, monitoring, reporting)

Data discovery

The process of identifying data and where it resides

Data Classification

The process of tagging and organizing data into categories to determine how it's handled

Session Management Risks and Controls

The series of communication interactions back and forth between an individual/client and the application Risks: -Sessions can be manipulated by end users to elevate privileges or impersonate others. -Sessions can be sniffed/stolen, leading to impersonation. -Session IDs may be predictable, allowing attackers to impersonate other users. -Session IDs may be sniffed. Controls: -Encrypt the contents of the session data to prevent manipulation. -Implement checks on session IDs to ensure they have not been tampered with. -Avoid storing authentication credentials on the client; server-side storage of data is more secure. -Session IDs should be random, preventing prediction. -Use SSL or other encryption methods to prevent communication sniffing. -Forced logoffs/time-outs to terminate processing
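A minimal Java sketch of the "session IDs should be random" control, generating 128 bits from a cryptographically secure RNG so IDs are infeasible to predict:

    import java.security.SecureRandom;
    import java.util.Base64;

    public class SessionIds {
        private static final SecureRandom RNG = new SecureRandom();

        // 128 random bits, URL-safe encoded, as an unpredictable session ID.
        static String newSessionId() {
            byte[] bytes = new byte[16];
            RNG.nextBytes(bytes);
            return Base64.getUrlEncoder().withoutPadding().encodeToString(bytes);
        }

        public static void main(String[] args) {
            System.out.println(newSessionId());
        }
    }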

By Reference

When the address of an object is passed to a method, allowing the original object to be changed.
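A minimal Java illustration. Strictly speaking, Java copies the reference itself by value, but a mutation made through that reference changes the caller's original object, which is the behavior described here:

    import java.util.ArrayList;
    import java.util.List;

    public class ByReferenceDemo {
        // The method receives a copy of the reference; mutating the object
        // it points at changes the caller's original.
        static void mutate(List<String> items) { items.add("changed"); }

        public static void main(String[] args) {
            List<String> original = new ArrayList<>();
            mutate(original);
            System.out.println(original); // [changed]
        }
    }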

General Risks

-Loss or underutilization of assets -Invalid or incorrect transactions -Loss of reputation -Need for compensating controls -Reduced system availability -Questionable integrity of information -Inability to satisfy regulatory/assurance requirements

Processing Controls

-Recalculations -Program design -Edit checks / run totals -Before / after comparisons -Segregation of duties -Transaction logs / audit trails -Error reporting/handling -Duplicate transaction handling -User training

Advantages of Big Data

-Scalability -Flexible schema and data types -Fast lookups -No single point of failure -Massive read/write performance -Fast prototyping and development -Easy maintenance

Output Controls

-User review -Reconciliation -Report distribution -Report receipt -Retention policies -System interfaces -Error/exception management

Manual process controls

Control activities performed without the assistance of applications or automated systems. -Examples: supervisory controls; written authorizations, such as a signature on a check; or manual tasks, such as reconciling purchase orders to goods receipt statements. -They carry an inherent risk of human error and, as a result, are often considered less reliable than automated controls.

Automated application controls

Controls that can be programmed and embedded within an application. -Examples: input edit checks that validate order quantities, and check digits that validate the correctness of bank account numbers.
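As a hedged illustration of a check-digit control: the Luhn algorithm below is the scheme used for payment card numbers; real bank account check-digit schemes vary by country, so this is only an analogous stand-in:

    public class CheckDigit {
        // Luhn check: from the right, double every second digit (subtracting 9
        // when the result exceeds 9); the total must be divisible by 10.
        static boolean luhnValid(String digits) {
            int sum = 0;
            boolean dbl = false;
            for (int i = digits.length() - 1; i >= 0; i--) {
                int d = digits.charAt(i) - '0';
                if (dbl) { d *= 2; if (d > 9) d -= 9; }
                sum += d;
                dbl = !dbl;
            }
            return sum % 10 == 0;
        }

        public static void main(String[] args) {
            System.out.println(luhnValid("79927398713")); // true (standard test number)
            System.out.println(luhnValid("79927398714")); // false
        }
    }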

Hybrid controls

Controls that consist of a combination of manual and automated activities, all of which must operate for the control to be effective. -Example: an order fulfillment process might include a control whereby the shipping manager reviews a report of unshipped orders. Both the automated activity (generation of a complete and accurate unshipped orders report) and the manual activity (review and follow-up by management) are necessary for the control activity to be effective. -A key risk is hybrid or computer-dependent controls being inappropriately identified as manual controls. Because all parts of a hybrid control must be effective, incorrectly identifying such controls creates a significant risk that key elements of the true control are not considered as part of the overall design effectiveness. -Example: if the review of the unshipped orders report in the previous example was incorrectly identified as a manual control, there is a risk that the design of controls ensuring the completeness and accuracy of the unshipped orders report may be overlooked.

Configurable controls

Typically, automated controls that are based on and, therefore, dependent on the configuration of parameters within the application system. -Example: a control in automated purchasing systems that allows orders only up to preconfigured authorization limits is dependent on controls over changes to those configurable authorization limits. -Application systems are heavily dependent on the configuration of various parameter tables. In these cases, it may be appropriate to consider the design of controls over the configuration tables as a separate element of the control design.

Data That needs to be "Treated with Care"

-Authentication data -Sensitive Financial Information -Proprietary Data -Export Controlled Materials -Federal Tax Information ("FTI") -Payment Card Information -Protected Personally Identifiable Information -Protected Health Information ("PHI") / Electronic Protected Health Information ("EPHI") -Controlled Technical Information ("CTI") -Data that falls under the Gramm-Leach-Bliley Act -Data that falls under state and federal data privacy laws

General Questions to ask during an ERP Audit

•If financial, does the system process according to GAAP (Generally Accepted Accounting Principles) and GAAS (Generally Accepted Auditing Standards)? •Does it meet the needs for reporting, whether regulatory or organizational? •Were adequate user requirements developed through meaningful interaction? •Does the system protect confidentiality and integrity of information assets? •Does it have controls to process only authentic, valid, accurate transactions? •Are effective system operations and support functions provided? •Are all system resources protected from unauthorized access and use? •Are user privileges based on what is called "role-based access?" •Is there an ERP system administrator with clearly defined responsibilities? •Is the functionality acceptable? Are user requirements met? Are users happy? •Have workarounds or manual steps been required to meet business needs? •Are there adequate audit trails and monitoring of user activities? •Can the system provide management with suitable performance data? •Are users trained? Do they have complete and current documentation? •Is there a problem-escalation process?

