CISSP-AppSecurity-11


Two Phase Commit

Ensures database integrity across distributed databases: a pre-commit is first sent to each database so that each acknowledges it is able to commit, and only then is the commit sent, ensuring the information is stored at the right time in the right places.
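
A minimal sketch of the coordinator's side of two-phase commit. The Participant interface and its prepare/commit/rollback method names are invented for illustration; real distributed transaction managers expose equivalent operations.

```java
import java.util.List;

// Hypothetical participant interface; real resource managers expose
// equivalent prepare/commit/rollback operations.
interface Participant {
    boolean prepare();   // phase 1: can this database commit?
    void commit();       // phase 2: make the change durable
    void rollback();     // abort if any participant cannot commit
}

class TwoPhaseCommitCoordinator {
    // Returns true only if every database acknowledged the pre-commit
    // and the final commit was then sent to all of them.
    static boolean runTransaction(List<Participant> databases) {
        for (Participant db : databases) {
            if (!db.prepare()) {                        // phase 1: pre-commit
                databases.forEach(Participant::rollback);
                return false;
            }
        }
        databases.forEach(Participant::commit);          // phase 2: commit everywhere
        return true;
    }
}
```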

Post-validation

Ensuring an application's output is consistent with expectations (that is, within predetermined constraints of reasonableness).

EJB

Enterprise JavaBeans is a structural design for the development and implementation of distributed applications written in Java. EJB provides interfaces and methods to allow different applications to be able to communicate across a networked environment. By using the Internet Inter-ORB Protocol (IIOP), the client portion does not have to be a program written in Java, but can be any valid CORBA client.

What is an advantage of content-dependent access control in databases?

Granular control.

Which action is not part of configuration management?

Submitting a formal request

What is the purpose of polyinstantiation?

To make a copy of an object and modify the attributes of the second copy

Expert systems are used to automate security log review for what purpose?

To detect intrusion

Which of the following are rows and columns within relational databases?

Tuples and attributes

Verification vs Validation

Verification determines if the product accurately represents and meets the specifications. Validation determines if the product provides the necessary solution for the intended real-world problem.

Online application systems that detect an invalid transaction should do which of the following?

Write a report to be reviewed.

Changes must be

authorized, tested, and recorded. The changes must not affect the security level of the system or its capability to enforce the security policy.

data modeling

considers data independently of the way the data are processed and of the components that process the data. A data model follows an input value from beginning to end and verifies that the output is correct.

Data Mining

process of massaging data from a data warehouse into more useful information.

Source code

translated into machine code, or object code, by compilers, assemblers, and interpreters.

self-garbling virus

tries to escape detection by changing, or garbling, its own code.

polymorphic virus

tries to escape detection by making copies of itself and modifying the code and attributes of those copies.

DCE uses

universal unique identifiers (UUIDs) to keep track of different subjects, objects, and resources. (Inference engine processing, automatic logical processing, and general problem-solving search methods are characteristics of expert systems, not of DCE.)

Pre-validation

Input controls verifying data is in appropriate format, and compliant with application specifications, prior to submission to the application. An example of this would be form field validation, where web forms do not allow letters in a field that is expecting to receive a number (currency) value.
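
A small pre-validation sketch for a currency form field; the class name and regular expression are illustrative assumptions, not from any particular framework.

```java
import java.util.regex.Pattern;

class CurrencyFieldValidator {
    // Accepts digits with an optional two-decimal fraction, e.g. "19.99".
    private static final Pattern CURRENCY = Pattern.compile("^\\d+(\\.\\d{2})?$");

    // Reject the value before it ever reaches the application logic.
    static boolean isValidAmount(String input) {
        return input != null && CURRENCY.matcher(input).matches();
    }
}
```

For example, isValidAmount("12.50") passes, while isValidAmount("12abc") is rejected prior to submission.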

Database views provide what type of security control?

Preventive

Polyinstantiation

Process of interactively producing more detailed versions of objects by populating variables with different values; used often to prevent inference attacks. Like the classic example of the top secret ship "Oklahoma," whose unclassified cover-story tuple shows it carrying *food* to Africa.

Which of the following replicates itself by attaching to other programs?

A virus

Which form of malware is designed to reproduce itself by utilizing system resources?

A worm

When should security first be addressed in a project?

During requirements development

When a database detects an error, what enables it to start processing at a designated place?

A checkpoint

Which of the following centrally controls the database and manages different aspects of the data?

A data dictionary

A view

an access control mechanism used in databases to ensure that only authorized subjects can access sensitive information.
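
A hedged sketch of using a view as a preventive control, issued through JDBC; the table, column, and role names are invented, and GRANT syntax varies by DBMS.

```java
import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Statement;

class ViewSetupExample {
    // Hypothetical table/column/role names; syntax varies by DBMS.
    static void createRestrictedView(Connection conn) throws SQLException {
        try (Statement stmt = conn.createStatement()) {
            // The view exposes only non-sensitive columns of EMPLOYEE.
            stmt.execute("CREATE VIEW employee_public AS "
                       + "SELECT emp_id, name, department FROM employee");
            // Subjects are granted access to the view, not the base table.
            stmt.execute("GRANT SELECT ON employee_public TO hr_readonly");
        }
    }
}
```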

Virus

an application that requires a host application for replication.

A good patch management process follows a structured six-step methodology.

1. Infrastructure
2. Research
3. Assess and Test
4. Mitigation and Rollback
5. Deployment
6. Validation

An application is downloaded from the Internet to perform disk cleanup and to delete unnecessary temporary files. The application is also recording network login data and sending it to another party. This application is best described as which of the following?

A Trojan horse

structured analysis approach

A full-structured analysis approach looks at all objects and subjects of an application and maps the interrelationships, communications paths, and inheritance properties

Which of the following is used in the Distributed Computing Environment technology?

A universal unique identifier (UUID)

What is the final phase of the system development life cycle?

Accreditation

ADO

ActiveX Data Objects - An API that allows applications to access back-end database systems. It is a set of ODBC interfaces that exposes the functionality of data sources through accessible objects. ADO uses the OLE DB interface to connect with the database and can be developed with many different scripting languages. It is commonly used in web applications and other client/server applications. The following are some characteristics of ADO:
• It's a high-level data access programming interface to an underlying data access technology (such as OLE DB).
• It's a set of COM objects for accessing data sources, not just database access.
• It allows a developer to write programs that access data, without knowing how the database is implemented.
• SQL commands are not required to access a database when using ADO.

Java Database Connectivity (JDBC)

An API that allows a Java application to communicate with a database. The application can bridge through ODBC or directly to the database. The following are some characteristics of JDBC:
• It is an API that provides the same functionality as ODBC but is specifically designed for use by Java database applications.
• It has database-independent connectivity between the Java platform and a wide range of databases.
• JDBC is a Java API that enables Java programs to execute SQL statements.
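
A minimal JDBC usage sketch; the connection URL, credentials, and table are placeholders, and a suitable JDBC driver is assumed to be on the classpath.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

class JdbcExample {
    static void printEmployeeName(int id) throws SQLException {
        // Placeholder URL and credentials for illustration only.
        String url = "jdbc:postgresql://localhost:5432/hrdb";
        try (Connection conn = DriverManager.getConnection(url, "app_user", "secret");
             PreparedStatement ps = conn.prepareStatement(
                     "SELECT name FROM employee WHERE emp_id = ?")) {
            ps.setInt(1, id);                      // bound parameter, not string concatenation
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    System.out.println(rs.getString("name"));
                }
            }
        }
    }
}
```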

If security was not part of the development of a database, how is it usually handled?

By a trusted front end

CORBA

Common Object Request Broker Architecture is an open object-oriented standard architecture that provides interoperability among the vast array of software, platforms, and hardware in environments today. It enables applications to communicate with one another no matter where the applications are located or who developed them, and provides standards to build a complete distributed environment.

CASE

Computer-aided software engineering (CASE) involves the use of tools to create and manage software. A CASE tool is aimed at supporting one or more software engineering tasks and activities in the process of developing software. It applies engineering principles to the development and analysis of specifications using specific tools.

If one department can view employees' work history and another group cannot view their work history, what is this an example of?

Content-dependent access control

Expert systems use each of the following items except for...

Cycle-based reasoning

Software escrow

In a software escrow, a third party keeps a copy of the source code, and possibly other materials, which it will release to the customer only if specific circumstances arise, mainly if the vendor who developed the code goes out of business or for some reason fails to fulfill its obligations.

Project Initiation

In this phase, user needs are identified and the basic security objectives of the product are acknowledged. It must be determined if the product will be processing sensitive data, and if so, the levels of sensitivity involved should be defined. Issues pertaining to confidentiality, integrity, and availability need to be addressed. A basic security framework is designed for the project to follow, and risk management processes are established.

What is a disadvantage of using context-dependent access control on databases?

It increases processing and resource overhead.

Which best describes a logic bomb?

It's an action triggered by a specified condition.

KDD

Knowledge Discovery in Database - also known as Data Mining.

OOA and OOD

Object-oriented analysis (OOA) is the process of classifying objects that will be appropriate for a solution. A problem is analyzed to determine the classes of objects to be used in the application. Object-oriented design (OOD) creates a representation of a real-world problem and maps it to a software solution using OOP. The result of an OOD is a design that modularizes data and procedures. The design interconnects data objects and processing operations.

Which of the following is used to deter database inference attacks?

Partitioning, cell suppression, and noise and perturbation

Common phases of system and software development along with the core security tasks that take place at each phase

Project initiation
• Conception of project definition
• Proposal and initial study
• Initial risk analysis

Functional design analysis and planning
• Requirements uncovered and defined
• System environment specifications determined
• Formal design created

System design specifications
• Functional design review
• Functionality broken down
• Detailed planning put into place
• Code design

Software development
• Developing and programming software

Installation
• Product installation and implementation
• Testing and auditing

Maintenance support
• Product changes, fixes, and minor modifications

Disposal
• Replace product with new product

What is the final stage in the change control management process?

Report the change to management.

A system has been patched many times and has recently become infected with a dangerous virus. If antivirus software indicates that disinfecting a file may damage it, what is the correct action?

Restore an uninfected version of the patched file from backup media.

Database software performs three main types of integrity services

Semantic, referential, and entity

Checkpoints

Similar to Savepoints. When the database software fills up a certain amount of memory, a checkpoint is initiated, which saves the data to a temp file.

CORBA contains two main parts

System-oriented components (object request brokers [ORBs] and object services) and application-oriented components (application objects and common facilities). The ORB manages all communications between components and enables them to interact in a heterogeneous and distributed environment. ORB is the middleware that establishes the client/server relationship between objects.

Abstraction

The capability to suppress unnecessary details so the important, inherent properties can be examined and reviewed. It enables the separation of conceptual aspects of a system.

What is the importance of inference in an expert system?

The knowledge base contains facts, but must also be able to combine facts to derive new information and solutions.

Why are macro viruses so prevalent?

The languages used to write macros are very easy to use.

Relational vs object-oriented database

The relational database does not actually provide procedures, as object-oriented databases do. The object-oriented database has classes to define the attributes and procedures of its objects.

Security testing

a comprehensive analysis technique that tests programs under artificially created attack scenarios.

Most databases have

a data definition language (DDL), a data manipulation language (DML), a query language (QL), and a report generator.

In relational database terminology

a database row is called a tuple.

An expert system uses

a knowledge base full of facts, rules of thumb, and expert advice. It also has an inference engine that matches facts against patterns and determines which rules are to be applied.

Coupling

a measurement that indicates how much interaction one module requires to carry out its tasks. If a module has low (loose) coupling, this means the module does not need to communicate to many other modules to carry out its job. High (tight) coupling means a module depends upon many other modules to carry out its tasks.
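
An illustrative contrast between tight and loose coupling; all class names are invented for the example.

```java
// Invented classes for illustration only.
class InventoryDb  { String loadItems()         { return "items"; } }
class HtmlRenderer { String render(String data) { return "<p>" + data + "</p>"; } }
class EmailGateway { void send(String body)     { System.out.println("sent: " + body); } }

// High (tight) coupling: this module must know about three other concrete
// modules to carry out its task.
class TightlyCoupledReporter {
    void report(InventoryDb db, HtmlRenderer renderer, EmailGateway mail) {
        mail.send(renderer.render(db.loadItems()));
    }
}

// Low (loose) coupling: the module talks to one narrow interface and nothing else.
interface ReportSink {
    void deliver(String report);
}

class LooselyCoupledReporter {
    void report(String data, ReportSink sink) {
        sink.deliver(data);
    }
}
```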

Object-oriented design represents

a real-world problem and modularizes the problem into cooperating objects that work together to solve the problem.

data structure

a representation of the logical relationship between elements of data. It dictates the degree of association between elements, methods of access, processing alternatives, and the organization of data elements.

Java security employs

a sandbox so the applet is restricted from accessing the user's hard drive or system resources. Programmers have figured out how to write applets that escape the sandbox.

Iterative development

a software development method that follows a cyclic approach to software development.

Waterfall development

a classical software development method that uses discrete phases of development requiring formal reviews and documentation before moving into the next phase of the project.

Spiral development

a software development method that builds upon the waterfall method with an emphasis on risk analysis, prototypes, and simulations at different phases of the development cycle. This method periodically revisits previous stages to update and verify design requirements.

OOA is an example of...

a structured analysis approach

Aggregation can happen if

a user does not have access to a group of elements, but has access to some of the individual elements within the group. Aggregation happens if the user combines the information of these individual elements and figures out the information of the group of data elements, which is at a higher sensitivity level.

Polyinstantiation is the process of

allowing a table to have multiple rows with the same primary key. The different instances can be distinguished by their security levels or classifications.
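
A hedged in-memory illustration (Java 16+ records) of two tuples sharing the same primary key but distinguished by classification level; the record type, field names, and values are invented.

```java
import java.util.List;

// Invented record type: the same primary key ("Oklahoma") exists at two
// classification levels, so a subject cleared only for UNCLASSIFIED sees
// only the cover-story tuple.
record ShipMission(String shipName, String cargo, String destination, String classification) {}

class PolyinstantiationExample {
    public static void main(String[] args) {
        List<ShipMission> table = List.of(
            new ShipMission("Oklahoma", "weapons", "undisclosed", "TOP SECRET"),
            new ShipMission("Oklahoma", "food",    "Africa",      "UNCLASSIFIED"));

        // The lower-cleared subject is served only the unclassified instance.
        table.stream()
             .filter(row -> row.classification().equals("UNCLASSIFIED"))
             .forEach(row -> System.out.println(row.shipName() + " carries " + row.cargo()));
    }
}
```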

SOAP

Simple Object Access Protocol; allows programs created with different programming languages and running on different operating systems to interact without compatibility issues.

High-level programming languages

are translated into machine languages for the system and its processor to understand.

Smurf and Fraggle

are two examples of DoS attacks that take advantage of protocol flaws and use amplifying networks.

Artificial neural networks (ANNs)

attempt to mimic a brain by using units that react like neurons.

Database integrity is provided by

concurrency mechanisms. One concurrency control is locking, which prevents users from accessing and modifying data being used by someone else.

Multipart viruses

can have one part of the virus in the boot sector and another part of the virus on the hard drive.

A rollback

cancels changes and returns the database to its previous state. This takes place if there is a problem during a transaction.

Common attempts to prevent inference attacks

cell suppression, partitioning the database, and noise and perturbation

Data warehousing

combines data from multiple databases and data sources.

Macro Viruses

common because the languages used to develop macros are easy to use and they infect Office products, which are everywhere.

Attack surface analytics

Performed once a threat model has been created. Attack surface analysis techniques provide a structured process for analyzing a program's entry points, documenting possible entry points irrespective of their defined privileges.

Certification

deals with testing and assessing the security mechanism in a system, while accreditation pertains to management formally accepting the system and its associated risk.

Capability Maturity Model (CMM)

describes procedures, principles, and practices that underlie software development process maturity. The model was developed to help software vendors improve their development processes; it provides policies, procedures, guidelines, and best practices that allow an organization to develop a standardized approach to software development, use it across many different groups, and assess a vendor's development consistency and quality.

object-oriented database

designed to handle a variety of data (images, audio, documents, video). An object-oriented database management system (ODBMS) is more dynamic in nature than a relational database, because objects can be created when needed and the data and procedure (called method) go with the object when it is requested.

Systems and applications can use different development models that utilize

different life cycles, but all models contain project initiation, functional design analysis and planning, system design specifications, software development, installation, operations and maintenance, and disposal in some form or fashion.

A programmer should not have

direct access to code in production. This is an example of separation of duties.

Partitioning

dividing the database into different parts to make it harder for an attacker to put the pieces together

A worm

does not require a host application to replicate.

Object linking and embedding (OLE)

enables a program to call another program (linking) and permits a piece of data to be inserted inside another program or document (embedding).

Dynamic Data Exchange (DDE)

enables applications to work in a client/server model by providing the interprocess communication (IPC) mechanism.

The best programming design...

enables objects to be as independent and modular as possible; therefore, the higher the cohesion and the lower the coupling, the better.

Open Database Connectivity (ODBC)

enables several different applications to communicate with several different types of databases by calling the required driver and passing data through that driver.

OLTP

Online transaction processing; ensures transactions happen properly across all databases, or they don't happen at all.

Entity integrity

ensures tuples are uniquely identified by primary key values.

A logic bomb

executes a program when a predefined event takes place, or a date and time are met.

A Method

functionality that an object can carry out.

Data and operations internal to objects are...

hidden from other objects, which is referred to as data hiding. Each object encapsulates its data and processes.
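
A small data-hiding sketch; the Account class and its fields are invented for illustration.

```java
// Data internal to the object is private (hidden); other objects can only
// interact with it through the public methods the object exposes.
class Account {
    private double balance;   // hidden state: not directly reachable from outside

    void deposit(double amount) {
        if (amount > 0) {
            balance += amount;
        }
    }

    double getBalance() {
        return balance;
    }
}
```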

Stealth virus

hides its tracks and its actions

A database primary key

how a specific row is uniquely identified and can be referenced from other parts of the database.

A checkpoint is used

if there is a system failure or problem during a transaction. The user is then returned to the state of the last checkpoint.

Security should be addressed...

in each phase of system development. It should not be addressed only at the end of development, because of the added cost, time, and effort and the lack of functionality.

The two largest security problems associated with database security

inference and aggregation

A data dictionary

is a central repository that describes the data elements within a database and their relationships. A data dictionary contains data about a database, which is called metadata.

A Trojan horse

is a program that performs useful functionality along with malicious functionality, without the user knowing it.

Inference

is the capability to derive information that is not explicitly available.

If an application fails

it should go directly to a secure state.

ANNs can

learn from experiences and can match patterns that regular programs and systems cannot.

If an object does not require much interaction with other modules, it has

low coupling

Semantic integrity

makes sure structural rules are enforced

Entity integrity

makes sure that a row, or tuple, is uniquely identified by a primary key, and referential integrity ensures that every foreign key refers to an existing primary key.
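
A hedged DDL sketch, issued through JDBC, showing a primary key enforcing entity integrity and a foreign key enforcing referential integrity; the table names are invented and exact syntax varies by DBMS.

```java
import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Statement;

class IntegrityConstraintsExample {
    static void createTables(Connection conn) throws SQLException {
        try (Statement stmt = conn.createStatement()) {
            // Entity integrity: every tuple is uniquely identified by dept_id.
            stmt.execute("CREATE TABLE department ("
                       + "  dept_id INT PRIMARY KEY,"
                       + "  name    VARCHAR(100) NOT NULL)");
            // Referential integrity: employee.dept_id must reference an existing department.
            stmt.execute("CREATE TABLE employee ("
                       + "  emp_id  INT PRIMARY KEY,"
                       + "  name    VARCHAR(100) NOT NULL,"
                       + "  dept_id INT REFERENCES department(dept_id))");
        }
    }
}
```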

An object request broker (ORB)

manages communications between objects and enables them to interact in a heterogeneous and distributed environment.

Objects

members, or instances, of classes. The classes dictate the objects' data types, structure, and acceptable actions. They communicate with each other through messages.

If proper design for a product is not put into place in the beginning

more effort will have to take place in the implementation, testing, and maintenance phases.

Change control

needs to be put in place at the beginning of a project and must be enforced through each phase.

Common attempts to prevent inference attacks

partitioning the database, cell suppression, and adding noise to the database.

To prevent aggregation

prevent the user or process from accessing the individual components of the whole information.

Data-mining tools

produce metadata, which can contain previously unseen relationships and patterns.

Common Object Request Broker Architecture (CORBA)

provides a standardized way for objects within different applications, platforms, and environments to communicate. It accomplishes this by providing standards for interfaces between objects.

Component Object Model (COM)

provides an architecture for components to interact on a local system. Distributed COM (DCOM) uses the same interfaces as COM, but enables components to interact over a distributed, or networked, environment.

Object-oriented programming

provides modularity, reusability, and more granular control within the programs themselves.

Distributed Computing Environment (DCE)

provides much of the same functionality as DCOM, which enables different objects to communicate in a networked environment.

Cohesion

reflects how many different types of tasks a module can carry out. If a module carries out only one task (subtraction of values) or several tasks that are very similar (subtract, add, multiply), it is described as having high cohesion, which is a good thing. The higher the cohesion, the easier it is to update or modify and not affect other modules that interact with it. This also means the module is easier to reuse and maintain because it is more straightforward when compared to a module with low cohesion. A module with low cohesion carries out multiple different tasks and increases the complexity of the module, which makes it harder to maintain and reuse.
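
An illustrative contrast between a high-cohesion module and a low-cohesion one; the class names are invented.

```java
// High cohesion: the module does one kind of task (simple arithmetic),
// so it is easy to reuse and to modify without affecting other modules.
class Arithmetic {
    int add(int a, int b)      { return a + b; }
    int subtract(int a, int b) { return a - b; }
    int multiply(int a, int b) { return a * b; }
}

// Low cohesion: unrelated responsibilities are bundled together,
// which makes the module harder to maintain and reuse.
class KitchenSinkUtility {
    int add(int a, int b)           { return a + b; }
    String formatInvoice(String id) { return "Invoice #" + id; }
    void logToConsole(String msg)   { System.out.println(msg); }
}
```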

Separation of duties should be practiced in

roles, environments, and functionality pertaining to the development of a product.

Content-dependent access control

rules based on the sensitivity of data to prevent aggregation and inference.

Context-dependent access control

software control that keeps track of a user's sequence of access steps and may or may not allow access to the data based on rules.

Risk management and assessments should

start at the beginning of a project and continue throughout the lifetime of the product.

Distributed Component Object Model (DCOM)

supports the same model for component interaction, and also supports distributed IPC. COM enables applications to use components on the same systems, while DCOM enables applications to access objects that reside in different parts of a network. This is how client/server-based activities are carried out by COM-based operating systems and/or applications.

A commit statement

terminates a transaction and saves all changes to the database.

Aggregation

the act of combining information from separate sources to which you do have access, even though you do not have access to all of the information together - the combined information is more sensitive than the individual parts.

Data mining

the process of massaging data held within a data warehouse to provide more useful information to users.

A database management system (DBMS)

the software that controls the access restrictions, data integrity, redundancy, and the different types of manipulation available for a database.

Noise and Perturbation

the technique of inserting bogus information to mislead an attacker

Cell suppression

the technique used to hide specific cells that contain information

Objects can communicate properly because...

they use standard interfaces.

Inference

deducing the whole body of information from the individual components gathered through aggregation, even though that information was not explicitly available.

Savepoints

used to ensure that, if a system failure occurs, the database can return to a point before the crash.

Expert systems

used to mimic human reasoning and replace human experts.

ActiveX

uses a security scheme that includes digital signatures. The browser security settings determine how ActiveX controls are dealt with.

A hierarchical database

uses a tree-like structure to define relationships between data elements, using a parent/child relationship.

A relational database

uses two-dimensional tables with rows (tuples) and columns (attributes).

Information Gathering

usually the first step in an attacker's methodology. Information gathered may allow an attacker to infer additional information that can be used to compromise systems.

Boot sector virus

overwrites data in the boot sector and can contain the rest of the virus in a sector it marks as "bad."

Referential integrity

ensured when every foreign key references an existing primary key (an existing record).

Polymorphism

when different objects are given the same input and react differently.
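
A short polymorphism sketch; the SecurityControl interface and its implementations are invented for illustration.

```java
// Different objects receive the same input (the alert() call) and react differently.
interface SecurityControl {
    void alert(String event);
}

class Firewall implements SecurityControl {
    public void alert(String event) { System.out.println("Blocking traffic for: " + event); }
}

class AuditLogger implements SecurityControl {
    public void alert(String event) { System.out.println("Writing log entry for: " + event); }
}

class PolymorphismExample {
    public static void main(String[] args) {
        SecurityControl[] controls = { new Firewall(), new AuditLogger() };
        for (SecurityControl c : controls) {
            c.alert("failed login");   // same message, different behavior per object
        }
    }
}
```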

Parameter validation

where the values that are being received by the application are validated to be within defined limits before the server application processes them within the system. The main difference between parameter validation and input validation would have to be whether the application was expecting the user to input a value as opposed to an environment variable that is defined by the application.

ACID Test

• Atomicity Divides transactions into units of work and ensures that all modifications take effect or none takes effect. Either the changes are committed or the database is rolled back.
• Consistency A transaction must follow the integrity policy developed for that particular database and ensure all data are consistent in the different databases.
• Isolation Transactions execute in isolation until completed, without interacting with other transactions. The results of the modification are not available until the transaction is completed.
• Durability Once the transaction is verified as accurate on all systems, it is committed, and the databases cannot be rolled back.
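
A hedged JDBC sketch of the atomicity property from the list above: both updates commit together or the transaction is rolled back. The account table, column names, and transfer scenario are invented.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

class TransferExample {
    // Both updates take effect, or neither does (atomicity).
    static void transfer(Connection conn, int fromAcct, int toAcct, double amount) throws SQLException {
        conn.setAutoCommit(false);
        try (PreparedStatement debit = conn.prepareStatement(
                 "UPDATE account SET balance = balance - ? WHERE acct_id = ?");
             PreparedStatement credit = conn.prepareStatement(
                 "UPDATE account SET balance = balance + ? WHERE acct_id = ?")) {
            debit.setDouble(1, amount);
            debit.setInt(2, fromAcct);
            debit.executeUpdate();
            credit.setDouble(1, amount);
            credit.setInt(2, toAcct);
            credit.executeUpdate();
            conn.commit();               // durable once committed
        } catch (SQLException e) {
            conn.rollback();             // return the database to its previous state
            throw e;
        }
    }
}
```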

Three approaches used in KDD systems

• Classification Groups together data according to shared similarities.
• Probabilistic Identifies data interdependencies and applies probabilities to their relationships.
• Statistical Identifies relationships between data elements and uses rule discovery.

Software requirements come from three models

• Informational model Dictates the type of information to be processed and how it will be processed
• Functional model Outlines the tasks and functions the application needs to carry out
• Behavioral model Explains the states the application will be in during and after specific transitions take place

CMM Staged Maturity Levels

• Initial Development process is ad hoc or even chaotic. The company does not use effective management procedures and plans. There is no assurance of consistency, and quality is unpredictable.
• Repeatable A formal management structure, change control, and quality assurance are in place. The company can properly repeat processes throughout each project. The company does not have formal process models defined.
• Defined Formal procedures are in place that outline and define processes carried out in each project. The organization has a way to allow for qualitative process improvement.
• Managed The company has formal processes in place to collect and analyze quantitative data, and metrics are defined and fed into the process-improvement program.
• Optimizing The company has budgeted and integrated plans for continuous process improvement.

Malware elements

• Insertion Installs itself on the victim's system
• Avoidance Uses methods to avoid being detected
• Eradication Removes itself after the payload has been executed
• Replication Makes copies of itself and spreads to other victims
• Trigger Uses an event to initiate its payload execution
• Payload Carries out its function (that is, deletes files, installs a backdoor, exploits a vulnerability, and so on)

Phases used for system and application development models include...

• Project initiation
• Functional design analysis and planning
• System design specifications
• Software development
• Installation/implementation
• Operational/maintenance
• Disposal

name the list of database terms

• Record A collection of related data items.
• File A collection of records of the same type.
• Database A cross-referenced collection of data.
• DBMS Manages and controls the database.
• Tuple A row in a two-dimensional database.
• Attribute A column in a two-dimensional database.
• Primary key Columns that make each row unique. (Every row of a table must include a primary key.)
• View A virtual relation defined by the database administrator in order to keep subjects from viewing certain data.
• Foreign key An attribute of one table that is related to the primary key of another table.
• Cell An intersection of a row and column.
• Schema Defines the structure of the database.
• Data dictionary Central repository of data elements and their relationships.

Some of the most common testing approaches

• Unit testing Individual components are tested in a controlled environment where programmers validate data structure, logic, and boundary conditions.
• Integration testing Verifying that components work together as outlined in design specifications.
• Acceptance testing Ensuring that the code meets customer requirements.
• Regression testing After a change to a system takes place, retesting to ensure functionality, performance, and protection.
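
A minimal unit-test sketch for the first approach above, assuming JUnit 5 (org.junit.jupiter) is available; the component under test is an invented boundary check.

```java
import static org.junit.jupiter.api.Assertions.assertFalse;
import static org.junit.jupiter.api.Assertions.assertTrue;

import org.junit.jupiter.api.Test;

class BoundaryCheckerTest {
    // Invented component under test: accepts values from 0 to 100 inclusive.
    static boolean inRange(int value) {
        return value >= 0 && value <= 100;
    }

    @Test
    void acceptsBoundaryValues() {
        assertTrue(inRange(0));
        assertTrue(inRange(100));
    }

    @Test
    void rejectsOutOfRangeValue() {
        assertFalse(inRange(101));
    }
}
```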

