Chapter 20 Software Development Security


Concurrency

Concurrency, or edit control, is a preventive security mechanism that endeavors to make certain that the information stored in the database is always correct, or at least has its integrity and availability protected. This feature can be employed on a single-level or multilevel database.

Consistency

All transactions must begin operating in an environment that is consistent with all of the database's rules. When the transaction is complete, the database must again be consistent with the rules, regardless of whether those rules were violated during the processing of the transaction itself. No other transaction should ever be able to use any inconsistent data that might be generated during the execution of another transaction.

Durability

Database transactions must be durable. That is, once they are committed to the database, they must be preserved. Databases ensure durability through the use of backup mechanisms such as transaction logs.

Atomicity

Database transactions must be atomic - that is, they must be an "all-or-nothing" affair. If any part of the transaction fails, the entire transaction must be rolled back as if it never occurred.
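
A minimal sketch of atomic rollback using Python's built-in sqlite3 module (the table, names, and balances are hypothetical):

```python
import sqlite3

# In-memory database with a simple accounts table (illustrative schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
conn.execute("INSERT INTO accounts VALUES ('alice', 100), ('bob', 50)")
conn.commit()

# A funds transfer must be all-or-nothing: the debit and credit succeed together.
try:
    conn.execute("UPDATE accounts SET balance = balance - 70 WHERE name = 'alice'")
    # The second statement fails (misspelled table name), so the whole
    # transaction is rolled back as if it never occurred.
    conn.execute("UPDATE acounts SET balance = balance + 70 WHERE name = 'bob'")
    conn.commit()
except sqlite3.OperationalError:
    conn.rollback()  # the partial debit is undone

balance = conn.execute(
    "SELECT balance FROM accounts WHERE name = 'alice'").fetchone()[0]
print(balance)  # 100
```

After the rollback, Alice's balance is unchanged, exactly as if the failed transfer had never been attempted.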

continuous integration/continuous delivery

Organizations using the DevOps model often deploy code several times per day. Some organizations even strive to reach the goal of CI/CD, where code may roll out dozens or even hundreds of times per day. This requires a high degree of automation, including integrating code repositories, the software configuration management process, and the movement of code between development, testing, and production environments.

reasonableness check

Ensures that values returned by software match specified criteria that are within reasonable bounds. For example, a routine that calculated the optimal weight for a human being and returned a value of 612 pounds would certainly fail a reasonableness check.
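
The weight example above can be sketched as a simple check; the bounds are hypothetical:

```python
def reasonable_weight(pounds: float) -> bool:
    # Hypothetical bounds for an adult human weight, in pounds.
    return 1 <= pounds <= 500

print(reasonable_weight(180))  # True
print(reasonable_weight(612))  # False -- fails the reasonableness check
```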

inheritance

Inheritance occurs when methods from a class (parent or superclass) are inherited by another subclass (child) or object.
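
A short illustration of a subclass inheriting a method from its superclass (the account classes are invented for the example):

```python
class Account:  # parent (superclass)
    def __init__(self, balance):
        self.balance = balance

    def deposit(self, amount):
        self.balance += amount

class SavingsAccount(Account):  # child (subclass) inherits deposit()
    def add_interest(self, percent):
        # Integer interest for a clean example.
        self.balance += self.balance * percent // 100

s = SavingsAccount(100)
s.deposit(50)        # method inherited from Account
s.add_interest(10)   # method defined on the subclass
print(s.balance)     # 165
```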

The Open Web Application Security Project (OWASP) Secure Coding Practices suggest logging the following events:

- input validation failures
- authentication attempts, especially failures
- access control failures
- tampering attempts
- use of invalid or expired session tokens
- exceptions raised by the operating system or applications
- use of administrative privileges
- transport layer security failures
- cryptographic errors

Primary keys

selected from the set of candidate keys for a table to be used to uniquely identify the records in a table. Each table has only one primary key, selected by the database designer from the set of candidate keys. The RDBMS enforces the uniqueness of primary keys by disallowing the insertion of multiple records with the same primary key.
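
A sketch of the RDBMS enforcing primary key uniqueness, using sqlite3 with an invented employees table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# employee_id is the primary key chosen from the table's candidate keys.
conn.execute("CREATE TABLE employees (employee_id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO employees VALUES (1, 'Alice')")

duplicate_rejected = False
try:
    # The RDBMS disallows a second record with the same primary key.
    conn.execute("INSERT INTO employees VALUES (1, 'Bob')")
except sqlite3.IntegrityError:
    duplicate_rejected = True

print(duplicate_rejected)  # True
```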

cell suppression

The concept of hiding individual database fields or cells or imposing more security restrictions on them.

database contamination

Mixing data with different classification levels and/or need-to-know requirements

keys (Relational Databases)

Records are identified using a variety of keys, which are a subset of the fields of a table and are used to uniquely identify the records. They are also used to join tables when you wish to cross-reference information.

The Change Management Process has three basic components:

- Request Control
- Change Control
- Release Control

Systems Development Lifecycle Phases

- Conceptual definition
- Functional requirements determination
- Control specifications development
- Design review
- Coding
- Code review walk-through
- System test review
- Maintenance and change management

DevOps Approach

- Seeks to resolve issues of software development, quality assurance, and technology operations by bringing the three functions together in a single operational model.
- The name symbolizes that the functions must merge and cooperate to meet business requirements.
- Closely aligned with the Agile development approach.
- Aims to dramatically decrease the time required to develop, test, and deploy software changes.

The tight integration of development and operations also calls for the simultaneous integration of security controls. If code is being rapidly developed and moved into production, security must also move with that same agility.

Issues commonly addressed in SLAs

- System uptime (as a percentage of overall operating time)
- Maximum consecutive downtime (in seconds/minutes/and so on)
- Peak load
- Average load
- Responsibility for diagnostics
- Failover time (if redundancy is in place)

SLAs also commonly include financial and other contractual remedies that kick in if the agreement is not maintained. In these situations, the service provider and customer both carefully monitor performance metrics to ensure compliance with the SLA.

Compiler

A program that translates code in a high-level language (such as Java) to machine instructions (such as bytecode for the Java virtual machine). A tool used to convert source code from a higher-level language into an executable file designed for use on a specific operating system.

Candidate Key

A subset of attributes, columns, or fields that can be used to uniquely identify any record in a table. No two records in the same table will ever contain the same values for all attributes composing a candidate key. Each table may have one or more candidate keys.

Expert Systems

Computerized advisory programs that imitate the reasoning processes of experts in solving difficult problems. Every expert system has two main components: the knowledge base and the inference engine. The knowledge base contains the rules known by an expert system; it seeks to codify the knowledge of human experts in a series of "if/then" statements. The inference engine analyzes information in the knowledge base to arrive at the appropriate decision. The expert system user employs some sort of user interface to provide the inference engine with details about the current situation, and the inference engine uses a combination of logical reasoning and fuzzy logic techniques to draw a conclusion based on past experience. Expert systems are not infallible; they're only as good as the data in the knowledge base and the decision-making algorithms implemented in the inference engine. However, they have one major advantage in stressful situations: their decisions do not involve judgment clouded by emotion.

Security vs User Friendly vs Functionality

Even when security is properly designed and embedded in software, that security is often disabled in order to support easier installation. Thus, it is common for the IT administrator to have the responsibility of turning on and configuring security to match the needs of their specific environment. Maintaining security is often a trade-off with user-friendliness and functionality. Additionally, as you add or increase security, you will also increase costs, increase administrative overhead, and reduce productivity/throughput.

White Box Testing

Examines the internal logical structures of a program and steps through the code line by line, analyzing the program for potential errors. The key attribute of this test is that the testers have access to the source code.

Black Box Testing

Examines the program from a user perspective by providing a wide variety of input scenarios and inspecting the output. These testers do not have access to the internal code. Final acceptance testing that occurs prior to system delivery is a common example of this type of testing.

Content-dependent access control

An example of granular object control. It is based on the contents or payload of the object being accessed. Because decisions must be made on an object-by-object basis, content-dependent control increases processing overhead.

semantic integrity

Ensures that user actions don't violate any structural rules. It also checks that all stored data types are within valid domain ranges, ensures that only logical values exist, and confirms that the system complies with any and all uniqueness constraints.
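
These rules can be declared in the schema so that the database enforces them itself. A sketch using sqlite3 with invented constraints (a UNIQUE email and a grade restricted to 0-100):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Domain range and uniqueness rules enforced by the database itself.
conn.execute("""
    CREATE TABLE students (
        student_id INTEGER PRIMARY KEY,
        email TEXT UNIQUE,                            -- uniqueness constraint
        grade INTEGER CHECK (grade BETWEEN 0 AND 100) -- valid domain range
    )
""")
conn.execute("INSERT INTO students VALUES (1, 'a@example.com', 95)")

violations = 0
for row in [(2, 'a@example.com', 80),    # duplicate email
            (3, 'b@example.com', 150)]:  # grade outside the valid range
    try:
        conn.execute("INSERT INTO students VALUES (?, ?, ?)", row)
    except sqlite3.IntegrityError:
        violations += 1

print(violations)  # 2 -- both rule violations rejected
```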

Input Validation

The process that verifies that the values provided by a user match the programmer's expectation before allowing further processing. If an input value falls outside the specified range, the program will not try to process it and will inform the user of the input expectations. Input validation also may check for unusual characters, such as quotation marks within a text field, which may be indicative of an attack. In some cases, the input validation routine can transform the input to remove risky character sequences and replace them with safe values. This process, known as escaping input, is performed by replacing occurrences of sensitive characters with alternative code that will render the same to the end user but will not be executed by the system.
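
A minimal validation routine along these lines; the field name, bounds, and error messages are hypothetical:

```python
import re

def validate_age(value: str) -> int:
    # Reject anything that is not a plain decimal number (this also
    # catches unusual characters such as quotation marks).
    if not re.fullmatch(r"\d{1,3}", value):
        raise ValueError("age must be a number between 0 and 120")
    age = int(value)
    # Range check against the programmer's expectation.
    if not 0 <= age <= 120:
        raise ValueError("age must be between 0 and 120")
    return age

print(validate_age("34"))  # 34
try:
    validate_age("34'; DROP TABLE users;--")  # suspicious input rejected
except ValueError as e:
    print("rejected:", e)
```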

Behavior

the results or output exhibited by an object. They are the results of a message being processed through a method

NoSQL

A new generation of database management systems that is not based on the traditional relational database model.

Code repositories

Act as a central storage point for developers to place their source code. In addition, code repositories such as GitHub, Bitbucket, and SourceForge also provide version control, bug tracking, web hosting, release management, and communications functions that support software development. Code repositories are often integrated with popular code management tools. For example, the git tool is popular among many software developers, and it is tightly integrated with GitHub and other repositories. Repository owners must carefully design access controls to only allow appropriate users read and/or write access. Improperly granting users read access may allow unauthorized individuals to retrieve sensitive information, whereas improperly granting write access may allow unauthorized tampering with code.

spiral model

Because this model encapsulates a number of iterations of another model (the waterfall model), it is known as a metamodel, or a "model of models." Each loop of the spiral represents the development of a new system prototype. Theoretically, system developers would apply the entire waterfall process to the development of each prototype, thereby incrementally working toward a mature system that incorporates all the functional requirements in a fully validated fashion. Barry Boehm's model provides a solution to the major criticism of the waterfall model: it allows developers to return to the planning stages as changing technical demands and customer requirements necessitate the evolution of a system. The waterfall model focuses on a large-scale effort to deliver a finished system, whereas the spiral model focuses on iterating through a series of increasingly "finished" prototypes that allow for enhanced quality control.

Agile Software Development

Beginning in the 1990s, developers increasingly embraced approaches to software development that eschewed the rigid models of the past in favor of approaches that placed an emphasis on the needs of the customer and on quickly developing new functionality that meets those needs in an iterative fashion. The Agile Manifesto values:

- Individuals and interactions over processes and tools
- Working software over comprehensive documentation
- Customer collaboration over contract negotiation
- Responding to change over following a plan

Agile is a philosophy and not a specific methodology. Several specific methodologies have emerged that take these Agile principles and define ways to implement them: Scrum, Kanban, Rapid Application Development (RAD), Agile Unified Process (AUP), the Dynamic Systems Development Model (DSDM), and Extreme Programming (XP).

Concurrency as a preventive control

Concurrency uses a "lock" feature to allow one user to make changes but deny other users the ability to view or make changes to data elements at the same time. Then, after the changes have been made, an "unlock" feature restores the ability of other users to access the data they need. In some instances, administrators will use concurrency with auditing mechanisms to track document and/or field changes. When this recorded data is reviewed, concurrency becomes a detective control.
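
The lock/unlock cycle can be sketched with a simple mutual exclusion lock; the inventory scenario is borrowed from the lost-update example elsewhere in these notes:

```python
import threading

inventory = {"cissp_guide": 10}
lock = threading.Lock()  # the "lock" feature: one writer at a time

def receive_copy():
    with lock:  # acquire the lock; other stations must wait
        current = inventory["cissp_guide"]
        inventory["cissp_guide"] = current + 1
    # leaving the with-block is the "unlock" step

stations = [threading.Thread(target=receive_copy) for _ in range(2)]
for t in stations:
    t.start()
for t in stations:
    t.join()

print(inventory["cissp_guide"])  # 12 -- no lost update
```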

Capability Maturity Model for Software/ Software Capability Maturity Model (SW-CMM, CMM, or SCMM)

Contends that all organizations engaged in software development move through a variety of maturity phases in sequential fashion. This model describes the principles and practices underlying software process maturity. It is intended to help software organizations improve the maturity and quality of their software processes by implementing an evolutionary path from ad hoc, chaotic processes to mature, disciplined software processes. The idea behind this model is that the quality of the software depends on the quality of its development processes. This model does not explicitly address security, but it is the responsibility of cybersecurity professionals and software developers to ensure that security requirements are integrated into the software development effort.

SAMM divides the software development process into five business functions:

- Governance - The activities an organization undertakes to manage its software development process. This function includes practices for strategy, metrics, policy, compliance, education, and guidance.
- Design - The process used by the organization to define software requirements and create software. This function includes practices for threat modeling, threat assessment, security requirements, and security architecture.
- Implementation - The process of building and deploying software components and managing flaws in those components. This function includes the secure build, secure deployment, and defect management practices.
- Verification - The set of activities undertaken by the organization to confirm that code meets business and security requirements. This function includes the architecture assessment, requirements-driven testing, and security testing practices.
- Operations - The actions taken by an organization to maintain security throughout the software lifecycle after code is released. This function includes the incident management, environment management, and operational management practices.

Functional Requirement Determination

In this phase, specific system functionalities are listed, and developers begin to think about how the parts should interoperate to meet the functional requirements. The deliverable is a functional requirements document that lists specific system requirements. These requirements should be expressed in a form consumable by the software developers. The entire development team should constantly refer to this document during all phases to ensure that the project is on track. In the final stages of testing and evaluation, the project managers should use this document as a checklist to ensure that all functional requirements are met.

inference

Inference attacks involve combining several pieces of nonsensitive information to gain access to information that should be classified at a higher level. Inference makes use of the human mind's deductive capacity rather than the raw mathematical capabilities of modern database platforms. As with aggregation, the best defense against inference attacks is to maintain constant vigilance over the permissions granted to individual users. Intentional blurring of data may be used to prevent the inference of sensitive information, and database partitioning can also be used.

Testing

Initially, most organizations perform the initial system testing using development personnel to seek out any obvious errors. As the testing progresses, developers and actual users validate the system against predefined scenarios that model common and unusual user activities. In cases where the project is releasing updates to an existing system, regression testing formalizes the process of verifying that the new code performs in the same manner as the old code, other than any changes expected as part of the new release. These testing procedures should include both functional testing that verifies the software is working properly and security testing that verifies there are no unaddressed significant security issues. Once developers are satisfied that the code works properly, the process moves into user acceptance testing, where users verify that the code meets their requirements and formally accept it as ready to move into production use. Once this phase is complete, the code may move to deployment. As with any critical development process, it's important that you maintain a copy of the written test plan and test results for future review.

major criticism of Waterfall Model

It allows the developers to step back only one phase in the process. It does not make provisions for the discovery of errors at a later phase in the development cycle.

The stages of SW-CMM

Level 1: Initial - Usually little or no defined software development process.

Level 2: Repeatable - Reuse of code in an organized fashion begins to enter the picture, and repeatable results are expected from similar projects. SEI defines the key process areas for this level as Requirements Management, Software Project Planning, Software Project Tracking and Oversight, Software Subcontract Management, Software Quality Assurance, and Software Configuration Management.

Level 3: Defined - In this phase, software developers operate according to a set of formal, documented software development processes. All development projects take place within the constraints of the new standardized management model. SEI defines the key process areas for this level as Organization Process Focus, Organization Process Definition, Training Program, Integrated Software Management, Software Product Engineering, Intergroup Coordination, and Peer Reviews.

Level 4: Managed - In this phase, management of the software process proceeds to the next level. Quantitative measures are used to gain a detailed understanding of the development process. SEI defines the key process areas for this level as Quantitative Process Management and Software Quality Management.

Level 5: Optimizing - In the optimized organization, a process of continuous improvement occurs. Sophisticated software development processes are in place that ensure that feedback from one phase reaches the previous phase to improve future results. SEI defines the key process areas for this level as Defect Prevention, Technology Change Management, and Process Change Management.

What type of data to use during software testing?

Live workloads provide the best stress testing possible. However, you should not use live or actual field data for testing, especially in the early development stages, since a flaw or error could result in the violation of integrity or confidentiality of the data. Testing should involve the use of both use cases, which mirror the normal activity of a user, and misuse cases, which attempt to model the activity of an attacker. Including both of these approaches helps testers understand how the code will perform under normal activity (including normal errors) and when subjected to the extreme conditions imposed by an attacker.

Databases that fail to implement concurrency correctly may suffer from the following

- Lost updates: Occur when two different processes make updates to a database, unaware of each other's activity. For example, imagine an inventory database in a warehouse with different receiving stations. The warehouse might currently have 10 copies of the CISSP Study Guide in stock. If two different receiving stations each receive a copy of the CISSP Study Guide at the same time, they both might check the current inventory level, find that it is 10, increment it by 1, and update the table to read 11, when the actual value should be 12.
- Dirty reads: Occur when a process reads a record from a transaction that did not successfully commit. Returning to our warehouse example, if a receiving station begins to write new inventory records to the database but then crashes in the middle of the update, it may leave partially incorrect information in the database if the transaction is not completely rolled back.
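
The lost-update scenario above can be reproduced with two unsynchronized threads standing in for the receiving stations (the sleep artificially widens the read-then-write window):

```python
import threading
import time

inventory = {"cissp_guide": 10}

def receive_copy_unsafely():
    current = inventory["cissp_guide"]   # both stations read 10
    time.sleep(0.05)                     # the other station reads meanwhile
    inventory["cissp_guide"] = current + 1

stations = [threading.Thread(target=receive_copy_unsafely) for _ in range(2)]
for t in stations:
    t.start()
for t in stations:
    t.join()

print(inventory["cissp_guide"])  # 11 -- one update was lost; 12 is correct
```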

release control

Once the changes are finalized, they must be approved for release through the release control procedure. An essential step of the release control process is to double-check and ensure that any code inserted as a programming aid during the change process (such as debugging code and/or backdoors) is removed before releasing the new software to production. This process also ensures that only approved changes are made to production systems. Release control should also include acceptance testing to ensure that any alterations to end-user work tasks are understood and functional.

Change Control

Process used by developers to re-create the situation encountered by the user and to analyze the appropriate changes to remedy the situation. It also provides an organized framework within which multiple developers can create and test a solution prior to rolling it out into a production environment. This component includes conforming to quality control restrictions, developing tools for update or change deployment, properly documenting any coded changes, and restricting the effects of new code to minimize diminishment of security.

Fail-Secure

Puts the system into a high level of security (and possibly even disables it entirely) until an administrator can diagnose the problem and restore the system to normal operation. Software should revert to this condition. This may mean closing just the application or possibly stopping the operation of the entire host system. Once a fail-secure operation occurs, the programmer should consider the activities that occur afterward. The options are to remain in a fail-secure state or to automatically reboot the system. The former option requires that an administrator manually reboot the system and oversee the process. This action can be enforced by using a boot password. The latter option does not require human intervention for the system to restore itself to a functioning state, but it has its own unique issues. For example, it must restrict the system reboot into a nonprivileged state. In other words, the system should not reboot and perform an automatic logon; instead, it should prompt the user for authorized access credentials.

ACID model

Relational database transactions have four required characteristics: atomicity, consistency, isolation, and durability. Together, these attributes are known as the ACID model, which is a critical concept in the development of database management systems.

Database Transactions

Relational databases support the explicit and implicit use of transactions to ensure data integrity. Each transaction is a discrete set of SQL instructions that should either succeed or fail as a group; it's not possible for one part of a transaction to succeed while another part fails. When a transaction successfully finishes, it is said to be committed to the database and cannot be undone. Transaction committing may be explicit, using SQL's COMMIT command, or it can be implicit if the end of the transaction is successfully reached. If a transaction must be aborted, it can be rolled back explicitly using the ROLLBACK command or implicitly if there is a hardware or software failure. When a transaction is rolled back, the database restores itself to the condition it was in before the transaction began. Any relational database must be able to pass the ACID test: it must guarantee atomicity, consistency, isolation, and durability. A transaction begins with the first DML statement and terminates with a control statement. Explicit transaction control statements are:

- COMMIT (or a DDL/DCL statement)
- ROLLBACK (or exit, session end, or system crash)
- SAVEPOINT (sets markers, but will not end a transaction)
- SELECT FOR UPDATE

SQL

SQL's primary security feature is its granularity of authorization. This means that SQL allows you to set permissions at a very fine level of detail. You can limit user access by table, row, column, or even by individual cell. SQL is divided into two distinct components: the Data Definition Language (DDL), which allows for the creation and modification of the database's structure (known as the schema), and the Data Manipulation Language (DML), which allows users to interact with the data contained within that schema.
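
The DDL/DML split can be seen in a few statements run through sqlite3 (the customers table is invented for the example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# DDL: defines the schema (the structure of the database).
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")

# DML: interacts with the data stored within that schema.
conn.execute("INSERT INTO customers (name) VALUES ('Alice')")
conn.execute("UPDATE customers SET name = 'Alice Smith' WHERE id = 1")
rows = conn.execute("SELECT name FROM customers").fetchall()
print(rows)  # [('Alice Smith',)]
```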

Aggregation attack

SQL provides a number of functions that combine records from one or more tables to produce potentially useful information. This process is known as aggregation, and aggregation attacks are used to collect numerous low-level security items or low-value items and combine them to create something of a higher security level or value. A person or group may be able to collect multiple facts about or from a system and then use these facts to launch an attack. Combining defense-in-depth, need-to-know, and least privilege principles helps prevent access aggregation attacks.

Scrum

Takes its name from the daily team meetings, called scrums, that are its hallmark. Every day the team gets together for a short meeting where they discuss the contributions made by each team member, plan the next day's work, and work to clear any impediments to their progress. These meetings are led by the project's scrum master, an individual in a project management role who is responsible for helping the team move forward and meet their objectives.

IDEAL Model

The Software Engineering Institute also developed the IDEAL model for software development, which implements many of the SW-CMM attributes.

the best time to address software testing

The best time to address software testing is as the modules are designed. The mechanisms you use to test a product and the data sets you use to explore that product should be designed in parallel with the product itself. Your programming team should develop special test suites of data that exercise all paths of the software to the fullest extent possible and know the correct resulting outputs beforehand.

Distributed Database

The distributed data model has data stored in more than one database, but those databases are logically connected. The user perceives the database as a single entity, even though it consists of numerous parts interconnected over a network. Each field can have numerous children as well as numerous parents. Thus, the data mapping relationship for distributed databases is many-to-many.

error handling and possible implications

The in-depth information returned in error messages is crucial to debugging code and makes it easier for technical staff to diagnose problems experienced by users. However, those error messages may also expose sensitive internal information to attackers, including the structure of database tables, the addresses of internal servers, and other data that may be useful in reconnaissance efforts that precede an attack. Therefore, developers should disable detailed error messages on any servers and applications that are publicly accessible.

isolation

The isolation principle requires that transactions operate separately from each other. If a database receives two SQL transactions that modify the same data, one transaction must be completed in its entirety before the other transaction is allowed to modify the same data. This prevents one transaction from working with invalid data generated as an intermediate step by another transaction.

logging

While user-facing detailed error messages may present a security threat, the information that those messages contain is quite useful, not only to developers but also to cybersecurity analysts. Therefore, applications should be configured to send detailed logging of errors and other security events to a centralized log repository.

Separation of Duties while testing Software

You should assign the testing of your software to someone other than the programmer(s) who developed the code to avoid a conflict of interest and assure a more secure and functional finished product. When a third party tests your software, you have a greater likelihood of receiving an objective and nonbiased examination. The third-party test allows for a broader and more thorough test and prevents the bias and inclinations of the programmers from affecting the results of the tests.

class

a collection of the common methods from a set of objects that defines the behavior of those objects.

message

a communication to or input of an object

ODBC (Open Database Connectivity)

a database feature that allows applications to communicate with different types of databases without having to be directly programmed for interaction with each type. ODBC acts as a proxy between applications and back-end database drivers, giving application programmers greater freedom in creating solutions without having to worry about the back-end database system.

Gray-Box Testing

Combines the two approaches and is popular for software validation. In this approach, testers examine the software from a user perspective, analyzing inputs and outputs. They also have access to the source code and use it to help design their tests. They do not, however, analyze the inner workings of the program during their testing.

cohesion

Describes the strength of the relationship between the purposes of the methods within the same class. When all the methods have similar purposes, there is high cohesion, a desirable condition that promotes good software design principles. When the methods of a class have low cohesion, this is a sign that the system is not well designed.
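
A contrived contrast between high and low cohesion (both classes are invented for illustration):

```python
# High cohesion: every method serves the single purpose of the class.
class TemperatureConverter:
    def c_to_f(self, c):
        return c * 9 / 5 + 32

    def f_to_c(self, f):
        return (f - 32) * 5 / 9

# Low cohesion: unrelated purposes bundled into one class (poor design).
class Utility:
    def c_to_f(self, c):
        return c * 9 / 5 + 32

    def send_email(self, to, body):
        ...  # nothing to do with temperature conversion

    def parse_log_line(self, line):
        ...  # nor is this

print(TemperatureConverter().c_to_f(100))  # 212.0
```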

polyinstantiation

occurs when two or more rows in the same relational database table appear to have identical primary key elements but contain different data for use at differing classification levels. This mechanism is often used as a defense against some types of inference attacks, but it introduces additional storage costs to store copies of data designed for different clearance levels.

escaping input

Performed by replacing occurrences of sensitive characters with alternative code that will render the same to the end user but will not be executed by the system.
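
For HTML output, Python's standard library provides this directly; the sample input is a typical script-injection payload:

```python
import html

# User-supplied text that contains sensitive markup characters.
user_input = '<script>alert("xss")</script>'

# Escaping replaces sensitive characters with entity codes that render
# the same in a browser but will not be executed as markup.
safe = html.escape(user_input)
print(safe)  # &lt;script&gt;alert(&quot;xss&quot;)&lt;/script&gt;
```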

Key/value stores

Perhaps the simplest possible form of database. They store information in key/value pairs, where the key is essentially an index used to uniquely identify a record, which consists of a data value. Key/value stores are useful for high-speed applications and very large data sets where the rigid structure of the relational model would require significant, and perhaps unnecessary, overhead.
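
A Python dict captures the idea in miniature: a unique key indexes an opaque value, with no schema enforced on the value (the keys and records here are invented):

```python
# A key/value store: each key uniquely identifies one record,
# and the value carries no enforced structure.
store = {}
store["user:1001"] = '{"name": "Alice", "role": "admin"}'
store["session:ab12"] = "expires=1700000000"

print(store["user:1001"])  # fast lookup by key, no schema required
```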

Commercial off-the-shelf (COTS)

Software purchased to run on servers managed by the organization, either on premises or in an IaaS environment. Whenever a company acquires any type of software, be it COTS or OSS, run on-premises or in the cloud, that software should be tested for security vulnerabilities. Organizations may conduct their own testing, rely on the results of tests provided by vendors, and/or hire third parties to conduct independent testing.

limit check

A type of input validation in which the code checks to ensure that a number falls within an acceptable range.

sprints

Well-defined periods of time, typically between one and four weeks, during which the team focuses on achieving short-term objectives that contribute to the broader goals of the project. At the beginning of each sprint, the team gathers to plan the work that will be conducted during that sprint. At the end of the sprint, the team should have a fully functioning product that could be released, even if it does not yet meet all requirements. Each subsequent sprint introduces new functionality into the product.

The IDEAL model has five phases

1. Initiating - The business reasons behind the change are outlined, support is built for the initiative, and the appropriate infrastructure is put in place.
2. Diagnosing - Engineers analyze the current state of the organization and make general recommendations for change.
3. Establishing - The organization takes the general recommendations from the diagnosing phase and develops a specific plan of action that helps achieve those changes.
4. Acting - It's time to stop "talking the talk" and start "walking the walk." The organization develops solutions and then tests, refines, and implements them.
5. Learning - As with any quality improvement process, the organization must continuously analyze its efforts to determine whether it has achieved the desired goals and, when necessary, propose new actions to put the organization back on course.

relational database

A database that represents data as a collection of tables in which all data relationships are represented by common values in related tables. The main building block of the relational database is the table. Each table contains a number of attributes, or fields, and each attribute corresponds to a column in the table. In a customers table, for example, each customer would have their own record, or tuple, represented by a row in the table. Relationships between tables are defined to identify related records, and records are identified using a variety of keys.
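The table/attribute/tuple structure can be sketched with SQLite; the table and data below are invented:

```python
import sqlite3

# Sketch of the relational model: a table of attributes (columns),
# where each customer is a tuple (row) identified by a primary key.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE customers (
        company_id INTEGER PRIMARY KEY,  -- key that uniquely identifies each tuple
        name       TEXT,
        telephone  TEXT
    )
""")
conn.execute("INSERT INTO customers VALUES (1, 'Acme Widgets', '555-0100')")
conn.execute("INSERT INTO customers VALUES (2, 'Emma Hardware', '555-0199')")

for row in conn.execute("SELECT company_id, name FROM customers"):
    print(row)
```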

neural networks

Chains of computational units are used in an attempt to imitate the biological reasoning process of the human mind. In an expert system, a series of rules is stored in a knowledge base, whereas in a neural network, a long chain of computational decisions feeds into each other and eventually sums to produce the desired output. Typical neural networks involve many layers of summation, each of which requires weighting information to reflect the relative importance of the calculation in the overall decision-making process. The weights must be custom-tailored for each type of decision the neural network is expected to make. During the learning period, the network is provided with inputs for which the proper decision is known. The algorithm then works backward from these decisions to determine the proper weights for each node in the computational chain. This activity is performed using what is known as the Delta rule.
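A toy sketch of delta-rule learning for a single computational unit; the training data, learning rate, and target relationship (y = 2x) are all made up for illustration:

```python
# Delta rule sketch: the weight is nudged in proportion to the error
# between the known correct output and the unit's current output.
training_data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # inputs with known outputs (y = 2x)
weight, rate = 0.0, 0.05

for _ in range(200):                              # the learning period
    for x, target in training_data:
        output = weight * x                       # the unit's weighted sum
        weight += rate * (target - output) * x    # delta rule update

print(round(weight, 3))  # converges toward 2.0
```

Real networks repeat this update across many nodes and layers, but the error-driven weight adjustment is the same idea.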

Risks of using APIs

First, developers must consider authentication requirements. Some APIs, such as those that allow checking weather forecasts or product inventory, may be available to the general public and not require any authentication. API developers must know when to require authentication and ensure that they verify credentials and authorization for every API call. This authentication is typically done by providing authorized API users with a complex API key that is passed with each API call. APIs must also be tested thoroughly for security flaws.
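The API-key pattern can be sketched with Python's urllib; the endpoint, header name, and key below are hypothetical, since every real service defines its own:

```python
from urllib.request import Request

# Hypothetical endpoint and key: the common pattern is to pass a complex
# API key with every call, typically in a request header.
API_KEY = "hypothetical-key-3f9a1c"
req = Request(
    "https://api.example.com/v1/inventory",  # made-up URL, never contacted here
    headers={"X-API-Key": API_KEY},
)

# The server side would verify this credential, and the caller's
# authorization, before serving any data for this call.
print(req.get_header("X-api-key"))
```

Note that urllib normalizes header names (capitalize form), hence the "X-api-key" spelling when reading the header back.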

Three major characteristics of a functional requirement

Input(s) - the data provided to a function
Behavior - the business logic describing what actions the system should take in response to different inputs
Output(s) - the data provided from a function

feedback loop characteristic

The ability in the modern waterfall model that allows development to return to the previous phase to correct defects discovered during the subsequent phase.

Waterfall Model

This model was one of the first comprehensive attempts to model the software development process while taking into account the necessity of returning to previous phases to correct system faults. The original design was intended to be a sequence of steps from inception to conclusion; in practical application, the waterfall model of necessity evolved into the more modern model with a feedback loop. Its phases are:

System Requirements
Software Requirements
Preliminary Design
Detailed Design
Code and Debug
Testing
Operations and Maintenance

Normalization

The process of applying rules to a database design to ensure that information is divided into the appropriate tables. The first three normal forms (1NF, 2NF, and 3NF) each add requirements to reduce redundancy in the tables, eliminate misplaced data, and perform a number of other housekeeping tasks. The normal forms are cumulative - in other words, to be in 2NF, a table must first be 1NF compliant, and before making a table 3NF compliant, it must first be in 2NF.

Request Control

The request control process provides an organized framework within which users can request modifications, managers can conduct cost/benefit analysis, and developers can prioritize tasks.

hierarchical Databases

This model combines records and fields that are related in a logical tree structure, resulting in a one-to-many data model where each node may have zero, one, or many children but only one parent. Examples of this model include the NCAA March Madness bracket system and the hierarchical distribution of Domain Name System (DNS) records used on the internet. Hierarchical databases store data in this type of hierarchical fashion and are useful for specialized applications that fit the model. For example, biologists might use a hierarchical database to store data on specimens according to the kingdom/phylum/class/order/family/genus/species hierarchical model used in that field.

conceptual definition

This phase involves creating a simple statement, agreed on by all interested stakeholders, that states the purpose of the project as well as the general system requirements. It is a very high-level statement of purpose and should not be longer than one or two paragraphs; it works like an abstract or introduction that enables an outsider to gain a top-level understanding of the project in a short period of time. Security requirements at this phase are generally very high level and will be refined during the control specifications development phase. At this point in the process, designers commonly identify the classification(s) of data that will be processed by the system and the applicable handling requirements. It is helpful to refer to the concept statement at all phases of the systems development process, because the intricate details of the development process tend to obscure the overarching goal of the project. Simply reading the concept statement periodically can assist in refocusing a team of developers.

Foreign Key

This type of key is used to enforce relationships between two tables, also known as referential integrity. Referential integrity ensures that if one table contains a foreign key, it corresponds to a still-existing primary key in the other table in the relationship. It makes certain that no record/tuple/row contains a reference to a primary key of a nonexistent record/tuple/row.
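Referential integrity can be sketched with SQLite, which rejects a row whose foreign key points at a nonexistent primary key once enforcement is switched on; the schema and data are invented:

```python
import sqlite3

# SQLite enforces foreign keys only when the pragma below is enabled.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")
conn.execute("CREATE TABLE customers (company_id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("""
    CREATE TABLE orders (
        order_id   INTEGER PRIMARY KEY,
        company_id INTEGER REFERENCES customers(company_id)  -- foreign key
    )
""")
conn.execute("INSERT INTO customers VALUES (1, 'Acme Widgets')")
conn.execute("INSERT INTO orders VALUES (100, 1)")        # valid: customer 1 exists

try:
    conn.execute("INSERT INTO orders VALUES (101, 99)")   # no customer 99 exists
except sqlite3.IntegrityError as err:
    print("rejected:", err)
```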

Software Configuration Management

Used to control the version(s) of software used throughout an organization and to formally track and control changes to the software configuration.

Configuration Identification - During this process, administrators document the configuration of covered software products throughout the organization.
Configuration Control - This process ensures that changes to software versions are made in accordance with the change control and configuration management policies.
Configuration Status Accounting - Formalized procedures are used to keep track of all authorized changes that take place.
Configuration Audit - A periodic configuration audit should be conducted to ensure that the actual production environment is consistent with accounting records and that no unauthorized configuration changes have taken place.

restricting access with Views

Views are simply SQL statements that present data to the user as if the views were tables themselves. Views may be used to collate data from multiple tables, aggregate individual records, or restrict a user's access to a limited subset of database attributes and/or records. Views are stored in the database as SQL commands rather than as tables of data. This dramatically reduces the space requirements of the database and allows views to violate the rules of normalization that apply to tables. However, retrieving data from a complex view can take significantly longer than retrieving it from a table because the DBMS may need to perform calculations to determine the value of certain attributes for each record. Because views are so flexible, many database administrators use them as a security tool -- allowing users to interact with only limited views rather than with the raw tables of data underlying them.
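A sketch of a restrictive view in SQLite; the employee table and its sensitive salary column are invented:

```python
import sqlite3

# A view stored as a SQL statement that exposes only non-sensitive columns,
# so users can be granted the view instead of the raw table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (id INTEGER PRIMARY KEY, name TEXT, salary REAL)")
conn.execute("INSERT INTO employees VALUES (1, 'Alice', 95000)")
conn.execute("CREATE VIEW employee_directory AS SELECT id, name FROM employees")

# Querying the view yields no salary attribute at all.
row = conn.execute("SELECT * FROM employee_directory").fetchone()
print(row)  # (1, 'Alice')
```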

Program Evaluation and Review Technique (PERT)

a project-scheduling tool used to judge the size of a software product in development and calculate the standard deviation for risk assessment. PERT relates the estimated lowest possible size, the most likely size, and the highest possible size of each component. The PERT chart clearly shows the dependencies between tasks, making it easier to manage the time of team members and perform task scheduling. PERT is used to direct improvements to project management and software coding in order to produce more efficient software. As the capabilities of programming and management improve, the actual produced size of software should be smaller.
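The classic PERT three-point formulas, expected size (O + 4M + P) / 6 and standard deviation (P - O) / 6, can be sketched as follows; the component sizes are made-up figures:

```python
# PERT three-point estimate: combine the lowest possible (optimistic),
# most likely, and highest possible (pessimistic) size of a component
# into an expected size plus a standard deviation for risk assessment.
def pert(optimistic, most_likely, pessimistic):
    expected = (optimistic + 4 * most_likely + pessimistic) / 6
    std_dev = (pessimistic - optimistic) / 6
    return expected, std_dev

expected, std_dev = pert(2_000, 5_000, 14_000)  # e.g. lines of code
print(expected, std_dev)  # 6000.0 2000.0
```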

Application Programming Interface (API)

a set of routines, protocols, and tools for building software applications. APIs allow application developers to bypass traditional web pages and interact directly with the underlying service through function calls.

Gantt Chart

a type of bar chart that shows the interrelationships over time between projects and schedules. It provides a graphical illustration of a schedule that helps you plan, coordinate, and track specific tasks in a project. They are particularly useful when coordinating tasks that require the use of the same team members or other resources.

method

an internal code that defines the actions an object performs in response to a message

Software Assurance Maturity Model (SAMM)

an open source project maintained by OWASP. It seeks to provide a framework for integrating security activities into the software development and maintenance process and to offer organizations the ability to assess their maturity

curl

an open source tool available for major operating systems that allows users to directly access websites without the use of a browser. For this reason, curl is commonly used for API testing and also for potential API exploits by an attacker. Can be used to post requests to an API.

Alternate Key

any candidate key that is not selected as the primary key. For example, if the telephone number is unique to a customer, then Telephone could be considered a candidate key. Since Company ID was selected as the primary key, Telephone is an alternate key.

instance

objects are instances, or examples, of classes that contain their methods

document stores

similar to key/value stores in that they store information using keys, but the type of information they store is typically more complex than that in a key/value store and is in the form of a document. Common document types used in document stores include XML and JSON.
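A sketch of the document model using JSON; the keys and document contents are made up:

```python
import json

# Like a key/value store, records are looked up by key, but each value is a
# structured document (JSON here) rather than an opaque blob.
documents = {}
documents["user:1001"] = json.dumps({"name": "Alice", "roles": ["admin", "audit"]})

# Unlike a plain key/value store, fields inside the document can be read.
doc = json.loads(documents["user:1001"])
print(doc["roles"])  # ['admin', 'audit']
```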

graph databases

store data in graph format, using nodes to represent objects and edges to represent relationships. They are useful for representing any type of network, such as social networks, geographic locations, and other datasets that lend themselves to graph representations.
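The node/edge model can be sketched with an adjacency list; the tiny social network below is invented:

```python
# Nodes represent objects (people) and edges represent relationships
# (who follows whom), stored here as an adjacency list.
follows = {
    "alice": ["bob", "carol"],
    "bob":   ["carol"],
    "carol": [],
}

# Traversing edges answers relationship queries directly.
print("bob" in follows["alice"])   # True: an alice -> bob edge exists
```

A real graph database adds indexing, edge properties, and traversal queries on top of this basic structure.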

polymorphism

the characteristic of an object that allows it to respond with different behaviors to the same message or method because of changes in external conditions
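A minimal sketch of polymorphism; the classes and the make_sound message are made up:

```python
# The same message (make_sound) produces different behavior depending on
# the object that receives it.
class Dog:
    def make_sound(self):
        return "woof"

class Cat:
    def make_sound(self):
        return "meow"

for animal in (Dog(), Cat()):
    print(animal.make_sound())  # woof, then meow
```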

delegation

the forwarding of a request by an object to another object, or delegate. An object delegates if it does not have a method to handle the message
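A minimal sketch of delegation; the Manager and Printer classes are invented for illustration:

```python
# An object with no handling logic of its own forwards the request
# to a delegate object that can handle it.
class Printer:
    def print_report(self, text):
        return f"printed: {text}"

class Manager:
    def __init__(self):
        self.delegate = Printer()

    def print_report(self, text):
        # Manager cannot print itself, so it delegates the request.
        return self.delegate.print_report(text)

print(Manager().print_report("Q3 results"))  # printed: Q3 results
```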

Coupling

the level of interaction between objects. Lower coupling means less interaction, which provides better software design because objects are more independent and easier to troubleshoot and update. Objects that have low cohesion require lots of assistance from other objects to perform tasks and therefore have high coupling.

