CISSP - 2) Asset Security Domain

Data Documentation Practices

- Dataset titles and file names: should be descriptive, as these datasets may be accessed many years in the future by people who will be unaware of the details of the project or program.
- File contents: users must understand the contents of the dataset, including the parameter names, units of measure, formats, and definitions of coded values.
- Metadata: defined as data about data; provides information on the identification, quality, spatial context, data attributes, and distribution of datasets, using common terminology and sets of definitions that prevent loss of the original meaning and value of the resource.

SSDs

Differ from magnetic media in that flash memory cannot be overwritten in place. The flash translation layer controls how the system accesses the data; it can effectively "hide" data from data destruction software, leaving iterations of the data un-erased on different sections of the drive. SSD manufacturers include built-in sanitization commands that are designed to internally erase the data on the drive. Cryptographic erasure, or crypto-erase, takes advantage of the SSD's built-in data encryption: most SSDs encrypt data by default, and by erasing the encryption key, the data becomes unreadable. Ideally, a combination of all of these techniques is used: crypto-erase, sanitization, and targeted overwrite passes. SSDs require the careful attention of a committed data destruction expert who can tune these erasure techniques to effectively prevent data remanence.
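
The crypto-erase idea described above can be sketched in a few lines: the drive only ever stores ciphertext, so destroying the key renders every copy of the data unreadable, including blocks the flash translation layer has remapped. This is a toy illustration (a hash-based keystream standing in for the hardware AES engine of a real self-encrypting drive), not an actual sanitization tool.

```python
import hashlib
import secrets

def keystream(key: bytes, length: int) -> bytes:
    """Derive a keystream from the key (toy construction for illustration;
    real self-encrypting drives use AES in the controller hardware)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, data: bytes) -> bytes:
    """XOR with the keystream; the same call decrypts."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

# The media only ever holds ciphertext; the key lives in the controller.
media_key = secrets.token_bytes(32)
stored = encrypt(media_key, b"sensitive record")

# Crypto-erase: destroy the key, not the (possibly scattered) data.
media_key = None  # ciphertext remaining on the media is now unreadable
```

Because only the small key needs to be destroyed, crypto-erase is near-instant regardless of drive size, which is why it pairs well with the slower sanitization and overwrite passes.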

Cloud-based Data Remanence

Because a third party owns and operates the system, and the enterprise is effectively renting storage space, there is in many cases little to no visibility into the management and security of the data. One mitigation is to ensure that no unencrypted data ever gets written to any cloud provider's media; destroying the key would then effectively make the data unavailable.

Key areas of focus

By focusing on three distinct areas - media, hardware, and personnel - you can ensure retention is being addressed in a formal manner, aligned with the policies of the enterprise, and meant to ensure confidentiality, integrity, and availability (CIA).

Data Life Cycle Control

Good data management requires the whole life cycle of data to be managed.
- Data separation and modeling, processing, and database maintenance and security
- Ongoing data audit, to monitor the use and continued effectiveness of existing data
- Archiving, to ensure data is maintained effectively, including periodic snapshots to allow rolling back to previous versions in the event that primary copies and backups are corrupted

Record retention

Information and data should be kept only as long as it is required. A common mistake in records retention is finding the longest retention period and applying it to all types of information in an organization without analysis.

Data Publishing

Information publishing and access need to be addressed when implementing integrated information management solutions.

Asset Management

Inventory management is about capturing the basics of what assets are on hand, where they reside, and who owns them. Configuration management adds a relationship dynamic, such that you can associate each item with other items in the inventory. Classes and components, upstream and downstream, and parent/child relationships establish relationships between each configuration item (CI).
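The relationship dynamic that configuration management adds on top of inventory can be sketched as a small data structure. The attribute names here (`owner`, `location`, `upstream`, `downstream`) are illustrative, not taken from any particular CMDB product.

```python
from dataclasses import dataclass, field

@dataclass
class ConfigurationItem:
    """Minimal configuration item (CI) record: inventory basics plus
    relationship links to other CIs."""
    name: str
    owner: str
    location: str
    upstream: list = field(default_factory=list)    # CIs this item depends on
    downstream: list = field(default_factory=list)  # CIs that depend on this item

def link(provider: ConfigurationItem, consumer: ConfigurationItem) -> None:
    """Record an upstream/downstream dependency between two CIs."""
    provider.downstream.append(consumer)
    consumer.upstream.append(provider)

# Inventory answers "what and where"; the links answer "related how".
db = ConfigurationItem("orders-db", "dba-team", "dc-east")
app = ConfigurationItem("orders-app", "app-team", "dc-east")
link(db, app)  # the application is downstream of the database
```

Walking the `downstream` lists from a failed CI gives a quick impact analysis, which is exactly the value configuration management adds over a flat inventory.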

Data Modeling

Is the methodology that identifies the path to meet user requirements. Data modeling should be iterative and interactive.

Baseline Catalogues

May specify safeguards to be used in detail, or they may suggest a set of security requirements to be addressed with whatever safeguards are appropriate to the system under consideration. One of the objectives of the baseline approach is consistency of security safeguards throughout the enterprise. Baselines may come from:
- International and national standards organizations
- Industry sector standards or recommendations
- Some other company, preferably with similar business objectives and of comparable size
An enterprise may also generate its own baseline, established commensurate with its typical environment and its business objectives:
- Only a minimum amount of resources is needed for risk analysis and management for each safeguard implementation, and thus less time is spent on selecting security safeguards
- Baseline safeguards may offer a cost-effective solution

Improving Data Quality

Principles of data quality need to be applied at all stages of the data management process (capture, digitization, storage, analysis, presentation, and use). There are two keys: prevention and correction. Data documentation is critical for ensuring that datasets are usable well into the future; data longevity is roughly proportional to the comprehensiveness of its documentation. Documentation serves to:
- Ensure the longevity of data and their re-use for multiple purposes
- Ensure that data users understand the content, context, and limitations of datasets
- Facilitate the discovery of datasets
- Facilitate the interoperability of datasets and data exchange

Hessisches Datenschutzgesetz 9/30/70

Its purpose was to protect all digitized material of public agencies, within their responsibilities, against disclosure, misuse, alteration, or deletion by civil servants. It also called for a data protection officer.

Data Processes

Quality as applied to data has been defined as fitness for use or potential use.

Safe Harbor Program

The EU prohibits the transfer of personal data to non-EU countries that do not meet the EU "adequacy" standard for privacy protection. The US takes a different approach to privacy. Safe Harbor provides a bridge between the US and EU approaches.

Determine and maintain ownership

The individual who owns the data should decide the classification under which the data falls. It should be reviewed annually.

Information / Data Owner

The information owner typically has the following responsibilities: - Determine the impact the information has on the mission of the organization - Understand the replacement cost of the information (if it can be replaced) - Determine who in the organization or outside of it has a need for the information and under what circumstances it should be released - Know when the information is inaccurate or no longer needed and should be destroyed. Data owners generally have legal rights over the data, along with copyright and intellectual property rights.

Data Owners

The objectives of delineating data management roles and responsibilities are to: - Clearly define roles associated with functions - Establish data ownership throughout all phases of a project - Instill data accountability - Ensure that adequate, agreed upon data quality and metadata metrics are maintained on a continuous basis. Information has a life that consists of creation, use and finally destruction.

Categorization

the process of determining the impact on an organization of the loss of confidentiality, integrity, or availability of its information.
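Under FIPS 199 (referenced later in this domain), the overall security category of an information type is the high-water mark of the three impact ratings. A minimal sketch, assuming the standard low/moderate/high scale:

```python
# Map FIPS 199 impact levels to an ordering so max() picks the high-water mark.
LEVELS = {"low": 1, "moderate": 2, "high": 3}

def security_category(confidentiality: str, integrity: str, availability: str) -> str:
    """Overall category is the highest of the three impact ratings
    (the FIPS 199 'high-water mark')."""
    return max((confidentiality, integrity, availability), key=LEVELS.__getitem__)
```

So information rated low for confidentiality and availability but moderate for integrity is categorized moderate overall.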

Improving Critical Infrastructure Cybersecurity Framework

Is a risk-based approach to managing cybersecurity risk that helps organizations:
- Describe their current cybersecurity posture
- Describe their target state for cybersecurity
- Identify and prioritize opportunities for improvement within the context of a continuous and repeatable process
- Assess progress toward the target state
- Communicate among internal and external stakeholders about cybersecurity risk
It has three parts:
- Framework Core: a set of cybersecurity activities, desired outcomes, and applicable references that are common across critical infrastructure sectors: "Identify, Protect, Detect, Respond, Recover"
- Framework Implementation Tiers: reflect a progression from informal, reactive responses to approaches that are agile and risk-informed
- Framework Profiles: represent the outcomes, based on business needs, that an organization has selected from the Framework categories and subcategories

Current list of Critical Security Controls - v5

- Inventory of Authorized and Unauthorized Devices
- Inventory of Authorized and Unauthorized Software
- Secure Configurations for Hardware and Software on Mobile Devices, Laptops, Workstations, and Servers
- Continuous Vulnerability Assessment and Remediation
- Malware Defenses
- Application Software Security
- Wireless Access Control
- Data Recovery Capability
- Security Skills Assessment and Appropriate Training to Fill Gaps
- Secure Configurations for Network Devices such as Firewalls, Routers, and Switches
- Limitation and Control of Network Ports, Protocols, and Services
- Controlled Use of Administrative Privileges
- Boundary Defense
- Maintenance, Monitoring, and Analysis of Audit Logs
- Controlled Access Based on Need to Know
- Account Monitoring and Control
- Data Protection
- Incident Response and Management
- Secure Network Engineering
- Penetration Tests and Red Team Exercises

Data in Transit

- Modern cryptography provides secure and confidential methods to transmit data and allows the verification of the integrity of the message, so that any changes to the message itself can be detected. - Advances in Quantum Cryptography theorize the detection of whether a message has ever been read in transit.

Data Classification Policy

- Who will have access to the data: define the roles of people who can access the data.
- How the data is secured: determine whether the data is generally available or, by default, off limits.
- How long the data is to be retained: if regulatory requirements do not exist, base the retention period on the needs of the business.
- What method(s) should be used to dispose of the data.
- Whether the data needs to be encrypted: data owners will have to decide whether their data needs to be encrypted.
- Appropriate use of the data: this aspect of the policy defines whether data is for use within the company, is restricted for use by only selected roles, or can be made public to anyone outside the organization.
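The policy attributes above (access roles, retention, encryption, disposal) can be captured in a simple table, which makes checks mechanical. The classification names, role names, and values below are hypothetical examples, not a standard.

```python
# Hypothetical policy table keyed by classification level.
POLICY = {
    "private":    {"roles": {"hr", "finance"}, "retention_years": 7,
                   "encrypt": True,  "disposal": "crypto-erase"},
    "restricted": {"roles": {"engineering"},   "retention_years": 5,
                   "encrypt": True,  "disposal": "overwrite"},
    "public":     {"roles": {"anyone"},        "retention_years": 1,
                   "encrypt": False, "disposal": "delete"},
}

def may_access(classification: str, role: str) -> bool:
    """Answer the first policy question: who may access data at this level?"""
    roles = POLICY[classification]["roles"]
    return "anyone" in roles or role in roles
```

Keeping the policy as data rather than scattered if-statements means the data owner can change retention or encryption decisions without touching enforcement code.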

Organization for Economic Cooperation and Development's 1980 Guidelines Governing the Protection of Privacy and Transborder Data Flows of Personal Data.

All require:
- Data to be obtained fairly and lawfully
- Used only for the original specified purpose
- Adequate, relevant, and not excessive to purpose
- Accurate and up to date
- Accessible to the subject
- Kept secure
- Destroyed after its purpose is completed
The EU strengthened data protection; the goals were to protect individuals' data in all policy areas, including law enforcement, while reducing red tape for business and guaranteeing the free circulation of data within the EU:
- Strengthening individuals' rights so that the collection and use of personal data is limited to the minimum necessary
- Enhancing the Single Market dimension by reducing the administrative burden on companies and ensuring a true level playing field
- Revising data protection rules in the area of police and criminal justice so that individuals' personal data is also protected in these areas
- Ensuring high levels of protection for data transferred outside the EU by improving and streamlining procedures for international data transfers
- More effective enforcement of the rules, by strengthening and further harmonizing the role and powers of Data Protection Authorities

EU Court Ruling

Anyone - people living in Europe and potentially those living outside the region - may ask search engines to remove links to online information if they believe the links breach their right to privacy.
- This applies even if the physical server is located outside Europe.
- Search engines are controllers of personal data.
- Right to be forgotten: this applies where the information is inaccurate, inadequate, irrelevant, or excessive for the purposes of the data processing.

Data Custodians

Are established to ensure important datasets are developed, maintained, and accessible within their defined specifications. Responsibilities include:
- Adherence to appropriate and relevant data policy and data ownership guidelines
- Ensuring accessibility to appropriate users, maintaining appropriate levels of dataset security
- Fundamental dataset maintenance, including but not limited to data storage and archiving
- Dataset documentation, including updates to documentation
- Assurance of quality and validation of any additions to a dataset, including periodic audits to assure ongoing data integrity

Classification and Categorization Systems

Examples include:
- Canada's "Security of Information Act"
- China's "Law on Guarding State Secrets"
- The United Kingdom's "Official Secrets Acts"
- NIST Federal Information Processing Standard (FIPS) 199 and SP 800-60, "Guide for Mapping Types of Information and Information Systems to Security Categories"
These are used to help standardize the defense baselines for information systems and the level of suitability and trust an employee may need to access information.

Standards Selection

Center for Strategic & International Studies (CSIS) 20 Critical Security Controls initiative. Its guiding principles:
- Offense informs defense: use knowledge of actual attacks that have compromised systems to provide the foundation to build effective, practical defenses.
- Prioritization: invest first in controls that will provide the greatest risk reduction and protection against the most dangerous threat actors and that can be feasibly implemented in your computing environment.
- Metrics: establish common metrics to provide a shared language for executives, IT specialists, auditors, and security officials to measure the effectiveness of security measures within an organization so that required adjustments can be identified and implemented quickly.
- Continuous Monitoring: carry out continuous monitoring to test and validate the effectiveness of current security measures.
- Automation: automate defenses so that organizations can achieve reliable, scalable, and continuous measurements of their adherence to the controls and related metrics.

Link Encryption

Data are encrypted on a network using either link or end-to-end encryption. Link encryption encrypts all of the data along a communications path.
- The data packet is decrypted and re-encrypted at each point in the communications channel (to enable routing).
- Because it also encrypts routing information, it provides better traffic confidentiality than end-to-end encryption.
End-to-end encryption is generally performed by the end user within an organization. Data remains encrypted when passing through the network, but routing information remains visible. Recommendations:
- Covered data must be encrypted when transmitted across any network to protect against eavesdropping of network traffic by unauthorized users.
- Email is not considered secure and must not be used to transmit covered data unless additional email encryption tools are used.
- Where the covered device is reachable via the web, data must be transmitted over TLS, using only strong security protocols such as TLS v1.1 or v1.2.
- Covered data transmitted over email must be secured using cryptographically strong email encryption tools such as PGP or S/MIME.
- Non-web covered data should be encrypted via application-level encryption.
- All connections between an application and the database should be encrypted using FIPS-compliant cryptographic algorithms.
- Where application-level encryption is not available, implement network-level encryption such as IPSec or SSH tunneling.
- Encryption should be applied when transmitting covered data between devices in protected subnets with strong firewall controls.
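The "strong protocols only" recommendation can be enforced in code. A minimal sketch using Python's standard `ssl` module, which pins the protocol floor at TLS 1.2 (modern guidance treats SSLv3 and TLS 1.0/1.1 as insecure):

```python
import ssl

# Client-side TLS context that refuses legacy protocol versions.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # reject SSLv3 / TLS 1.0 / 1.1

# create_default_context() also enables certificate and hostname
# verification by default, protecting against man-in-the-middle attacks.
```

A socket wrapped with this context (`ctx.wrap_socket(sock, server_hostname=host)`) will fail the handshake against any server that only offers the older protocols, rather than silently downgrading.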

Cryptography

Data at rest: protection of stored data is often a key requirement for an organization's sensitive information. This is done through the use of cryptographic algorithms that limit access to the data to those who hold the proper encryption keys. Data-at-rest recommendations:
- Develop and test an appropriate data recovery plan.
- Use compliant encryption algorithms and tools. Whenever possible, use AES as the encryption algorithm because of its strength and speed.
- When creating a password, follow strong password requirements, and do not reuse passwords from other systems. Passwords should contain 9 characters or more, drawn from two of the three classes: alphabetic, numeric, and punctuation/other characters.
- Use a secure password management tool to store sensitive information such as passwords and recovery keys.
- Send passwords separately from encrypted files.
- Do not write down the password and store it in the same location as the storage media.

Equipment Life Cycle

Define Requirements:
- Ensure relevant security requirements are included in any specification for new equipment.
- Ensure appropriate costs have been allocated for the security features required.
- Ensure new equipment requirements fit into the organizational security architecture.
Acquire and Implement:
- Validate that security features are included as specified.
- Ensure additional security configurations, software, and features are applied to the equipment.
- Ensure the equipment is followed through any security certification or accreditation process as required.
- Ensure the equipment is inventoried.
Operations and Maintenance:
- Ensure the security features and configuration remain operational.
- Review the equipment for vulnerabilities and mitigate any that are discovered.
- Ensure appropriate support is available for security-related concerns.
- Validate and verify inventories to ensure equipment is in place as intended.
- Ensure changes to the configuration of the system are reviewed through a security impact analysis and vulnerabilities are mitigated.
Disposal and Decommission:
- Ensure equipment is securely erased and then either destroyed or recycled depending on the security requirements of the organization.
- Ensure inventories are accurately updated to reflect the status of decommissioned equipment.

Data Standards

Describe objects, features, or items that are collected, automated, or affected by activities or the functions of organizations.
Benefits:
- More efficient data management (including updates and security)
- Increased data sharing
- Higher quality data
- Improved data consistency
- Increased data integration
- Better understanding of data
- Improved documentation of information resources
Considerations: international, national, regional, and local standards.

Picking Encryption Algorithms

For the same encryption algorithm, a longer encryption key generally provides stronger protection. Long, complex passphrases are stronger than shorter passphrases and should:
- contain 15 characters or more
- contain characters from two of the three categories: alphabetic, numeric, and punctuation/other characters
Strong encryption generally consumes more CPU resources than weak encryption.
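The passphrase rules above translate directly into a checker. A minimal sketch of the two stated criteria (length and character classes):

```python
def strong_passphrase(p: str) -> bool:
    """Check the passphrase rules quoted above: 15+ characters and
    characters from at least 2 of the 3 classes."""
    classes = [
        any(c.isalpha() for c in p),      # alphabetic
        any(c.isdigit() for c in p),      # numeric
        any(not c.isalnum() for c in p),  # punctuation and other characters
    ]
    return len(p) >= 15 and sum(classes) >= 2
```

Note that a long all-letter passphrase fails the class rule until a space or digit is added, and a short symbol-heavy password fails on length; both conditions must hold.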

Classify information and supporting assets

The general guideline is that the definition of each classification should be clear enough that it is easy to determine how to classify the data.
- Private: data such as SSNs, bank accounts, or credit card information
- Company Restricted: data that is restricted to a subset of employees
- Company Confidential: data that can be viewed by all employees but is not for general use
- Public: data that can be viewed or used by employees or the general public
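A clear definition like "Private means SSNs, bank accounts, or card numbers" lends itself to automated first-pass classification. This is a hypothetical detector: the regex patterns and level names are illustrative, and a real deployment would need far more robust matching and human review.

```python
import re

# Illustrative patterns for the "Private" examples given above.
SSN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")          # e.g., 123-45-6789
CARD = re.compile(r"\b(?:\d[ -]?){13,16}\b")        # rough card-number shape

def suggest_classification(text: str) -> str:
    """Suggest a classification for review; default to 'public' only
    when no sensitive patterns are found."""
    if SSN.search(text) or CARD.search(text):
        return "private"
    return "public"
```

The point is that a crisp classification definition can be encoded and applied consistently, whereas a vague one ("sensitive stuff") cannot.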

Data Remanence

Is the residual physical representation of data that has in some way been erased.
- Remanence on HDDs is caused by the failure of the methods/mechanisms used to "clean" the HDD when it is time to remove the current data.
- Clearing is the removal of sensitive data from storage devices in such a way that there is assurance the data may not be reconstructed using normal system functions or software file/data recovery utilities.
- Purging or sanitizing is the removal of sensitive data from a system or storage device with the intent that the data cannot be reconstructed by any known technique.
- Destruction: the media is made unusable for conventional equipment. Destruction using appropriate techniques is the most secure method of preventing retrieval.
Methods:
- Overwriting: a common method used to counter data remanence is to overwrite the storage media with new data, also known as wiping or shredding the file or disk. One challenge with overwrites is that some areas of the disk may be inaccessible due to media degradation or other errors.
- Degaussing: may be accomplished in two ways. In AC erasure, the medium is degaussed by applying an alternating field that is reduced in amplitude over time from an initial high value. In DC erasure, the medium is saturated by applying a unidirectional field. Degaussing renders the media unusable and requires a low-level format to become usable again; with modern drives this is not possible without manufacturer-specific, and often model-specific, service equipment.
- Encryption: encrypting data before it is stored on the media can mitigate concerns about data remanence. If the key is stored on the media, it can be overwritten very quickly; this is called crypto-erase.
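The overwriting method can be sketched as a multi-pass file wipe. This is an illustrative sketch only: as noted above, overwriting cannot reach remapped or degraded areas, and on SSDs the flash translation layer may leave old copies untouched, so it is no substitute for purging or destruction.

```python
import os
import secrets

def overwrite_file(path: str, passes: int = 3) -> None:
    """Overwrite a file's contents with random data before unlinking it.
    Effective only where the filesystem rewrites blocks in place; on
    SSDs the flash translation layer may remap blocks elsewhere."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(secrets.token_bytes(size))  # one overwrite pass
            f.flush()
            os.fsync(f.fileno())  # push the pass to the device
    os.remove(path)
```

The `fsync` after each pass matters: without it, several "passes" may collapse into one in the OS page cache and never reach the media separately.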

Handling media

Only designated personnel should have access to sensitive media.

Software Licensing

Original copies of licensed software must be controlled by the organization to prevent copyright infringement.

Marking Media

Policies need to be created for marking media.
- Storage media should have a physical label identifying the sensitivity of the information contained.
- The label should clearly indicate whether the media is encrypted.
- The label may also contain information regarding a point of contact and a retention period.
- When media is discovered without a label, it should immediately be classified at the highest level of sensitivity until it can be reviewed and relabeled if necessary.

Quality Control and Quality Assurance

Quality Control is an assessment of quality based on internal standards, processes, and procedures established to control and monitor quality. Quality Assurance is an assessment of quality based on standards external to the process and involves a review of the activities and quality control processes to ensure final products meet predetermined standards of quality. QC/QA mechanisms are designed to prevent data contamination, which occurs when a process or event introduces either of two fundamental types of errors into a dataset:
- Errors of commission include those caused by data entry or transcription, or by malfunctioning equipment.
- Errors of omission often include insufficient documentation of legitimate data values, which could affect the interpretation of those values.
Data quality is assessed by applying verification and validation procedures as part of the quality control process. The US Environmental Protection Agency defines data verification as the process of evaluating the completeness, correctness, and compliance of a dataset with required procedures to ensure that the data is what it purports to be.
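The two error types can be checked mechanically as part of data verification. A minimal sketch, assuming a record is a dict; the field names (`id`, `value`, `units`) and the 0-100 valid range are illustrative assumptions, not from any standard.

```python
def verify_record(record: dict, required: tuple = ("id", "value", "units")) -> list:
    """Return a list of error strings: errors of omission (missing or
    empty fields) and errors of commission (out-of-range values)."""
    errors = []
    for field in required:
        if field not in record or record[field] in (None, ""):
            errors.append(f"omission: missing {field}")
    value = record.get("value")
    if isinstance(value, (int, float)) and not (0 <= value <= 100):
        errors.append("commission: value out of range 0-100")
    return errors
```

Running such checks at data capture implements the "prevention" key from Improving Data Quality; running them on stored datasets implements "correction".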

Compliant Encryption Tools

- Self-Encrypting USB Drives: USB drives that embed the encryption algorithm within the drive, eliminating the need to install any encryption software.
- Media Encryption Software: software used to encrypt otherwise unprotected storage media (e.g., CD, DVD, USB, or laptop HDD).
- File Encryption Software: allows greater flexibility in applying encryption to specific files.

Data Access, Sharing and Dissemination

Should be readily accessible to those who need them or those who are given permission to access them.

Media

Storing sensitive information requires physical and logical controls. Media lacks the means for digital accountability when the data is not encrypted. - physical and logical controls such as marking, handling, storing, and declassification provide methods for the secure handling of sensitive media.

Database Management

Technological obsolescence is a significant cause of information loss, and data can quickly become inaccessible to users if stored in out-of-date software formats or on outmoded media. Database management requires an ongoing data audit to monitor the use and continued effectiveness of existing data:
- Identifying the information needs of an organization/program and assigning a level of strategic importance to those needs
- Identifying the resources and services currently provided to meet those needs
- Mapping information flows within an organization
- Analyzing gaps, duplications, inefficiencies, and areas of over-provision to identify where changes are necessary

Baselines

The 1st layer of a defense-in-depth approach is the enforcement of the fundamental elements of network security. These elements form a security baseline, creating a strong foundation on which more advanced methods and techniques can subsequently be built. Questions to consider:
- Which parts of the enterprise or systems can be protected by the same baseline?
- Should the same baseline be applied throughout the whole enterprise?
- At what security level should the baseline aim?
- How will the controls forming the baseline(s) be determined?
The objective of baseline protection is to establish a minimum set of safeguards to protect all or some of the IT systems of the enterprise.

Classification

The purpose of a classification system is to ensure information is marked in such a way that only those with an appropriate level of clearance can have access to the information.

Scoping and Tailoring

The use of scoping and tailoring to properly narrow the focus of the architecture will ensure that the appropriate risks are identified and addressed. Supplementation will allow the architecture to stay flexible over time and grow to address the needs of the enterprise that arise during operation of the architecture, once it is implemented fully.

Media Destruction

Thorough destruction of the underlying storage media is the most certain way to counter data remanence. Methods include:
- Physically breaking the media apart
- Chemically altering the media into a non-readable, non-reverse-constructible state
- Phase transition (i.e., liquefaction or vaporization of a solid disk)
- For magnetic media, raising its temperature above the Curie temperature

Establishing information governance and DB archiving

Understand where the data exists; classify and define data; archive and manage data. (See pg 148 for a reference governance sample.)
- Promote cross-functional ownership: promoting cross-functional ownership of archiving, retention, and disposal policies provides a great indicator of project success because all groups have a vested interest in a positive outcome.
- Plan and practice data retention and orderly disposal: after all stakeholders have signed off on archiving and data retention policies, IT can develop a plan to implement those policies.

Storing media

Whenever possible, backup media should be encrypted and stored in a security container, such as a safe or strongbox with limited access.
- Store off-site for disaster recovery purposes.
- When on site, store in a fire-resistant box.
- Job rotation and separation of duties should be implemented when it is cost-effective to do so.

Data Storage and Archiving

addresses those aspects of data management related to the housing of data:
- Server hardware and software
- Network infrastructure
- Database maintenance and updating
- Database backup and recovery requirements
- Archiving of data: should be a priority data management issue
- Snapshots of data should be maintained so that rollback is possible in the event of corruption of the primary copy and backups of that copy

Data Policy

defines strategic long-term goals for data management across all aspects of a project or enterprise.
- Is a set of high-level principles that establish a guiding framework for data management
- Can be used to address strategic issues such as data access, relevant legal matters, data stewardship issues and custodial duties, data acquisition, and other issues
- Should be flexible and dynamic because it provides a high-level framework
Considerations:
- Cost of providing data versus the cost of providing access to the data
- Ownership and custodianship: ownership should be clearly addressed
- Privacy: clarification of what data is private and what data is to be made available in the public domain
- Liability: involves how protected an organization is from legal recourse; it is often dealt with via end-user agreements and licenses
- Existing law and policy requirements
- Policy and process: consideration should be given to legal requests for data and policies that may need to be put in place

Data Classification

entails analyzing the data that the organization retains, determining its importance and value, and then assigning it to a category.

IT Asset Management

is a broader discipline, adding several dimensions of management, including cost, value, and contractual status. It refers to the full life cycle management of IT assets, from acquisition through disposition, which together account for a comprehensive expected state. ITAM is designed to manage the physical, contractual, and financial aspects of those assets.

CMDB

is a logical entity with key integration points, and it supports and enables processes in service delivery, service support, IT asset management and other IT disciplines. - A single centralized and relational repository - implicit in configuration and IT asset management are the relational attributes of assets to components, contracts, operational status, financial impact and upstream/downstream relationships. - Organizational alignment and defined processes. - Scalable technologies and infrastructure

Security Content Automation Protocol (SCAP)

is a suite of specifications that standardize the format and nomenclature by which software flaw and security configuration information is communicated, both to machines and humans. It is a multi-purpose framework of specifications that supports automated configuration, vulnerability and patch checking, technical control compliance activities, and security measurement.
- Languages: Extensible Configuration Checklist Description Format (XCCDF), Open Vulnerability and Assessment Language (OVAL), and Open Checklist Interactive Language (OCIL)
- Reporting Formats: provide the necessary constructs to express collected information in standardized formats: Asset Reporting Format (ARF) and Asset Identification
- Enumerations: define a standard nomenclature (naming format) and an official dictionary or list of items expressed using that nomenclature
- Measurement and Scoring Systems: evaluating specific characteristics of a security weakness and, based on those characteristics, generating a score that reflects their relative severity
- Integrity: the integrity specification helps to preserve the integrity of SCAP content and results
SCAP utilizes software flaw and security configuration standard reference data. The U.S. Federal Government, in cooperation with academia and private industry, is adopting SCAP and encourages its use in support of security automation activities and initiatives.

Destroying media

Media that is no longer needed or is defective should be destroyed rather than simply disposed of. A record of the destruction should be kept that corresponds to any logs used for handling media. Implement object reuse controls for any media in question when the sensitivity is unknown.

Conceptual Design

phase of the database life cycle should produce an informational/data model. - consists of written documentation of concepts to be stored in the database, their relationships to each other, and a diagram showing those concepts and their relationships.

