Cloud Data Security Domain A


Which of the following areas is NOT part of the CCM framework and represented as a domain? A. Mobile security B. Human resources C. Governance D. Financial audit

Financial audit Explanation: The Cloud Controls Matrix (CCM) serves as a framework for cloud security and covers the areas pertinent to it. Although the matrix contains main domains and areas that are directly applicable to data security and cloud data systems, financial audit is not included as one of them.

Which of the following pieces of data about an individual would be considered a direct identifier? A. Job title B. Educational history C. Income D. Phone number

Phone number Explanation: A phone number is considered a direct identifier because it is unique to a residence or individual. With a phone number as a sole piece of information, a specific individual can be readily and quickly identified, meeting the precise definition of a direct identifier.

Although content analysis is the least efficient and slowest of the available data discovery methods, which of the following aspects of the data makes discovery the most challenging? A. Size B. Throughput C. Quality D. Source

Quality Explanation: With any content analysis, the quality of the data poses the biggest challenge to data discovery. Content can be freeform and may come in a variety of formats, with little to no consistency. Variations in spelling and word use can also make content analysis very challenging because there is no standardization to build concrete discovery rules against.

Which of the following will always serve as the starting point for the minimum period of data retention? A. Contract B. Regulation C. System resources D. Company policy

Regulation Explanation: Regulation will dictate, based on the type and content of data, the minimum period of data retention, as well as what events must be captured for retention. Regulation is specific to the type of data and the jurisdiction where it is located or consumed.

Encryption solutions can be embedded within database operations that will serve to protect data in a manner that is not noticeable to the user. What kind of encryption strategy is this? A. Transparent B. Passive C. RSA D. Homomorphic

Transparent Explanation: Transparent encryption is used within a database to protect data as it is being stored and processed, but it's done as an integrated database function and is not something that needs input from the application. The application doesn't even need to be aware it is being done. Transparent encryption is an effective tool because applications do not need to be rewritten or modified to handle encryption activities.
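The idea that the application never sees the encryption can be sketched with a toy storage layer. This is purely illustrative: the class name is invented, and base64 stands in for a real cipher (it is an encoding, not encryption) so the example stays self-contained.

```python
import base64


class TransparentStore:
    """Toy storage layer that encrypts on write and decrypts on read.

    base64 is a stand-in for a real cipher (e.g. AES) purely for
    illustration -- callers never handle ciphertext themselves.
    """

    def __init__(self):
        self._rows = {}  # what actually lands in "storage"

    def write(self, key, plaintext):
        # Encryption happens inside the storage layer...
        self._rows[key] = base64.b64encode(plaintext.encode())

    def read(self, key):
        # ...and so does decryption, so callers only ever see plaintext.
        return base64.b64decode(self._rows[key]).decode()


store = TransparentStore()
store.write("ssn", "123-45-6789")
store.read("ssn")       # the caller gets plaintext back, unaware of the cipher
store._rows["ssn"]      # what was actually stored is not the plaintext
```

The point of the sketch is that `write` and `read` have the same signatures they would have without encryption, which is why applications need no modification.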

Which type of masking would be appropriate for the creation of data sets for testing purposes, where the same structure and size are of importance? A. Dynamic B. Structured C. Tokenized D. Static

Static Explanation: Static masking is used to produce full data sets, with sensitive data removed, often for the purposes of testing or development, where the structure and size will need to resemble production data, just without the actual sensitive values.
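A minimal sketch of static masking in Python (the function and field names are invented for illustration): each sensitive value is replaced by a synthetic one of the same length and character class, so the masked set keeps the structure and size of production data.

```python
import random
import string


def static_mask(records, sensitive_fields):
    """Return a full copy of the data set with sensitive values replaced
    by synthetic ones, preserving length and punctuation so the result
    still resembles production data for test/dev use."""
    masked = []
    for row in records:
        out = dict(row)
        for field in sensitive_fields:
            out[field] = "".join(
                random.choice(string.digits) if ch.isdigit()
                else random.choice(string.ascii_uppercase) if ch.isalpha()
                else ch                      # keep hyphens, spaces, etc.
                for ch in str(row[field])
            )
        masked.append(out)
    return masked


prod = [{"name": "Alice", "ssn": "123-45-6789"}]
test_set = static_mask(prod, ["name", "ssn"])
# test_set has the same number of rows, the same keys, and an "ssn"
# with the same 3-2-4 layout, but the real values are gone.
```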

Which of the following data protection methodologies maintains the ability to connect back values to the original values? A. Tokenization B. Anonymization C. Obfuscation D. Dynamic mapping

Tokenization Explanation: Tokenization replaces data fields with opaque handlers or descriptors. In the process, the original and sensitive information is removed from the data, and the token values can be mapped back to the original values if needed.
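The defining feature of tokenization, the ability to map tokens back, can be sketched with a toy token vault (the class and token format here are invented for illustration):

```python
import secrets


class TokenVault:
    """Toy token vault: hands out opaque tokens in place of sensitive
    values and keeps the mapping so tokens can be resolved back."""

    def __init__(self):
        self._vault = {}

    def tokenize(self, value):
        token = "tok_" + secrets.token_hex(8)  # opaque, carries no data
        self._vault[token] = value
        return token

    def detokenize(self, token):
        # Unlike anonymization, the original value is recoverable.
        return self._vault[token]


vault = TokenVault()
t = vault.tokenize("4111 1111 1111 1111")
# t reveals nothing about the card number, but the vault can map it back:
vault.detokenize(t)
```

Contrast this with anonymization, where the mapping is deliberately destroyed and no such reverse lookup exists.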

When a DLP solution is used to protect data in transit, where is the optimal place to deploy the DLP components? A. On the server originating the traffic B. At the network perimeter C. Between VLANs D. On the server receiving the data

At the network perimeter Explanation: When protecting data in transit, a DLP solution would optimally be deployed at the network perimeter. Within a network, extensive monitoring can be used in conjunction with other security controls on the data, but the network perimeter represents the last hop before leaving the application for external use, and would be the appropriate place to have a DLP implementation.

You are working as a forensic investigator and collecting information on a potential system breach by a malicious insider. Which of the following is essential for you in order to ensure evidence is preserved and admissible? A. Confidentiality B. Privacy C. Chain of custody D. Aggregation

Chain of custody Explanation: In order for evidence to be considered properly preserved and admissible for legal proceedings, the chain of custody is essential. The chain of custody catalogs the life of the data, including its capture, being passed through various parties, any modifications or tests run against it, and any other information pertinent to everything and everyone who has touched it or accessed it. The chain of custody serves to preserve confidence in the data, as well as giving the ability to question or challenge it based on its handling and preservation.

During data discovery and classification, the use of metadata is a primary means for analysis. Which of the following would NOT be considered metadata? A. Column names B. Content C. Filenames D. Headers

Content Explanation: Metadata by definition is data about data, so the actual content of the data would not fit within this definition. Content analysis goes against the actual data and sources rather than against metadata.

Which phase of the cloud data lifecycle also typically entails the process of data classification? A. Use B. Store C. Create D. Archive

Create Explanation: Data classification should optimally be done immediately as part of the "create" phase. As soon as data is created, it is subjected to regulatory and other protection requirements, so classification must be applied immediately.

Which of the following actions do NOT fall under the "create" phase of the cloud data lifecycle? A. Newly created data B. Data that is imported C. Data that is archived D. Data that is modified

Data that is archived Explanation: Data that is archived involves taking already existing data and capturing a read-only static copy of it for long-term preservation. Although the data may be new to a particular storage medium, especially if it's archived on a system external from its original location, the format or content of the data is not changed during the process, and as such it's not newly created or modified data.

Different types of cloud deployment models use different types of storage from traditional data centers, along with many new types of software platforms for deploying applications and configurations. Which of the following is NOT a storage type used within a cloud environment? A. Docker B. Object C. Structured D. Volume

Docker Explanation: Docker is a container method for building, shipping, and running applications across different platforms and clients. It runs within its own agents and containers, is not a storage type under a cloud deployment model, and can be used with virtually any type of platform.

Which storage type is typically used by the cloud provider to house virtual machine images? A. Volume B. Structured C. Unstructured D. Object

Object Explanation: Object storage utilizes a flat hierarchy and catalogs data with an opaque file handler or descriptor. When storing virtual machine images, cloud providers typically use object storage because there is no reason to maintain an organized file structure, and an opaque descriptor works perfectly for virtual machine images, making them easy to retrieve on demand.

Which cloud storage type uses an opaque value or descriptor to categorize and organize data? A. Volume B. Object C. Structured D. Unstructured

Object Explanation: Object storage works by utilizing a flat file system, where each item is housed with an opaque handler or descriptor. When files are accessed, the object storage is called with the descriptor or opaque value and then presented to the application or client.
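The flat, descriptor-based access pattern can be sketched in a few lines of Python (the class and method names are invented; real object stores such as S3-style APIs differ in detail):

```python
import uuid


class ObjectStore:
    """Toy flat (non-hierarchical) store: every object is filed under an
    opaque descriptor rather than a path in a directory tree."""

    def __init__(self):
        self._objects = {}

    def put(self, data):
        descriptor = str(uuid.uuid4())   # opaque handle returned to the caller
        self._objects[descriptor] = data
        return descriptor

    def get(self, descriptor):
        # Retrieval is only ever by descriptor -- there are no folders
        # or paths to traverse.
        return self._objects[descriptor]


store = ObjectStore()
handle = store.put(b"...vm image bytes...")
store.get(handle)
```

The descriptor carries no meaning on its own, which is exactly why it suits artifacts like virtual machine images that need no place in a directory hierarchy.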

When DLP is used to protect data in use, where would the DLP solution and software be deployed? A. On the client B. On the application server C. Network perimeter D. Data layer

On the client Explanation: For a DLP solution to work for data in use, it would need to be installed on the actual client where the processing was taking place. This could be anything from a mobile device to a desktop computer, laptop, or software client. By being installed at the client level, the DLP solution would be able to closely watch how the data was being used and processed and to apply policies appropriately.

You are reviewing the standard offerings from a prospective cloud provider, and one area of log collection promises full and complete access to operating system logs for all provisioned systems. Which cloud service category is this MOST likely referring to? A. Platform B. Desktop C. Software D. Infrastructure

Infrastructure Explanation: With Infrastructure as a Service (IaaS), the cloud customer is responsible for the provisioning, configuration, and support of operating systems and virtual machines. As such, the customer has full access to all systems and operating system logs, whereas with the other service categories, these items would not be provided.

Although the preservation and retention of data are the most important concepts that usually come to mind when you're considering archiving, what process is equally important to test regularly for the duration of the required retention period? A. Recoverability B. Portability C. Encryption D. Availability

Recoverability Explanation: Over time, data that is archived can become inaccessible if an organization loses or deprecates the systems or technologies used to archive it, thus compromising its ability to restore that data. Regular tests should be done to ensure data can still be recovered, or the organization risks failing to meet regulatory requirements.

Which of the following is not a commonly accepted strategy for data discovery? A. Labels B. Metadata C. Signature hashing D. Content analysis

Signature hashing Explanation: Data discovery is focused on the three main areas: labels, metadata, and content analysis. Signature hashing may be used for some of these methods for comparison and efficiency, but it would be used as a tool or process rather than one of the key strategies.
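The three strategies can be sketched side by side on a single record. Everything here is invented for illustration: the field names, the label vocabulary, and the crude phone-number regex used for content analysis.

```python
import re

# Hypothetical record; the fields and label values are invented.
record = {
    "label": "confidential",                          # 1. label applied at creation
    "metadata": {"column_names": ["name", "phone"]},  # 2. data about the data
    "content": "Call Alice at 555-867-5309.",         # 3. the raw content itself
}


def discover(record):
    """Apply each of the three discovery strategies and report hits."""
    hits = []
    if record["label"] in {"confidential", "restricted"}:
        hits.append("label")
    if "phone" in record["metadata"]["column_names"]:
        hits.append("metadata")
    # Content analysis: pattern-match the data itself (slowest, and the
    # most sensitive to inconsistent formatting and spelling).
    if re.search(r"\b\d{3}-\d{4}\b", record["content"]):
        hits.append("content")
    return hits


print(discover(record))  # → ['label', 'metadata', 'content']
```

Note how brittle the content rule is compared to the label and metadata checks, which is the quality problem described earlier: a phone number written with spaces or dots would slip past this pattern.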

The final phase of the cloud data lifecycle is the destroy phase, where data is ultimately deleted, and done so in a secure manner to ensure it cannot be recovered or reconstructed. Which cloud service category poses the most challenges to data destruction for the cloud customer? A. Platform B. Software C. Infrastructure D. Desktop

Software Explanation: With Software as a Service (SaaS), data destruction often poses the most challenges, not only because it is solely the responsibility of the cloud provider, but also because most platforms tend to be large implementations with many customers or tenants using them. This makes assurances of data destruction more difficult because the methods and resources available can be limited.

Which of the following technologies or concepts could be used for the preservation of integrity? A. DNSSEC B. Encryption C. Tokenization D. Anonymization

DNSSEC Explanation: DNSSEC is an extension of the traditional DNS protocol that allows for the signing and verification of DNS results. When a query is issued to the DNS servers, the results are signed using keys that can authenticate that they came from an authoritative source. This strategy prevents DNS spoofing and redirection attacks, and is implemented in a way that does not require additional lookups or calls.

When data is required to be archived and retained for extended lengths of time, which of the following becomes the most pressing concern over time? A. Encryption B. Size C. Restoration D. Availability

Restoration Explanation: As time passes and organizations change out software and other technologies, it is imperative that the ability to restore archives is maintained. Given that some regulatory retention periods span several years or longer, and given the rapid pace of technology change and upgrades, it is very possible, without proper planning, for an organization to deprecate its capability to restore archives before they have reached their minimum retention time. When this happens, the organization would likely have to contract with a third party, likely at a very high cost, for restoration services should the need arise.

Which of the following types of solutions is often used for regulatory compliance reporting? A. SIEM B. DLP C. IRM D. IDS

SIEM Explanation: A security information and event management (SIEM) solution is used to collect, aggregate, and process event data throughout an application or even an entire infrastructure. It is often used to produce auditing and compliance reports based on the data that is collected and aggregated, and most SIEM solutions have very robust reporting, alerting, and dashboard capabilities.
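The collect-aggregate-report pattern at the heart of a SIEM can be sketched in miniature. The event shape and function name below are invented for illustration; real SIEM pipelines normalize many formats and sources.

```python
from collections import Counter
from datetime import datetime, timezone

# Hypothetical collected events; the fields are invented.
events = [
    {"source": "web-01", "type": "login_failure"},
    {"source": "web-01", "type": "login_failure"},
    {"source": "db-01",  "type": "privilege_change"},
]


def compliance_report(events):
    """Aggregate collected events into per-type counts -- the kind of
    roll-up a compliance or audit report is built from."""
    counts = Counter(e["type"] for e in events)
    return {
        "generated": datetime.now(timezone.utc).isoformat(),
        "totals": dict(counts),
    }


report = compliance_report(events)
# report["totals"] → {'login_failure': 2, 'privilege_change': 1}
```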

