CCSP


Virtualized environments are not risk-free and require protection. Which of the following is least likely a risk specific to virtualization? 1. A flawed VM Operating System (OS) or application could result in memory leaks 2. A flaw in the hypervisor could facilitate inter-VM attacks 3. Individual VMs can be starved of resources 4. Virtual machine disk image files may be accessed by malicious subjects

A flawed VM Operating System (OS) or application could result in memory leaks Explanation Memory management and efficient computational resource allocation are important for maintaining availability, but memory leaks are not a risk specific to virtualization; a flawed OS or application can leak memory on physical hardware just as easily. In addition, memory allocated to a VM is bounded: when you set up a VM, you specify the maximum amount of memory it can consume, and that memory is released when the VM is shut down.

Which of the following is true about microsegmentation? 1. Microsegmentation is the same as network segmentation 2. A fundamental design requirement of microsegmentation is understanding the protection requirements for east-west (traffic within a data center) and north-south (traffic to and from the internet) traffic flows 3. Microsegmentation is an optional activity of the Zero Trust Model, which aids in protecting against dynamic threats 4. Enhanced application security replaces the need for microsegmentation

A fundamental design requirement of microsegmentation is understanding the protection requirements for east-west (traffic within a data center) and north-south (traffic to and from the internet) traffic flows Explanation Microsegmentation pays close attention to traffic types (east-west, north-south) and creates policies that address specific protection surfaces. A principal design and activity of the Zero Trust Model is microsegmentation. While network segmentation relies on firewalls and VLANS for creating sub-networks and for governing access, microsegmentation takes a more granular approach. Application security is not a substitute for microsegmentation. Microsegmentation identifies the flows of data between the applications and controls the flow via granular policies for specific workloads.

You host a SaaS application in a public cloud environment and are concerned that government authorities can seize customers' data from a specific geographical location. What technique can you use to limit the exposure to this risk? 1. Data dispersion 2. Data tokenization 3. Data deletion 4. Data anonymization

Data dispersion Explanation With the data dispersion technique, data is fragmented, and the storage application writes each fragment to a different physical storage container to achieve greater information assurance. The approach resembles traditional RAID striping, except that the fragments are scattered across different physical devices and/or geographical locations, so no single jurisdiction holds a complete copy. Data anonymization and data tokenization are not methods to limit the exposure to the described risk, and deleting the data may not limit the risk depending on when the deletion occurs.
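The fragmentation idea can be sketched in a few lines. This is a minimal illustration, not a real dispersion algorithm (production systems add erasure coding and redundancy); the container names are invented stand-ins for storage in separate geographic locations.

```python
# Minimal data-dispersion sketch: each container holds only every Nth byte,
# so no single location stores a readable copy of the data.
containers = {"us-east": {}, "eu-west": {}, "ap-south": {}}

def disperse(object_id: str, data: bytes) -> None:
    names = list(containers)
    for i, name in enumerate(names):
        containers[name][object_id] = data[i::len(names)]

def reassemble(object_id: str) -> bytes:
    names = list(containers)
    fragments = [containers[name][object_id] for name in names]
    out = bytearray()
    for i in range(max(len(f) for f in fragments)):
        for f in fragments:
            if i < len(f):
                out.append(f[i])
    return bytes(out)

disperse("doc1", b"sensitive customer record")
assert reassemble("doc1") == b"sensitive customer record"
```

Reading any one container yields an unintelligible byte stream; only by combining fragments from all locations can the record be reconstructed.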

Which of the following is not a federation standard? 1. OpenID Connect 2. Lightweight Directory Access Protocol (LDAP) 3. Open Authorization (OAuth) 2.0 Authorization Framework 4. Security Assertion Markup Language 2.0 (SAML 2.0)

Lightweight Directory Access Protocol (LDAP) Explanation LDAP is not a federation standard. It is a protocol for directory services authentication.

What controls the entire infrastructure, parts of which will be exposed to customers independent of network location? 1. Control plane 2. Management plane 3. Control plane and management plane

Management plane Explanation The management plane controls the entire infrastructure. Parts of it will be exposed to customers independent of network location. It is a prime resource to protect.

Digital signatures provide mechanisms for achieving which of the following? Message confidentiality Non-repudiation of the message receiver Message integrity Message availability

Message integrity Explanation A digital signature creates a digest of the message, which is used to assure message integrity. Because the digest is signed using the private key of the message sender, non-repudiation of the sender (not the receiver) can also be achieved.
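The digest portion of this mechanism can be shown with the standard library alone. This sketch covers only the integrity check; a full digital signature would additionally sign the digest with the sender's private key (e.g., RSA or ECDSA), which is omitted here.

```python
import hashlib

def digest(message: bytes) -> str:
    # SHA-256 digest of the message; in a real digital signature this
    # digest would then be signed with the sender's private key.
    return hashlib.sha256(message).hexdigest()

original = b"Transfer $100 to account 42"
sent_digest = digest(original)

# The receiver recomputes the digest; a match assures integrity.
assert digest(b"Transfer $100 to account 42") == sent_digest
# Any tampering with the message changes the digest.
assert digest(b"Transfer $900 to account 42") != sent_digest
```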

Which of the following items should a patch management process not address? 1. Change management 2. Subscription mechanism to vendor patch notifications 3. Customer notification of applicable patches, if required 4. Vulnerability detection and evaluation by the vendor 5. Successful patch application verification 6. New hardware, software, and documentation

New hardware, software, and documentation Explanation New hardware, software, and documentation fall under deployment management as opposed to patch management.

Commoditization of cloud services is on the rise, increasing the importance of understanding risks, vulnerabilities, threats, and attacks related to the cloud environment. From a SaaS perspective, which of the following would not apply? 1. Regulatory compliance drift 2. Misuse of data by malicious insiders at the provider and consuming organizations 3. Potential firmware vulnerabilities 4. Lack of transparency concerning what data is within applications

Potential firmware vulnerabilities Explanation In a SaaS deployment, firmware vulnerabilities are the cloud service provider's responsibility and are not a consumer concern. From a SaaS perspective, considerations should include, but not be limited to, lack of transparency concerning what data is within applications, misuse of data by malicious insiders at the provider and consuming organizations, and regulatory compliance drift.

Storage components such as volume storage and object storage can be replaced by which of the following cloud or on-premises versions? Select all that apply. 1. Amazon Elastic File System 2. SAN 3. NAS 4. Amazon EBS 5. Azure Files 6. Azure Disks 7. Azure Blobs

SAN and NAS Explanation The SAN and NAS concepts of storage continue in the cloud. Some cloud providers have created specialized storage services or solutions for each specific use case.

A company is in the market for a security product that meets its unique needs. It must be confident that the candidate products are vetted using an internationally accepted standard. In this context, which of the following is true? 1. The company recognizes that FIPS 140-2 can serve its purpose 2. The company recognizes that the certification guarantees security and signifies a high level of confidence in what a product does 3. The company recognizes no internationally recognized guidelines and specifications for evaluating information security products; each country develops its own set of specifications for product evaluation 4. The company recognizes that ISO/IEC 15408 can serve its purpose

The company recognizes that ISO/IEC 15408 can serve its purpose Explanation Certification alone cannot guarantee a product's security. It can, however, provide confidence in a product's capabilities. ISO/IEC 15408 was developed to evaluate information security products.

The company has migrated most of its processes and data to the cloud. It uses cloud-based services for email, file storage, customer service, and HR. The data stored in the cloud includes both sensitive and nonsensitive data. The company will be storing and processing sensitive information and wishes to reduce the costs associated with maintaining a regulatory-compliant cloud storage solution. It is looking for alternatives that reduce the regulatory requirements associated with storing regulated data in the cloud while still being able to access the sensitive information. Which response provides the best solution to meet business requirements? 1. Data anonymization 2. Data masking 3. Tokenization 4. Encryption of all information

Tokenization Explanation Tokenization is the process of substituting a sensitive data element with a non-sensitive equivalent, referred to as a token. The token is usually a collection of random values with the shape and form of the original data, and it is mapped back to the original data by the tokenization application or solution. Tokenization removes the sensitive data entirely from the database, replacing it with a mechanism to identify and access the original value. Tokenization can assist with complying with regulations and laws, reducing the cost of compliance, and mitigating the risk of storing sensitive information. If encryption were used instead, the regulations would still apply whenever data was decrypted for processing or use in the cloud, and if the data were never decrypted, use or processing in the cloud would be infeasible.
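The substitution-and-mapping idea can be sketched briefly. This is a toy illustration with an in-memory vault; a real tokenization solution uses a hardened, access-controlled service, and the sample card number is illustrative.

```python
import secrets

# Hypothetical in-memory token vault (stand-in for a tokenization service).
vault = {}

def tokenize(value: str) -> str:
    # Replace the sensitive value with a random token that preserves its
    # length and digit-only "shape", then record the mapping in the vault.
    token = "".join(secrets.choice("0123456789") for _ in range(len(value)))
    vault[token] = value
    return token

def detokenize(token: str) -> str:
    # Only the tokenization solution can map the token back.
    return vault[token]

token = tokenize("4111111111111111")
assert len(token) == 16 and token.isdigit()
assert detokenize(token) == "4111111111111111"
```

Systems downstream of the vault store and process only the token, which carries no exploitable information, so they fall outside the scope of many data-protection requirements.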

'In the cloud, data storage can be manipulated into unauthorized usage, for example, by account hijacking or uploading illegal content. The multitenancy of cloud storage makes tracking unauthorized usage more challenging.' The above statement is a definition of which of the following terms? 1. Unauthorized usage 2. Theft or accidental loss of media 3. Unauthorized access 4. Liability due to regulatory noncompliance

Unauthorized usage Explanation This statement defines unauthorized usage.

Your organization provides public cloud services to international clients. Your company has purchased a building that is in proximity to a river. You consult an empirical almanac on weather conditions and note that the river floods once every 10 years. If a flood occurs, damage to the service asset will be 10 percent. The service asset has a replacement value of $10,000,000. What is the annualized loss expectancy (ALE)? 1. $100,000 2. $9,000 3. $10,000,000 4. $1,000,000

$100,000 Explanation Single-loss expectancy (SLE) = replacement value ($10,000,000) x exposure factor (10%) = $1,000,000. ALE = annualized rate of occurrence (ARO, 1/10) x SLE ($1,000,000) = $100,000.
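The calculation above can be verified with a few lines of arithmetic, using the scenario's figures:

```python
def sle(asset_value: float, exposure_factor: float) -> float:
    # Single-loss expectancy: expected loss from one occurrence.
    return asset_value * exposure_factor

def ale(aro: float, single_loss: float) -> float:
    # Annualized loss expectancy: expected loss per year.
    return aro * single_loss

asset_value = 10_000_000   # replacement value of the service asset
exposure_factor = 0.10     # 10% damage per flood
aro = 1 / 10               # one flood every 10 years

loss = sle(asset_value, exposure_factor)
assert loss == 1_000_000
assert ale(aro, loss) == 100_000
```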

Which of the following statements regarding the Key Management Interoperability Protocol (KMIP) are true? 1. KMIP is about enabling different key management system components to interoperate 2. KMIP can accurately be described as a supporting 'building block' technology for cryptography 3. It defines message formats for the manipulation of cryptographic keys on a key management server 4. KMIP is from OASIS (Organization for the Advancement of Structured Information Standards) 5. KMIP is not an open-source protocol

1-4 Explanation KMIP is an open standard from OASIS that defines message formats for the manipulation of cryptographic keys on a key management server, enabling different key management system components to interoperate. As such, it can accurately be described as a supporting 'building block' technology for cryptography.

As a result of multitenancy, multiple users can store their data using the applications provided by SaaS. Within these architectures, the data of various users will reside at the same location or across multiple locations and sites. What is a key security consideration when protecting user data? 1. Data aggregation 2. Data manipulation 3. Data segregation 4. Data encryption

Data segregation Explanation A SaaS model should ensure a clear segregation of each user's data, both at the physical and application levels. The service should be intelligent enough to segregate the data from different users.

In traditional networks, the control and data planes run in routers distributed across the network. In SDN, the control plane runs in a _____. 1. Centralized router that executes policy 2. Virtual Machine (VM) that controls the routers in the SDN 3. Logically centralized controller that controls the routers in the SDN 4. Logically decentralized controller that controls the routers in the SDN

A logically centralized controller that controls the routers in the SDN Explanation In SDN, the control plane runs in a logically centralized controller that controls the routers in the SDN.

'Helps the development team identify specific attacks against the application.' The above statement best describes which of the following? 1. Functional Testing 2. Use Case Testing 3. Abuse Case Testing 4. Nonfunctional Testing

Abuse Case Testing Explanation Abuse Case Testing helps the development team identify specific attacks against the application.

Which of the following are examples of possible stages in the key life cycle for a typical cryptographic key? Select all that apply. 1. Key storage 2. Key use 3. Key destruction 4. Key distribution 5. Key generation

All Explanation All of these options represent possible stages in the life cycle for a typical cryptographic key. Like any sensitive data, cryptographic keys have a life cycle: they are created, stored, used, and destroyed.

Which of the following is the correct description of the 'Subject's Unique Identifier'? 1. Algorithm used to sign the certificate 2. Owner of the public key 3. Allows for user-defined extensions to certificates 4. An optional field in case the public key owner used more than one X.500 name

An optional field in case the public key owner used more than one X.500 name Explanation This could be a case where the subject has more than one X.500 name, like in the military or government, where a user can be part of DLA with service to the Army. This optional field provides flexibility in identifying the subject accurately. 'Owner of the public key' refers to the Subject's Name. 'Algorithm used to sign the certificate' refers to the Algorithm Used for the Signature. 'Allows for user-defined extensions to certificates' refers to Extensions.

You are trying to explain the risks associated with Shadow IT. Which of the following will your discussion points likely include? 1. Open-source technologies or solutions may be procured, which always results in an increased attack surface and reduced security posture of the organization 2. Due care and diligence can be easily applied to applications or services that are procured outside the view of IT and security 3. An organization cannot manage what it cannot see; what is unmanaged is also unsafe 4. Organizations do not need to deliver training on the proper use of cloud applications

An organization cannot manage what it cannot see; what is unmanaged is also unsafe Explanation There are risks associated with unmanaged software. For example, proper patching may not be performed. When services are procured outside the view of IT and security, third-party security due diligence may not be performed appropriately. While any new software or service may increase the attack surface, the claim that open-source technologies and solutions always reduce security posture is overly broad. Organizations must deliver training on accepted cloud usage policies, and enforcement should be mandatory.

Which process best describes removing the indirect identifiers to prevent data analysis tools or other intelligent mechanisms from collating or pulling data from multiple sources to identify an individual or sensitive information? Anonymization Tokenization Overwriting Encryption

Anonymization Explanation Direct and indirect identifiers are the two primary components for identifying individuals, users, or personal information, and the described process of removing indirect identifiers best describes anonymization. Encryption primarily serves a different purpose, and its process differs from what is described. The tokenization process differs from what is described, and so does its objective. Overwriting is a method of sanitization that addresses the risk of data remanence on the media.
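The idea can be sketched as a filter over a record. The field names are illustrative assumptions; real anonymization also removes direct identifiers (name, SSN) and often generalizes rather than drops quasi-identifiers.

```python
# Assumed set of indirect (quasi-) identifiers that could be collated with
# other sources to re-identify an individual. Illustrative only.
INDIRECT_IDENTIFIERS = {"zip_code", "birth_date", "gender", "job_title"}

def anonymize(record: dict) -> dict:
    # Drop any field that is an indirect identifier, keeping the rest.
    return {k: v for k, v in record.items() if k not in INDIRECT_IDENTIFIERS}

record = {"diagnosis": "flu", "zip_code": "90210", "birth_date": "1980-01-01"}
assert anonymize(record) == {"diagnosis": "flu"}
```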

Users within a company leverage public cloud services to store company backup files for mobile devices and to provide additional storage space. Sometimes, essential company files are lost and cannot be retrieved. As a CCSP, you are tasked to derive a solution to the problem. Which do you think you should do first? 1. Research the best storage solution that meets the corporate business requirements 2. Conduct a risk assessment and provide risk treatment 3. Assess the organizational requirements so you can encourage governance and policies that meet the corporate business requirements 4. Do nothing, as typically problems of this nature resolve themselves

Assess the organizational requirements so you can encourage governance and policies that meet the corporate business requirements Explanation The organization first needs to have governance and a policy that specifies what can be done and who has the authority to decide to carry out actions.

What is the formula for calculating single-loss expectancy (SLE)? 1. Exposure Factor (as a % of loss) * Asset Value ($) = SLE 2. Exposure Factor (as a % of loss) / Asset Value ($) = SLE 3. Asset Value ($) / Exposure Factor (as a % of loss) = SLE 4. Asset Value ($) * Exposure Factor (as a % of loss) = SLE

Asset Value ($) * Exposure Factor (as a % of loss) = SLE Explanation Single-loss expectancy (SLE) is defined as the difference between the original value and the remaining value of an asset after a single exploit. As such, the formula for calculating SLE is Asset Value ($) * Exposure Factor (as a % of loss) = SLE. As this is a multiplication, it does not matter which order the terms are in.

An organization is concerned that the critical services it consumes from the cloud are vulnerable to several threats and has a quantified list of what would happen if it lost availability to these services. It acquired countermeasures to recover service availability within a predetermined time to avoid experiencing permanent business failure. What activity is the organization currently engaged in, and what is the next step? 1. Cyber kill chain review, next consider weaponization 2. Cyber kill chain rules, next consider group decision making 3. Business continuity management, next contemplate and decide upon appropriate test scenarios 4. Business continuity management, next garner leadership sponsorship

Business continuity management, next contemplate and decide upon appropriate test scenarios Explanation The organization is involved in business continuity management. If it has acquired countermeasures by following the BCMS process flow, it already received leadership sponsorship at the initiation of the process, so the next step is to test the solution. Having services available within a predetermined period also serves as a specific indicator of the activity the organization is currently involved in.

You are about to purchase movie tickets, but the website offering them asks you for your parents' names, which you think is excessive to purchase the tickets. Based on the OECD's privacy recommendations, which principle is not being followed by the website? Data Quality Principle Openness Principle Collection Limitation Principle Purpose Specification Principle

Collection Limitation Principle Explanation The Collection Limitation Principle states that there should be limits to the collection of personal data, and any such data should be obtained by lawful and fair means and, where appropriate, with the knowledge or consent of the data subject. The Purpose Specification Principle states that the purposes for which personal data is collected should be specified not later than the time of data collection, and the subsequent use should be limited to the fulfillment of those purposes or such others as are not incompatible with those purposes and as are specified on each occasion of change of purpose. The Openness Principle states that there should be a general policy of openness about developments, practices, and policies concerning personal data. The Data Quality Principle states that personal data should be relevant to the purposes for which they are to be used and, to the extent necessary, should be accurate, complete, and kept up to date.

Your organization is planning to move to the cloud and is evaluating various cloud service providers. One of the main factors for selection is their security posture. What industry standard tool can you use to assess the overall security capabilities of a cloud provider? 1. Cloud Assessment Questionnaire 2. Cloud Security Assessment Checklist 3. Cloud Security Risk Checklist 4. Consensus Assessments Initiative Questionnaire

Consensus Assessments Initiative Questionnaire Explanation The Consensus Assessments Initiative Questionnaire (CAIQ) is a survey provided by the Cloud Security Alliance (CSA) for cloud consumers and auditors to assess the security capabilities of a cloud service provider. The CAIQ was developed to create commonly accepted industry standards to document the security controls in infrastructure-as-a-service, platform-as-a-service, and software-as-a-service applications. The provider distributes CAIQ to potential customers or publicly posts it in the STAR registry.

Two main categories of digital rights management (DRM) are: 1. Audio DRM and video DRM 2. Open-source DRM and proprietary DRM 3. Consumer DRM and enterprise DRM 4. One-way DRM and two-way DRM

Consumer DRM and enterprise DRM Explanation Consumer DRM aims at controlling copying, execution, and alteration of media such as audio, video, and e-books. Enterprise DRM focuses on protecting enterprise assets such as documents and email through the implementation of usage rights policies.

The company has migrated most of its processes and data to the cloud. It uses cloud-based services for email, file storage, customer service, and HR. The data stored in the cloud includes both sensitive and nonsensitive data. The company is researching different cloud data storage models for regulated data. As a result, the company wants to ensure it has control over where data is stored and define the jurisdictions that will apply to that location. Which storage type would make meeting the business requirement most challenging? 1. Content delivery network 2. Long-term storage 3. Raw storage 4. Ephemeral storage

Content delivery network Explanation With a content delivery network, content is stored in object storage, which is then distributed to multiple geographically distributed nodes to improve internet consumption speed. Due to the distributed nature of this model, there is the potential to cross jurisdictions, which may change legal requirements associated with the data.

ISO/IEC 27018 focuses on protecting personal data in the cloud. Cloud service providers adopting this standard must operate under five key principles. Which principle provides customers with explicit control of how their information is used? Transparency Consent Control Communication

Control Explanation Customers have explicit control of how their information is used. Cloud service providers must inform customers where their data resides, disclose the use of subcontractors to process PII, and make clear commitments about how that data is handled.

Choose the option that shows the secure data life cycle phases in the correct order. 1. Create, Use, Archive, Remove 2. Use, Archive, Create, Destroy 3. Create, Use, Archive, Destroy 4. Use, Store, Archive, Destroy

Create, Use, Archive, Destroy Explanation The correct order as presented is Create, Use, Archive, and Destroy. Remove is not one of the secure data life cycle phases.

The secure data lifecycle comprises six phases. What are they? 1. Create, store, use, share, archive, and destroy 2. Classify, use, store, share, archive, and sanitize 3. Classify, store, use, share, archive, and destroy 4. Create, use, store, share, archive, and sanitize

Create, store, use, share, archive, and destroy Explanation The six phases of the data lifecycle are create, store, use, share, archive, and destroy.

An organization has been notified that a spurious or bogus subdomain (sometimes.heretohelp.biz) of its legally owned domain (heretohelp.biz) is circulating misinformation and fraudulent transactions. What is the most likely cause that led to the appearance of a spurious subdomain? 1. DNS shadowing 2. Corrupt PTR (reverse lookup) record 3. Resolver A record mismatch 4. Amplification and reflection

DNS shadowing Explanation DNS shadowing is a threat where the attacker gains access to the domain registrant's account and creates subdomains from the parent domain of the victim to draw unsuspecting visitors to bogus sites that are hosted under the newly created subdomains.

Which data governance role is responsible for applying access controls, meeting compliance requirements, retaining data, or other protection requirements that have been defined by the data owner? Data Custodian Chief Information Security Officer (CISO) InfoSec Manager Data Steward

Data Custodian Explanation Data custodianship is a data governance role that handles data assets at some point in their life cycle. They become "custodians" of the data asset at one or more points in the data life cycle and are responsible for applying access controls, meeting compliance requirements, retaining data, or other protection requirements that the data owner has defined. They typically do not set or define rules; they apply and enforce them.

Which of the Organization for Economic Cooperation and Development (OECD) principles states, "Personal data should be relevant to the purposes for which they are to be used, and, to the extent necessary for those purposes, should be accurate, complete, and kept up to date"? Openness Principle Collection Limitation Principle Data Quality Principle Security Safeguards Principle

Data Quality Principle Explanation This statement is OECD's Data Quality Principle. The Security Safeguards Principle states that personal data should be protected by reasonable security safeguards against such risks as loss or unauthorized access, destruction, use, modification or disclosure of data. The Collection Limitation Principle states that there should be limits to the collection of personal data and any such data should be obtained by lawful and fair means and, where appropriate, with the knowledge or consent of the data subject. The Openness Principle states that there should be a general policy of openness about developments, practices, and policies with respect to personal data.

'Sorting the data based on locations, compliance requirements, ownership, business usage, or value to the organization.' The above statement is a definition of which of the following terms? 1. Legislation, regulation, and standards requirements 2. Data mapping 3. Data classification 4. Data retention procedure 5. Monitoring and maintenance

Data classification Explanation Data classification is sorting the data based on locations, compliance requirements, ownership, business usage, or value to the organization.

Data ownership is a data governance role. Data owners ultimately define what is allowed or required in managing a type or specific set of data. Responsibilities associated with this role typically do not include defining rules around which of the following? Data access Data retention Data collection Data protection

Data collection Explanation Data owners may define rules and requirements around data protection and do the same for data access and retention. Data stewards typically ensure that data is collected correctly.

Regarding data discovery and classification, which of the following statements is correct? 1. Most organizations today can map exactly where their sensitive data is located and what security controls are deployed to guard it 2. Data discovery normally occurs after data classification 3. Essentially, data discovery is asking the question, "What types of data do we have, and where is it located?" 4. Data discovery is always a fully manual process with human review and decision-making

Data discovery asks, "What types of data do we have, and where is it located?" Explanation The objective is to determine the data types and their locations (e.g., within the IT infrastructure). Despite the large volume of controls and efforts to protect various data types, only some organizations can map exactly where their sensitive data is located and what security controls are deployed to guard it. Although good definitions for data classifications can make discovery more effortless, classification is derived from the data types on hand. Data discovery commonly occurs before classification. There are many potential approaches to data discovery. Data discovery can be entirely manual with human review and decision-making or automated using tools.
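An automated approach to the discovery question can be sketched as pattern scanning. The patterns and sample strings below are illustrative assumptions; real discovery tools use much richer detection (checksums, context, machine learning) across drives, databases, and object storage.

```python
import re

# Assumed detection patterns for two common sensitive data types.
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
}

def discover(blob: str) -> set:
    # Return the names of the sensitive data types found in the text.
    return {name for name, pat in PATTERNS.items() if pat.search(blob)}

assert discover("Contact: alice@example.com, SSN 123-45-6789") == {"ssn", "email"}
assert discover("No sensitive data here") == set()
```

Running such scans across storage locations answers both halves of the discovery question: what types of data exist, and where they reside.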

'This defines data types (structured, unstructured), data formats, file types, and data location (network drives, databases, object or volume storage).' The above statement is a definition of which of the following terms? 1. Monitoring and maintenance 2. Data mapping 3. Data retention procedure 4. Data classification 5. Legislation, regulation, and standards requirements

Data mapping Explanation Data mapping defines data types (structured, unstructured), data formats, file types, and data location (network drives, databases, object or volume storage).

Who holds legal rights and complete control over data elements? Data user Data owner Data steward Data custodian

Data owner Explanation The data owner is responsible and accountable for defining rules for a data asset and controlling what is allowed or required in managing a type or specific set of data. Data custodians are responsible for the safe custody, transport, and storage of the data and for implementation. Data users are individuals who use data assets to perform some business function or task. A data steward is responsible for overseeing data quality and for implementing business rules, not for holding legal rights over the data.

As an architecture, Software-Defined Networking (SDN) aims to simplify network management and make networks more flexible. SDN is defined by three separate planes or layers. Which plane are network devices such as switches and routers located at? Control plane Management plane Network plane Data plane

Data plane Explanation The SDN planes are the management plane, the control plane, and the data plane. Network devices such as switches and routers reside at the data plane, which is associated with the infrastructure.

Which of the following is not included in the cloud infrastructure components? 1. Virtualization 2. Network and communication 3. Data security 4. Storage Area Network (SAN)

Data security Explanation Data security is a benefit and concern of cloud computing rather than an infrastructure component. Cloud resources are delivered over the internet, and communication channels allow information to travel from cloud systems to devices. Storage area networks are primarily used to enhance storage devices, and through virtualization users get a virtual version of the hardware resources.

Which term best describes the implied or explicit right to decide what treatment, care, or disposition (embargo or movement) a nation or state can determine on data by means of its laws? Data sovereignty Due diligence Due care Data residency

Data sovereignty Explanation This concept is best described using the term data sovereignty. Due diligence is a code of conduct. Due care is a standard of behavior grounded in the concept of reasonableness. Data residency is focused on the geographical area where data is stored, likely for regulatory reasons.

Your company wants to access its data anytime from anywhere. As a result, it has decided to begin migrating from local file servers to cloud storage. You have been tasked with identifying the appropriate controls to apply. Part of defining the access control model is defining actors, functions, and locations. When defining the functions that can be performed, what controls best address the process or use function? 1. Database access monitors, information rights management, and data loss prevention 2. Encryption, information rights management, and crypto-shredding 3. Information rights management, data classification, and data destruction 4. Data classification, data destruction, and encryption

Database access monitors, information rights management, and data loss prevention Explanation The process function occurs during the create and use phases of the cloud data life cycle. Data in use is most vulnerable because it can be transported to an insecure location, such as a workstation, and to be processed it must be unencrypted. Controls such as data loss prevention (DLP), information rights management (IRM), and database and file access monitors should be implemented to audit data access and prevent unauthorized access.

Fundamental cloud computing services do not include which of the following? 1. Database services 2. Compute services 3. Storage services 4. Networking services

Database services Explanation Fundamental cloud computing services include compute services, storage services, and networking services. While many cloud services offer a database, it is considered a subset of storage services.

Your company wants to access its data any time from anywhere. As a result, it has decided to begin migrating from local file servers to cloud storage. You have been tasked with identifying the appropriate controls to apply. You have been tasked with developing an access control model for the company's cloud storage. First, you review the locations from which data can be accessed. What control would best allow the control of access without being too restrictive? 1. Define the user base and restrict access based on CIDR blocks 2. Only allow access from within the corporate network 3. Define access based on geolocation 4. Block access from known hostile IP networks

Define access based on geolocation Explanation Identifying the location of the user base and limiting access based on geolocation of the device attempting access would fulfill this requirement. Geolocation access control would be one part of an overall access control model.
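As an illustrative sketch only (the country codes and the IP-to-country lookup below are hypothetical stand-ins for a real GeoIP service), a geolocation allowlist check might look like:

```python
# Minimal sketch of geolocation-based access control.
# ALLOWED_COUNTRIES is a hypothetical policy; country_of() stands in
# for a real GeoIP database lookup.
ALLOWED_COUNTRIES = {"US", "CA", "GB"}

def country_of(ip):
    # Placeholder: a real implementation would query a GeoIP service.
    demo_map = {"203.0.113.7": "US", "198.51.100.9": "RU"}
    return demo_map.get(ip, "UNKNOWN")

def access_allowed(ip):
    """Permit access only from countries on the allowlist."""
    return country_of(ip) in ALLOWED_COUNTRIES
```

In practice this check would be one layer of the overall access control model, combined with authentication and network-level controls.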

Which of the following is not included in audit planning? Define remediation plan Refine audit process from lessons learned Fieldwork Define audit scope

Define remediation plan Explanation Defining a remediation plan is not part of audit planning; remediation is addressed after audit findings have been reported. Defining the audit scope, performing fieldwork, and refining the audit process from lessons learned are all part of the audit process.

Cloud service providers offer tools for event log collection, correlation, analysis, and alerting. Regardless of the tool used, there are some challenges to overcome. Which of the following is least likely to be considered as one of these challenges? Prioritizing alerts based on the level of risk Determining the log retention period Establishing a definition and picture of what normal looks like before deviations and anomalies are detected Weeding out false positives

Determining the log retention period Explanation Log retention can be determined by considering internal policies and external requirements posed by applicable regulations and laws. Reducing or eliminating false positives and determining the risk level to prioritize alerts are common challenges. To be able to detect anomalies, a definition and picture of what constitutes normal would be needed.

You have heard of DevOps and understand the basic principles behind it. DevSecOps, on the other hand, is somewhat of a hazy concept. Your research on this topic indicates that: DevSecOps concerns bridging the gap between security and DevOps DevSecOps is no different than DevOps, with the only exception being that it requires more frequent penetration testing All the traditional application security activities that you are familiar with from your previous experience with the Waterfall model can be adopted as-is in this new environment Security must be sacrificed to allow development to deliver more code more frequently

DevSecOps concerns bridging the gap between security and DevOps Explanation 'DevSecOps concerns bridging the gap between security and DevOps' captures the essence of why DevSecOps is needed. Sacrificing security for the rapid delivery of code should not be considered an option. DevSecOps is not about more frequent penetration testing. The traditional resource-intensive and heavyweight security activities and controls at various touchpoints throughout the software development life cycle must change to allow for the adaptation principles that have proven successful for DevOps.

You recognize that a complete cloud solution will incorporate many technical components and configurations to help you achieve your security objectives. In this context, which control meets the objective of preventing unauthorized data viewing? 1. Identity and access management 2. Obfuscation, anonymization, tokenization, and masking 3. Data leakage/loss prevention 4. Encryption

Encryption Explanation Encryption prevents unauthorized data viewing. Data leakage/loss prevention audits and prevents unauthorized data exfiltration. Obfuscation, anonymization, tokenization, and masking are alternatives for data protection without encryption. Identity and access management controls who has access.

Which of the following are features of DRM? Select all that apply. 1. DRM is useful for protecting sensitive organization content. 2. DRM is agnostic to the location of the data. 3. DRM adds an extra layer of access controls on top of the data object or document. 4. DRM can provide a baseline for the default information protection policy.

Explanation All of these responses are features of DRM.

In this application security testing method, the application under test is validated against its requirements considering the inputs and expected outputs, regardless of how the inputs are transformed into outputs. Testers are least concerned with internal structure or code that implements the business logic of the application: Black-box testing Interactive Application Security Testing (IAST) White-box testing Functional testing

Black-box testing Explanation Black-box testing is completed without knowledge of the internals of the application: the application under test is validated against its requirements considering only the inputs and expected outputs. Functional testing validates that an application performs to the functional descriptions and requirements outlined in the software requirements specification. White-box application security testing is conducted with knowledge of the internals of the application and validates how the business logic of the application is implemented. IAST is best suited for web applications and web APIs; it performs behavioral analysis and relies on a combination of static and dynamic application security testing methods, with visibility into both the source code and the execution flow during runtime.

'The practice relies on the automation of many routine tasks that turn code changes into working software.' The above statement best describes which of the following? Waterfall Model DevOps/DevSecOps Continuous Integration/Continuous Deployment Agile Model

Explanation Continuous Integration/Continuous Deployment relies on automating many routine tasks that turn code changes into working software.

Which of these identifies potential security vulnerabilities of web applications and their infrastructure? Interactive Application Security Testing (IAST) Dynamic Application Security Testing (DAST) Software Composition Analysis (SCA) Static Application Security Testing (SAST)

Explanation Dynamic Application Security Testing (DAST) aims to identify potential security vulnerabilities in web applications and their infrastructure. This requires the application to run in production or production-like environments for testing to be accurate and is also suitable for identifying configuration issues. SAST describes a set of technologies used to analyze application source code, byte code, and binaries for coding and design conditions that can indicate security vulnerabilities. Third-party/open-source components are an essential part of contemporary software development, and critical vulnerabilities identified in such components, libraries, and frameworks highlight the need for SCA. Best suited for web applications and web APIs, IAST aims to perform behavioral analysis.

Implementation of appropriate security architecture for the cloud is essential to withstand cyberattacks. Which of the following will likely contribute to potential data exposures in this context? Complete documentation of risks associated with cloud migration Assuming that cloud migration is always a lift-and-shift endeavor of porting existing IT stacks and security controls to a cloud environment Using a single instance of the software and its supporting infrastructure to serve many clients A complete understanding of the shared security responsibility model

Explanation Migration to the cloud simply as a lift-and-shift endeavor without proper planning and strategy can lead to vulnerabilities and data exposure. The lack of understanding of the parties' security responsibilities in the cloud environment may leave gaps, which can lead to vulnerabilities and data exposure. Documenting the risks associated with a cloud migration is more likely to reduce potential exposures than to contribute to them. Using a single instance of the software and its supporting infrastructure to serve multiple clients may have associated concerns in general, but the question is within the context of implementing appropriate security architecture.

NIST supports the FIPS 140-2/3 cryptographic validation program with which of the following? Cryptographic Algorithm Validation Program (CAVP) Testing by independent commercial laboratories Cryptographic Module Validation Program (CMVP) Testing by standards agencies from EU countries

Explanation NIST supports this program with CAVP, CMVP, and testing by independent commercial laboratories. NIST does not support this program with testing from standards agencies in other countries; NIST is an American institute.

What are trusted platform modules (TPMs) comprised of? Select all that apply. Processor ROM Hash functions Flash memory RAM Key generation

Explanation Processor, RAM, ROM, and Flash Memory comprise the Trusted Platform Modules. The cryptographic subsystem implements the TPM's cryptographic functions, and some of those operations include hash functions and key generation.

What are the advantages of symmetric key algorithms? Select all that apply. They are very fast They are the best kind of key algorithms They are easy to use They are secure They are cheap

Explanation Symmetric key algorithms are usually very fast, secure, and cheap. They are not necessarily easy to use or the best key algorithms.

'These two concepts build on each other. One includes integrating development and operations processes; the other adds incident management and vulnerability testing.' The above statement best describes which of the following? Agile Model Waterfall Model Continuous Integration/Continuous Deployment DevOps/DevSecOps

Explanation The DevOps/DevSecOps concepts build on each other. One includes integrating development and operations processes. The other adds incident management and vulnerability testing.

Which of the following is best described as form-factor independent executable packages of software that are installed and maintained upon a host operating system and have everything needed to run an application? Microservices Infrastructure as a Service (IaaS) Containers Function as a Service (FaaS)

Explanation The description in the question best fits containers. IaaS is considered a service model and is not a good fit for the description provided. Microservices form an architecture that segregates functions of an application into discrete, decentralized, and business objective-driven processes. While there are important commonalities between FaaS and containers, they are not the same thing. With FaaS, consumers do not manage the underlying infrastructure and focus on the code.

True or False: REST is designed to be highly extensible. False True

False Explanation SOAP is the heavyweight choice for web service access and is designed to be highly extensible. REST provides a lighter-weight alternative and relies on a simple URL instead of XML. SOAP is designed to support expansion, so it has all sorts of other acronyms and abbreviations associated with it, such as WS-Addressing, WS-Policy, WS-Security, WS-Federation, WS-ReliableMessaging, WS-Coordination, WS-AtomicTransaction, and WS-RemotePortlets. You can find a laundry list of these standards on Web Services Standards. The point is that SOAP is highly extensible, but you only use the pieces you need for a particular task. For example, when using a public web service that's freely available to everyone, you really don't have much need for WS-Security.

Which of the following are included in the NIST Framework for Improving Critical Infrastructure Cybersecurity? Select all that apply. Framework Profiles Risk Treatment Framework Core Framework Implementation Tiers

Framework Profiles Framework Core Framework Implementation Tiers Explanation The NIST Framework for Improving Critical Infrastructure Cybersecurity includes framework core, framework implementation tiers, and framework profiles. Risk treatment is a general term for doing something about risk that has been identified and evaluated.

In the context of web services, most services found on the web today exclusively use REST and exchange all required information using a URL and four primary hypertext protocol calls that include which of the following? GET, SET, CALL, and PUT GET, PUT, CALL, and DELETE GET, SET, GO, and REST GET, POST, PUT, and DELETE

GET, POST, PUT, and DELETE Explanation The four primary HTTP methods (often called verbs) used by REST services are GET, POST, PUT, and DELETE.
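As a rough sketch of how these four verbs map onto create/read/update/delete operations (the in-memory dictionary below is a hypothetical stand-in for a real resource store, not an actual HTTP server):

```python
# Toy in-memory "resource" collection showing how the four primary
# HTTP methods map onto create/read/update/delete semantics.
resources = {}

def handle(method, rid, body=None):
    if method == "POST":        # create a new resource
        resources[rid] = body or {}
    elif method == "GET":       # read an existing resource
        return resources.get(rid)
    elif method == "PUT":       # update (replace) a resource
        resources[rid] = body or {}
    elif method == "DELETE":    # delete a resource
        resources.pop(rid, None)

handle("POST", 1, {"name": "report"})
handle("PUT", 1, {"name": "report-v2"})
```

A real REST service would express the resource identifier in the URL (e.g. `/reports/1`) rather than as a function argument, but the verb-to-operation mapping is the same.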

Which of the following acts includes the Financial Privacy Rule? Australian Privacy Principles Gramm-Leach-Bliley Act (GLBA)

Gramm-Leach-Bliley Act (GLBA) Explanation The Financial Privacy Rule is part of the GLBA and regulates the collection and disclosure of private financial information.

Which of the following statements about hashing are true? You can select all that apply. Hashing can be defined as one-way encryption. Naturally occurring hash collisions are statistically unlikely but can theoretically occur. Hashing is non-deterministic There is an infinite possible number of hash outputs for any algorithm

Hashing can be defined as one-way encryption. Naturally occurring hash collisions are statistically unlikely but can theoretically occur. Explanation Naturally occurring hash collisions are statistically unlikely but can theoretically occur. A hash procedure must be deterministic—meaning that for a given input value it must always generate the same hash value. In other words, it must be a function of the data to be hashed in the mathematical sense of the term. Hashing is deterministic, and any algorithm has a finite (if very large) possible number of hash outputs.
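Both properties can be demonstrated with Python's standard hashlib module: the same input always produces the same digest, and the digest has a fixed length, so the output space is finite.

```python
import hashlib

# Hashing is deterministic: identical inputs yield identical digests.
a = hashlib.sha256(b"CCSP").hexdigest()
b = hashlib.sha256(b"CCSP").hexdigest()
assert a == b

# The output size is fixed (SHA-256 -> 256 bits = 64 hex characters),
# so there are finitely many (2**256) possible digests, which is why
# collisions can theoretically occur even though they are statistically
# unlikely.
assert len(a) == 64
```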

An attacker infected a hypervisor with a rootkit virus and now can control all virtual machines hosted in that environment. What is this type of attack called? 1. VM Attack 2. Hyperjacking 3. Ransomware 4. VM Takeover

Hyperjacking Explanation Hyperjacking is an attack in which the attacker takes control of the hypervisor, for example by installing a rogue or rootkit-infected hypervisor beneath the legitimate one. Because the hypervisor operates below the guest operating systems, the attacker gains control of every virtual machine hosted in that environment, and security tools running inside the VMs may be unable to detect the compromise.

The company uses a custom application that services financial clients. The application is hosted on company servers within its confidential data center. All company servers run specific configurations and a software firewall. The company wants a test environment that offers control over the environment to ensure that the configuration settings of both environments are alike but wants to minimize the costs and time involved with setting up the environment. Which service model best meets the business requirements? 1. PaaS/hybrid 2. SaaS/private 3. IaaS/public 4. IaaS/private

IaaS/Public Explanation IaaS, in a public offering, will provide the most control over the environment, ensuring that configuration settings of the operating system, local software firewall settings, and applications in the cloud mirror the on-premises environment. Employing a public offering reduces the costs associated with setting up the environment because the compute resources are shared among various organizations. No requirement to isolate the environment in a hybrid or private model was identified in the scenario. The capability to self-provision a new environment will occur quickly in a public/IaaS environment, making this a desirable approach.

What are the main elements of a data discovery approach? Select all that apply. Identification of specific data Mechanisms for identification Data locations

Identification of specific data Mechanisms for identification Data locations Explanation Identification of specific data, data location, and mechanisms for identification are the three main elements of data discovery.

Capabilities common to data rights management (DRM) solutions include which of the following? 1. Identifying sensitive organization content 2. Allowing document access at any time 3. Tracking all document activity through a complete audit trail 4. Enabling copy/paste and screen capture capabilities

Identifying sensitive organization content Explanation DRM is useful for protecting sensitive content, but the organization must identify what content is sensitive. Disabling copy/paste and screen capture capabilities, expiring or revoking document access capability, and tracking all document activity are common to DRM solutions.

Which of the following is not a true statement with respect to misuse and abuse cases? Various security-oriented methodologies (e.g., Microsoft SDL) regard misuse and abuse cases as an effective tool for modeling security requirements In contrast to use cases, misuse and abuse cases are reflections of the end-user views Abuse cases can help with the determination of safeguards and countermeasures. Abuse cases can help the development team identify specific attacks against the application.

In contrast to use cases, misuse and abuse cases are reflections of the end-user views Explanation In contrast to use cases, misuse and abuse cases do not reflect the end-user views. Their development requires thinking from the perspective of malicious subjects who are aiming to inflict damage.

Your company wants to access its data anytime from anywhere. As a result, it has decided to begin migrating from local file servers to cloud storage. You have been tasked with identifying the appropriate controls to apply. You are reviewing the corporate strategy for using the cloud storage system to share data with employees and clients. Which controls are candidates during the sharing phase of the cloud data life cycle? 1. Information rights management, data destruction, and access controls 2. Data loss prevention, data destruction, and data classification 3. Information rights management, data loss prevention, and encryption 4. Data classification, backups, and encryption

Information rights management, data loss prevention, and encryption Explanation Sharing data may occur between employees, customers, and partners. Once the data is shared outside the organization, it is no longer under the control of the organization. Technologies such as DLP can be used to detect unauthorized sharing of sensitive information, DRM technologies can be used to maintain control over the information (e.g. restricting printing and further sharing), and encryption can be used to safeguard the transportation and future storage of sensitive information.

What can privacy maturity models help with? Select all that apply. Internal assessment for self-improvement Comparison with industry or other entities The creation of values and benefits Identification of areas requiring investment

Internal assessment for self-improvement Comparison with industry or other entities Identification of areas requiring investment Explanation Privacy maturity models are a recognized means by which organizations can measure their progress against established benchmarks. As such, privacy maturity models can help with internal assessments, comparison with industry and other entities, and identification of areas requiring investment. The creation of values and benefits would only be possible if an organization achieves a high maturity level.

From the perspective of the cloud service provider (CSP), a service-level agreement (SLA): Does not establish legal binding and should be considered as best efforts Is unnecessary, as the elastic nature of a cloud service will meet the needs of all service consumers Is needed only for critical services as determined by the service provider Is by design the perspective of the CSP

Is by design the perspective of the CSP Explanation An SLA is, by design, written from the CSP's perspective: it is a legally binding agreement that represents the specific criteria the cloud service provider must meet for the customer. SLAs should be considered formal and legally binding agreements. The elastic nature of a cloud service does not eliminate the need for an SLA, and it is not up to the CSP to determine which services are critical for service consumers.

What statement is true about Trusted Platform Module (TPM) architecture? It contains RAM, processor capability, and cryptographic algorithms and depends on external storage. It contains RAM, processor capability, cryptographic algorithms, and BIOS and is dependent on external storage. It contains RAM, processor capability, and cryptographic algorithms and has no storage. It contains RAM, processor capability, and cryptographic algorithms and depends on internal embedded storage.

It contains RAM, processor capability, and cryptographic algorithms and depends on internal embedded storage Explanation TPM architecture contains RAM, ROM, flash memory, processor capability, and cryptographic algorithms and depends on internal embedded storage.

Which of the following is true about benefits of symmetric encryption? It can help achieve confidentiality and non-repudiation It requires only a single private key, hence, posing less key management challenges as compared to asymmetric encryption It is slower compared to asymmetric encryption It is computationally less expensive as compared to asymmetric encryption

It is computationally less expensive as compared to asymmetric encryption Explanation Symmetric encryption is computationally less expensive when compared to asymmetric encryption. Symmetric encryption is fast in comparison with asymmetric encryption. Use of a single encryption key creates more challenges for key management. Symmetric encryption may only be used to achieve the confidentiality objective.
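The key-management difference can be quantified: with symmetric encryption, every pair of n communicating parties needs its own shared secret, i.e. n(n-1)/2 keys in total, whereas asymmetric encryption requires only one key pair (two keys) per party. A quick sketch:

```python
def symmetric_keys(n):
    # One shared secret per pair of communicating parties:
    # n choose 2 = n * (n - 1) / 2 keys in total.
    return n * (n - 1) // 2

def asymmetric_keys(n):
    # One public/private key pair (two keys) per party.
    return 2 * n

# For 100 parties: 4950 shared secrets vs. 200 keys.
print(symmetric_keys(100), asymmetric_keys(100))
```

This growth is why a purely symmetric system poses greater key management challenges as the number of participants increases, even though each individual symmetric operation is computationally cheaper.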

Which of the following best describes how pseudonymization works at a high level? 1. It works by combining concepts from tokenization and anonymization applied to privacy information 2. It works by adding a layer of masking between the application and the database on the fly 3. It works by replacing or appending the value with a random value 4. It works by using specific characters to hide certain parts of the data, usually applied for credit card data formats

It works by combining concepts from tokenization and anonymization applied to privacy information Explanation Pseudonymization replaces direct identifiers with pseudonyms (tokens) while a separate means of re-identification is retained, combining concepts from tokenization and anonymization applied to privacy information. 'It works by adding a layer of masking between the application and the database on the fly' best describes dynamic masking. 'It works by using specific characters to hide certain parts of the data, usually applied for credit card data formats' best describes the process of masking in general. 'It works by replacing or appending the value with a random value' best describes random substitution.
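As an illustrative sketch only (the vault and function names are hypothetical), pseudonymization can be implemented by replacing an identifier with a random token while keeping the token-to-value mapping in a separately secured store:

```python
import secrets

# Hypothetical mapping store; in practice this would be held
# separately from (and secured independently of) the data set.
token_vault = {}

def pseudonymize(value):
    """Replace an identifier with a random token and record the mapping."""
    token = secrets.token_hex(8)
    token_vault[token] = value
    return token

def reidentify(token):
    # Re-identification is only possible with access to the vault.
    return token_vault[token]
```

The data set itself then carries only tokens; without the vault, the tokens reveal nothing about the original identifiers, which is what distinguishes pseudonymized data from plainly stored data.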

'Certain controls such as encryption might be required to comply with certain regulations. Not all cloud services enable all relevant data controls.' The above statement is a definition of which of the following terms? Unauthorized access Unauthorized usage Theft or accidental loss of media Liability due to regulatory noncompliance

Liability due to regulatory noncompliance Explanation The statement best defines liability due to regulatory noncompliance.

Select the global reference architecture organizations use to design and assess cloud services and infrastructure. 1. Cloud Security Alliance (CSA) Security Trust Assurance and Risk (STAR) 2. NIST Cloud Computing Reference Architecture and Taxonomy 3. ISO/IEC 17789 Cloud Computing Reference Architecture (CCRA) 4. ISO/IEC 27001 Information Security Management System (ISMS)

ISO/IEC 17789 Cloud Computing Reference Architecture (CCRA) Explanation ISO/IEC 17789 Cloud Computing Reference Architecture (CCRA) is specific to the cloud. It may be used to organize the components of a cloud architecture into a set of common elements using common terminology, and as an international standard it is preferred for a global organization. The NIST Cloud Computing Reference Architecture and Taxonomy is also specific to the cloud and serves a similar purpose, but the ISO/IEC reference architecture is the better answer for a global organization.

Which statements about a Hardware Security Module (HSM) are factual? Select all that apply. One of the major disadvantages of an HSM is that it is a physical device One of the key advantages of an HSM is that it is a physical device An HSM does not provide physical tamper protection An HSM provides cryptographic processing An HSM manages cryptographic keys

One of the major disadvantages of an HSM is that it is a physical device An HSM provides cryptographic processing An HSM manages cryptographic keys Explanation A Hardware Security Module (HSM) is a physical computing device that provides cryptographic processing and manages cryptographic keys. One of its key advantages is its physical tamper protection; the downside is that it is a physical device.

Which of the following is the best example of a global security standard that is relevant to a particular industry? ISO/IEC 27002 PCI DSS NIST Special Publication 800-53 General Data Protection Regulation

PCI DSS Explanation PCI DSS is an example of a global security standard that is relevant to the payment card industry. ISO/IEC 27002 is not specific to a particular industry. It provides guidelines for organizational information security standards, including the selection, implementation, and management of controls. NIST Special Publication 800-53 is a guidance document that includes a security control catalog. It has gone through a number of iterations and was originally intended primarily for U.S. federal government use. Due to wide adoption outside the U.S. government, the guidance has been updated and the term "federal" removed from the title. GDPR is not industry-specific and is a regulation, not a standard. It applies to all organizations that handle data belonging to anyone in the European Union.

A startup company wants to become a SaaS provider. It intends to develop and sell the use of an application while minimizing staff requirements, as well as initial and ongoing expenditures. Which company service model best meets these requirements? 1. SaaS 2. PaaS 3. On-Premises 4. IaaS

PaaS Explanation PaaS allows the consumer to deploy consumer-created or acquired applications onto the cloud infrastructure using programming languages, libraries, services, and tools supported by the provider. This provides the startup company a platform to deploy its application and provide access to consumers through a SaaS service model while minimizing staffing requirements.

Which of the following focuses on defining roles and responsibilities as well as service commitments for protecting privacy information between a service provider and consumer? Privacy Impact Assessment (PIA) General Data Protection Regulation (GDPR) Service Level Agreement (SLA) Privacy Level Agreement (PLA)

Privacy Level Agreement (PLA) Explanation A PLA is similar in concept to an SLA in that it defines roles and responsibilities and clearly defines service commitments for protecting privacy information between a service provider and consumer. An SLA is generally used to capture metrics and various other information relevant to the performance of the service. GDPR is an EU law that aims to protect the rights of data subjects. A PIA establishes a method for evaluating privacy in information systems.

What is the best way to demonstrate to an auditor that a consistent baseline image has been maintained for the past year in an environment where servers are built to a baselined image? Provide the baseline image and the forensically sound logfile of each instance created and the method of ensuring that drift did not occur in production Grant the auditor access to the production environment, along with the baseline configuration specification so that the auditor can determine compliance Provide the baseline image and the forensically sound logfile created for each instance and the change records for the configuration item under audit alongside the method of ensuring that drift did not occur in production Provide snapshots of the current servers and the accompanying configuration requirements, the method of ensuring that drift did not occur in production, and the original image baseline

Provide the baseline image, the forensically sound logfile created for each instance, and the change records for the configuration item under audit, alongside the method of ensuring that drift did not occur in production. Explanation To demonstrate process control sufficiently, the auditor will require proof of a baseline image and additional artifacts such as logs and change records. Assurance would be further enhanced by including a mechanism that guarantees a configuration change away from the desired baseline did not occur.

The company wishes to migrate the second business process, which is the human resources system, to the cloud. It has selected a SaaS cloud service model. When using a SaaS cloud service model, what tool is most effective at ensuring the security of a company's data? 1. Defining the geographic restrictions of where the company's data is authorized to reside 2. Ensuring all company data is encrypted before being loaded into the SaaS application 3. Reviewing the shared security model between CSP and consumer and ensuring that CSP and consumer responsibilities are met prior to using the service 4. Ensuring the company has the right to audit

Review the shared security model between CSP and consumer and ensure that CSP and consumer responsibilities are met before using the service. Explanation A review of the shared security model before using the service and the assurance that the service is implemented correctly are the most effective responses and the most likely to provide comprehensive security protections. CSP responsibilities may be available through a review of third-party audits or other documentation. Properly configuring service elements within the scope of customer control is critical for employing any cloud service.

Your company wants to access its data any time from anywhere. As a result, it has decided to begin migrating from local file servers to cloud storage. You have been tasked with identifying the appropriate controls to apply. While data is resident on cloud storage, which of the following is the best solution to provide protection? 1. Role-based access control 2. Classify the sensitivity of the data 3. Digital signatures 4. Information rights management

Role-based access control Explanation Role-based access control restricts file access to authorized roles, limiting unauthorized access while the data is resident in cloud storage.

Which of the following is a protocol that uses XML to work more efficiently over the internet even though some find it to be cumbersome? SOAP DCOM CORBA REST

SOAP Explanation SOAP is a protocol that uses XML and is considered cumbersome by many developers. REST is not a protocol; it relies on simple URLs instead of XML and is considered efficient and inexpensive. While messaging services using SOAP are implemented with XML, REST frequently relies on JavaScript Object Notation (JSON) and supports other formats. DCOM and CORBA are legacy technologies that rely on binary messaging and do not work well over the internet.
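To illustrate the payload difference, the same message can be serialized both ways with Python's standard library (the field names are invented for the example, and a real SOAP message would additionally wrap the XML in a SOAP envelope):

```python
import json
import xml.etree.ElementTree as ET

# The same message as JSON, the format REST services commonly use...
body = json.dumps({"user": {"id": 7, "name": "Avery"}})

# ...and as XML, the format SOAP mandates (envelope omitted here).
root = ET.Element("user")
ET.SubElement(root, "id").text = "7"
ET.SubElement(root, "name").text = "Avery"
xml_body = ET.tostring(root, encoding="unicode")
```

The XML version carries opening and closing tags for every field, which is part of why SOAP is perceived as the heavier-weight option.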

An organization recently received an audit report from external auditors involving the organization's financial statements. This report was on the fairness of the presentation of management's description of the organization's system and the suitability of the design and operating effectiveness of the controls to achieve the related control objectives included in the description throughout a specified period. What type of report has it received? SOC 2, Type 2 SOC 2, Type 1 SOC 1, Type 1 SOC 1, Type 2

SOC 1, Type 2 Explanation A SOC 1 report focuses on controls at a cloud service provider that are likely to be relevant to an audit of a subscriber's financial statements. However, Type 1 implies that the report is on the fairness of the design of the control as of a specified date. SOC 2 reports do not involve the organization's financial statements.

For the potential client to understand the probability that your department of 50 developers will remain properly compensated and incentivized to continue to support the SECaaS that they wish to consume, what report might they consider? SOC 2, Type 2 SOC 1, Type 1 SOC 2, Type 1 SOC 1, Type 2

SOC 1, Type 2 Explanation SOC 1 is for the review of financial controls. Type 2 reports on the operating effectiveness of those controls over a specified period, not just their design.

Your organization develops security-as-a-service (SECaaS) software that is consumed via your hybrid cloud. You employ 50 developers who practice agile discipline in releasing tools to market. A potential client approaches your organization with the intent to acquire your services. The potential client expects your organization to demonstrate the highest degree of assurance possible regarding the operational effectiveness of your risk mitigation process before they are willing to commit to a contractual agreement. What report would be most appropriate to answer the needs of the potential client? SOC 2, Type 1 SOC 2, Type 2 SOC 1, Type 1 SOC 1, Type 2

SOC 2, Type 2 Explanation System and Organization Controls (SOC) 2, Type 2, is a report on organizational technology security controls.

What report would be good for attracting additional clients yet unknown to your business? SOC 1, Type 1 SOC 5, Type 2 SOC 3 SOC 1, Type 2

SOC 3 Explanation System and Organization Controls (SOC) 3 is an executive summary that can be used as a web seal to advertise a summary opinion of technical controls.

Key threats to data storage in the cloud do not include which of the following? Unauthorized access Liability due to regulatory noncompliance SQL injection vulnerability Improper treatment or sanitization after end of use

SQL injection vulnerability Explanation While it is true that SQL injection can lead to the compromise of data, this is more of a common web application vulnerability regardless of where the data is stored. Certain controls (e.g., encryption) might be required to comply with certain regulations. Not all cloud services enable all relevant data controls. Unauthorized access can happen due to hacking, improper data access permissions being assigned in a multitenant environment, or internal cloud service provider employees accessing data without proper authorization. End-of-use is challenging in cloud computing because the physical destruction of media usually cannot be enforced.

Which of the following best describes legislation that was enacted in the United States to protect shareholders and the general public from accounting errors and fraudulent practices in the enterprise? Sarbanes-Oxley Act (SOX) CLOUD Act Gramm-Leach-Bliley Act (GLBA) Health Insurance Portability and Accountability Act (HIPAA)

Sarbanes-Oxley Act (SOX) Explanation SOX was enacted to protect shareholders and the general public from accounting errors and fraudulent practices. HIPAA sets out the requirements of the U.S. Department of Health and Human Services (HHS) to adopt national standards for electronic healthcare transactions and national identifiers for providers, health plans, and employers. The CLOUD Act gives U.S. law enforcement officials broad powers to force U.S.-based technology providers to release data regardless of where the company stores data. GLBA is a federal law enacted in the United States to control how financial institutions deal with individuals' private information.

What are the Trust Services Principles in a SOC 2 report? Confidentiality, processing integrity, and availability Trust, security, and privacy Security, availability, processing integrity, confidentiality, and privacy Trust and security principles

Security, availability, processing integrity, confidentiality, and privacy Explanation A SOC 2 report on controls relevant to security, availability, processing integrity, confidentiality, and privacy is intended to meet the needs of a broad range of users for detailed information and assurance about the above-mentioned areas.

Which of the following needs to be included in the SLA? 1. Service availability, data access, data location 2. Change management processes, software, data portability 3. Exit strategy, disaster recovery processes, environmental design 4. Network, controls, change management processes

Service availability, data access, data location Explanation Service availability, data access, and data location are core service commitments and should all be defined in the SLA.

Step-up authentication is an additional factor or procedure that validates a user's identity, normally prompted by high-risk transactions or violations according to policy rules. Which of the following is not typically among the common methods used to achieve this? Challenge questions Dynamic knowledge-based authentication (questions unique to the end user) Single sign-on Out-of-band authentication (e.g. authentication app)

Single sign-on Explanation Single sign-on reduces the number of times a user must authenticate rather than adding a factor, so it is not a step-up method. Challenge questions, out-of-band authentication, and dynamic knowledge-based authentication are all among the common methods.
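A step-up decision is ultimately a policy check driven by risk signals. The actions, threshold, and signals in this sketch are hypothetical examples of such a policy, not from any specific product:

```python
# Minimal step-up authentication policy sketch: prompt for an additional
# factor on high-risk transactions or on policy violations.
HIGH_RISK_ACTIONS = {"wire_transfer", "change_password", "export_data"}

def requires_step_up(action: str, amount: float = 0.0,
                     new_device: bool = False) -> bool:
    if action in HIGH_RISK_ACTIONS:
        return True
    if amount > 10_000:      # large-transaction threshold (assumed)
        return True
    return new_device        # unrecognized device triggers step-up

print(requires_step_up("view_balance"))             # False
print(requires_step_up("wire_transfer"))            # True
print(requires_step_up("purchase", amount=25_000))  # True
```

When the check returns True, the application would then invoke one of the common methods the question lists, such as an out-of-band authentication app.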

Which of the following tools' primary objective is to gain visibility into the open-source inventory (open-source components used in a software product) and help automate the process for open-source management? These tools can help with both security and license management. Dynamic Analysis Security Testing (DAST) tools Software Composition Analysis (SCA) tools Interactive Analysis Security Testing (IAST) Tools Static Analysis Security Testing (SAST) tools

Software Composition Analysis (SCA) tools Explanation The objective described in the question is best met using SCA tools. SAST solutions analyze an application from the inside out in a nonrunning state. Performing SAST does not require the software to be running. SAST does not identify security vulnerabilities of the run-time infrastructure. DAST aims to identify potential security vulnerabilities of web applications and their infrastructure. IAST relies on a combination of SAST and DAST to achieve its goal through its visibility into both the source code and the execution flow during the runtime.

Which NIST publication is focused on developing, implementing, and maintaining effective log management practices in the enterprise? Special Publication 800-92 FIPS 199 Special Publication 800-88 ISO/IEC 15408 (Common Criteria)

Special Publication 800-92 Explanation The development, implementation, and maintenance of effective log management practices is the focus of NIST Special Publication 800-92, "Guide to Computer Security Log Management." Common Criteria is not a NIST publication. Although NIST publishes FIPS 199, its focus remains on the security categorization of information and information systems. Special Publication 800-88 is focused on media sanitization.

A database administrator has been tasked with removing sensitive details from a production database to be used in a test environment. This recommendation comes from the security officer due to concerns about data leakage. What technique should be recommended? 1. Dynamic masking 2. Static masking 3. Random substitution 4. Algorithmic substitution

Static masking Explanation In static masking, a new copy of the data is created with the masked values. Static masking is typically efficient when creating clean, nonproduction environments. Dynamic masking is efficient when protecting production environments; in other words, dynamic masking can hide the full credit card number from customer service representatives, but the data remains available for processing. With random substitution, values are randomly replaced, which may not make this the best approach for the described scenario. With algorithmic substitution, values are replaced or appended with algorithm-generated values, which may not make this the best approach for the described scenario.
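The distinction is that static masking produces a new, masked copy of the data set while the production data is left untouched. A minimal sketch, with illustrative field names and a made-up card number:

```python
import copy

# Static masking sketch: build a masked COPY of production records for use
# in a nonproduction (test) environment.
def mask_record(record: dict) -> dict:
    masked = copy.deepcopy(record)
    # Keep only the last four digits of the card number.
    masked["card_number"] = "*" * 12 + record["card_number"][-4:]
    masked["name"] = "REDACTED"
    return masked

production = [{"name": "Alice Doe", "card_number": "4111111111111111"}]
test_copy = [mask_record(r) for r in production]  # new, masked data set

print(test_copy[0]["card_number"])   # ************1111
print(production[0]["card_number"])  # 4111111111111111 (unchanged)
```

Dynamic masking would instead apply a transformation like this at query time, leaving the stored values intact for processing.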

As part of the data classification process for your organization, you must determine the data types that the organization may have in its possession. These data types are: Encrypted, unencrypted, and obfuscated data Databases, documents, and spreadsheets Classified and unclassified data Structured, unstructured, and semi-structured data

Structured, unstructured, and semi-structured data Explanation Structured data types that are centrally managed allow better control of enterprise data. They are often accessed and managed using a Structured Query Language (SQL). Unstructured data does not have a predefined data model or is not organized in a predefined manner. This type of data is typically text-heavy but may also have embedded dates, numbers, and facts. Data may also be semi-structured: not centrally managed in a database, but with some structure to it.

Which of the following best describes tokenization? 1. Using specific characters to hide certain parts of the sensitive data 2. Transforming an input (e.g., message) into a fixed-length value algorithmically would be useful in addressing the integrity of stored and transmitted information 3. Substituting a sensitive data element with a nonsensitive equivalent that is then mapped back to the original data 4. Encoding data from plaintext to ciphertext, hence making the data unintelligible

Substituting a sensitive data element with a nonsensitive equivalent that is then mapped back to the original data Explanation 'Using specific characters to hide certain parts of the sensitive data' best describes data masking. 'Transforming an input (e.g., message) into a fixed-length value algorithmically that would be useful in addressing the integrity of stored and transmitted information' best describes hashing. 'Encoding data from plaintext to ciphertext, hence, making the data unintelligible' best describes encryption.
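The defining feature of tokenization is that the token has no mathematical relationship to the original value; a separate vault maps it back. A minimal sketch, using a plain dictionary as a stand-in for a real token vault:

```python
import secrets

# Tokenization sketch: replace a sensitive value with a random token and
# keep the mapping in a vault (here just a dict, purely illustrative).
_vault = {}

def tokenize(sensitive: str) -> str:
    token = "tok_" + secrets.token_hex(8)  # random; not derived from input
    _vault[token] = sensitive
    return token

def detokenize(token: str) -> str:
    return _vault[token]

token = tokenize("4111111111111111")
print(token)              # e.g. tok_9f2c... (random each run)
print(detokenize(token))  # 4111111111111111
```

Contrast this with hashing (one-way, no mapping back) and encryption (reversible, but the ciphertext is mathematically derived from the plaintext and key).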

A company has decided to categorize or rank its suppliers on some scale to appropriately manage the relationship with each supplier. One of its suppliers supplements other suppliers to manage emerging unforeseen issues and incidents. The company is likely to categorize this supplier as what? Strategic supplier Operational supplier Tactical supplier Commodity supplier

Tactical supplier Explanation Tactical suppliers supplement strategic and commodity suppliers to manage emerging unforeseen issues and incidents. Strategic suppliers are deemed mission-critical and cannot be easily replaced. While companies typically do business with only a few of these partners, they are the most crucial to the success or failure of the enterprise cloud architecture. In contrast to strategic suppliers, commodity suppliers provide goods and services that can easily be replaced and sourced from various suppliers if necessary.

Which of the following is true about object storage? Object storage is most suitable for files that change frequently, such as operating system files Technically, object storage can implement redundancy to improve resilience by dispersing data by fragmenting and duplicating it across multiple object-storage locations The use cases for object storage are files that change often Object storage is a virtual hard drive that can be attached to a VM instance to host data within a file system

Technically, object storage can implement redundancy to improve resilience by dispersing data by fragmenting and duplicating it across multiple object-storage locations Explanation 'Object storage can implement redundancy to improve resilience by dispersing data by fragmenting and duplicating it across multiple object-storage locations' is true. Videos, image archives, and files that do not change frequently represent a good use case for object storage. Volume storage is most suitable for files that change frequently, like operating system files. A virtual hard drive that can be attached to a VM instance to host data within a file system best suits volume storage.
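Fragment-and-duplicate dispersion can be illustrated with a naive sketch. The node layout and replica placement below are made up for illustration; real object stores use more sophisticated schemes such as erasure coding:

```python
# Naive data-dispersion sketch: fragment an object and place each fragment
# on several (hypothetical) storage nodes so no single node loss destroys it.
def disperse(data: bytes, fragment_size: int, copies: int, nodes: int) -> dict:
    fragments = [data[i:i + fragment_size]
                 for i in range(0, len(data), fragment_size)]
    placement = {n: [] for n in range(nodes)}
    for idx, frag in enumerate(fragments):
        for c in range(copies):
            # Spread replicas round-robin across distinct nodes.
            placement[(idx + c) % nodes].append((idx, frag))
    return placement

layout = disperse(b"archival object data", fragment_size=5, copies=2, nodes=4)
# Every fragment lives on two distinct nodes, so losing any single node
# still leaves a complete copy of the object recoverable.
```

This is the resilience property the correct answer describes: redundancy through fragmentation plus duplication across storage locations.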

Which of the following statements is correct about DevOps? SCRUM is an implementation of DevOps Technological aspects of DevOps are primarily about continuous integration/continuous delivery (CI/CD) practices Adoption of DevOps is only possible when using cloud computational infrastructure and resources To adopt DevOps, security practices must be left out of the SDLC

Technological aspects of DevOps are primarily about continuous integration/continuous delivery (CI/CD) practices Explanation Automation of activities in the software development life cycle is a focus of CI/CD for frequent delivery of software.

As part of an audit, an organization may be required to prove the effectiveness of the controls that are part of its cloud service offerings. Which of the following would be more likely to be used as a control framework for this purpose? The Cloud Controls Matrix (CCM) Payment Card Industry Data Security Standard (PCI DSS) ISO/IEC 12207 NIST SP 800-88

The Cloud Controls Matrix (CCM) Explanation The Cloud Controls Matrix (CCM) by Cloud Security Alliance (CSA) is an example of a framework that can be used to determine applicable controls and their effectiveness. This framework's elements are specific to the cloud. PCI DSS is an industry standard developed to enhance payment account data security. It applies to all organizations storing, transmitting, or processing payment card information. ISO/IEC 12207 provides a framework around lifecycle processes. NIST SP 800-88 provides guidelines for media sanitization.

An organization is currently receiving services from a cloud provider that has been performing unsatisfactorily. Because there was a lack of due diligence in matching business requirements to service provisioning, the organization will need to remain with the provider for another 12 months. When the organization reflects on the dissatisfaction and recalls the meetings where there was a disagreement about the service provided versus the service received, the dispute typically involved a system that was running versus a service the organization was unable to consume. Where should the organization's focus be in the SLA? Actually, the contractual agreement is where the problem lies The organization should focus on uptimes that are less than 99.9997% The organization should make a distinction between service interoperability and portability The organization should review uptime and availability

The organization should review uptime and availability Explanation Uptime could mean that a system is running, while availability means a service can be consumed on that system as it runs.

A cloud application consists of two software bundles, often called the server-side stack and the client-side stack. Which of the following are NOT among the major building blocks of a server-side stack in a cloud application? The web server The web browser The database The web framework

The web browser Explanation Web servers, web frameworks, and databases are common building blocks.

'This threat usually pertains to portable storage but may be a consideration when deciding which devices or types of devices are permitted to access data within a cloud service.' The above statement is a definition of which of the following terms? 1. Liability due to regulatory noncompliance 2. Theft or accidental loss of media 3. Unauthorized access 4. Unauthorized usage

Theft or accidental loss of media Explanation Theft or accidental loss of media is a threat that usually pertains to portable storage but may be a consideration when deciding which devices or types of devices are permitted to access data within a cloud service.

When thinking about MITRE ATT&CK, what is resource development? These are actions taken by the attacker that are unique or more significant to mobile device architectures and mobile systems This is specific to ICS, in which out-of-limits conditions, equipment or software failures, hazards, or other issues require immediate intervention by safety, quality assurance, asset protection, or human operator control activities This is unique to the enterprise layer and an indicator of the growth in operational, technical, and financial sophistication and capability that many APTs demonstrate These are techniques that attackers use to gain ongoing access to systems, endpoints, accounts, and other assets within the target's systems and infrastructure

This is unique to the enterprise layer and an indicator of the growth in operational, technical, and financial sophistication and capability that many APTs demonstrate Explanation Resource development is unique to the enterprise layer and an indicator of the growth in operational, technical, and financial sophistication and capability that many APTs demonstrate. In the ATT&CK framework, Resource Development covers techniques in which adversaries establish resources to support operations, such as acquiring infrastructure, building or purchasing capabilities, and compromising accounts.

What is the main objective of network function virtualization (NFV)? Expand functions such as firewall management, intrusion detection, network address translation, and name service resolution using specific hardware implementation Limit functions such as firewall management, intrusion detection, network address translation, and name service resolution utilizing specific hardware implementation To couple functions such as firewall management, intrusion detection, network address translation, and name service resolution with specific hardware implementation To decouple functions such as firewall management, intrusion detection, network address translation, and name service resolution from specific hardware implementations

To decouple functions such as firewall management, intrusion detection, network address translation, and name service resolution from specific hardware implementations Explanation The objective of NFV is decoupling functions such as firewall management, intrusion detection, network address translation, and name service resolution from specific hardware implementations. NFV's focus is to optimize distinct network services.

Tort law serves four objectives. Which of the following is not one of those objectives? To define conduct prohibited by the government and to protect the safety and well-being of the public To discourage injurious, careless, and risky behavior in the future To compensate victims for injuries suffered by the culpable action or inaction of others To shift the cost of injuries to the person or persons who are legally responsible for inflicting them

To define conduct prohibited by the government and to protect the safety and well-being of the public Explanation Criminal law defines conduct prohibited by the government and protects the safety and well-being of the public.

Several cloud customers were affected by a data breach from a cloud service provider (CSP), and malicious actors have used their credit card details to commit fraud. A court of law has identified a lack of due care and due diligence by the CSP. The affected customers seek remedies in a class action against the CSP. What type of law applies in this case? GDPR Tort law The CLOUD Act Criminal law

Tort law Explanation Tort law encompasses a body of rights, obligations, and remedies that set out reliefs for persons suffering harm because of the wrongful acts of others. Cases are built upon a preponderance of evidence of damage. Negligence or lack of due care and due diligence, which a reasonable person would show, may also be a factor.

What is the difference between a Type 1 and a Type 2 SOC report? Type 1 is developed over time, and Type 2 is a snapshot Type 1 is concerned with control design, and Type 2 is concerned with control effectiveness There are no Type 1 or 2 reports Type 1 is longer than Type 2

Type 1 is concerned with control design, and Type 2 is concerned with control effectiveness Explanation Type 1 is concerned with control design and implementation. Type 2 is concerned with control effectiveness.

Which of the following terms is installed after a traditional operating system and supports other guest operating systems running above it as VMs and is completely dependent on the host operating system for its operations? Snapshot Trusted Platform Module (TPM) Type 1 Hypervisor Type 2 Hypervisor

Type 2 Hypervisor Explanation Type 2 hypervisor is installed after a traditional operating system and supports other guest operating systems running above it as VMs. Trusted Platform Module (TPM) is an isolated and separate compute and storage unit that enables trust in computing platforms by employing security and privacy techniques using cryptography. Snapshot is the state of an instance at a point in time. Type 1 hypervisor is commonly known as a bare metal, embedded, or native hypervisor.

Application programming interfaces (APIs) are a means for exposing functionality to applications. Which of the following is not considered a benefit of APIs? Typically use a publicly accessible IP address Programmatic control and access Automation Integration with third-party tools

Typically use a publicly accessible IP address Explanation While using a publicly accessible IP address makes authorized access easier, it also makes unauthorized access easier and potentially expands the attack surface.

The company is moving an application and corresponding data to the cloud. Which of the following is the most important activity from a risk management perspective? Define how authorization and authentication reduce spoofing, tampering and repudiation risks Understand how the cloud provider environment adds new risk to application continuity Define the extent to which the cloud provider is responsible for the security of the application code Define how API tokens will be rotated

Understand how the cloud provider environment adds new risk to application continuity Explanation The cloud provider environment adds multitenancy and third-party admins as new risks.

Two of the most important aspects of cloud for the consumer to understand are the legal and compliance requirements, including how each is addressed by the cloud environment. Which of the following is a requirement directly related to legal and compliance? Understand the requirements related to addressing risk assessment and risk management Understand the potential personal and data privacy issues specific to personally identifiable information (PII) within the cloud environment Understand the forensic challenges related to cloud environments Understand the requirements of data access within the different service models

Understand the potential personal and data privacy issues specific to personally identifiable information (PII) within the cloud environment Explanation Personally identifiable information (PII) data has legal and compliance implications throughout the world.

'Aligns to requirements, describes user-focused scenarios.' The above statement best describes which of the following statements? Functional Testing Use Case Testing Abuse Case Testing Nonfunctional Testing

Use Case Testing Explanation Use Case Testing aligns with requirements, describing user-focused scenarios that represent what a system does when the system is used in the ways in which it was designed to be used.

Black-box testing of software applications can be best described as: Validating the application under test against its requirements considering the inputs and expected outputs, regardless of how the inputs are transformed into outputs Testing the application while remaining most concerned with validating how the business logic of the application is implemented Shifting the responsibility for application security from developers to the quality assurance (QA) team Analyzing an application from the inside out in a nonrunning state

Validating the application under test against its requirements considering the inputs and expected outputs, regardless of how the inputs are transformed into outputs Explanation Black-box testing remains focused on testing the behavior of the application under test.
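Black-box testing can be illustrated with a small requirements-driven test: the tester supplies inputs and checks outputs against the specification, with no reference to the internals. The shipping-fee calculator and its rates below are entirely hypothetical:

```python
# Hypothetical function under test: a flat fee up to 1 kg, then a per-kg
# surcharge. The implementation is irrelevant to the black-box tests below.
def shipping_fee(weight_kg: float) -> float:
    return 5.0 if weight_kg <= 1 else 5.0 + 2.0 * (weight_kg - 1)

# Requirement-driven cases: (input, expected output) pairs only.
cases = [(0.5, 5.0), (1.0, 5.0), (3.0, 9.0)]

for weight, expected in cases:
    actual = shipping_fee(weight)
    assert actual == expected, f"{weight} kg: expected {expected}, got {actual}"
print("all black-box cases passed")
```

Because the cases are derived from requirements rather than from the code, the same test suite would remain valid if the implementation were rewritten.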

Which of the following is not true about vendor lock-in? Vendor lock-in risk may vary from service to service as well as from provider to provider One risk of vendor lock-in is that vendor services might fail to meet changing business needs Vendor lock-in can occur for financial reasons

Vendor lock-in risk may vary from service to service as well as from provider to provider Explanation Vendor lock-in can occur for functional reasons; services provided may not be available elsewhere.

An organization has acquired several companies that have archaic applications critical to revenue generation. The Board has decided that these applications must be supported for the near future. In many cases, these applications have hardcoded references to disk partitions and logical unit numbers (LUNs). As the technology team begins to formulate an architecture for hosting storage generated from these applications in the cloud, they must consider the appropriate storage type. What is the most appropriate type of storage to select? Tier 1 Object Volume Tier 2

Volume Explanation With volume storage, data is written to committed blocks, and volumes attached to IaaS instances behave just like a physical drive or an array does. This makes volume storage the best fit for legacy applications that hardcode references to disk partitions and LUNs.

The Health Insurance Portability and Accountability Act (HIPAA) breach notification rule requires HIPAA-covered entities and their business associates to provide notification following a breach of unsecured protected health information. Individual notifications must be provided: Without unreasonable delay, and in no case later than 60 days following the discovery of a breach Without unreasonable delay and in no case later than 72 hours following the discovery of a breach Notifications are not needed Without unreasonable delay, but at the discretion of the breached party within one year

Without unreasonable delay, and in no case later than 60 days following the discovery of a breach Explanation Individual notifications must be provided without unreasonable delay and in no case later than 60 days following the discovery of a breach. To the extent possible, they must include a brief description of the breach; a description of the types of information involved; the steps affected individuals should take to protect themselves from potential harm; a brief description of what the covered entity is doing to investigate the breach, mitigate the harm, and prevent further breaches; and contact information for the covered entity (or business associate, as applicable).

Which of the following is not true about Extreme Programming (XP)? XP is best described as a variation of the Waterfall model XP promotes collective ownership XP is a lightweight methodology best suited for developing software when the requirements are vague or tend to change frequently XP practices include Pair Programming

XP is best described as a variation of the Waterfall model Explanation XP is an Agile implementation and well suited for situations where requirements are vague or tend to change frequently. Pair Programming is an XP practice. XP promotes collective ownership and places the responsibility for the whole system on the team as opposed to individuals.
