GCP Security


Cymbal Bank uses Docker containers to interact with APIs for its personal banking application. These APIs are under PCI-DSS compliance. The Kubernetes environment running the containers will not have internet access to download required packages. How would you automate the pipeline that builds these containers?

- Create a Dockerfile with a container definition and a Cloud Build configuration file.
- Use the Cloud Build configuration file to build the image from the Dockerfile and push it to Google Container Registry.
- In the configuration file, include the Google Container Registry path and the Google Kubernetes Engine cluster.
- Upload the configuration file to a Git repository.
- Create a trigger in Cloud Build to automate the deployment using the Git repository.

Cloud Build can build a container image from a Dockerfile and upload it to Google Container Registry (GCR). You can then create a configuration file that maps the container image path in GCR to the Google Kubernetes Engine (GKE) cluster. Upload this configuration file to a Git repository and use it to create a Cloud Build trigger. Whenever you push a new build and source to the Git repository, Cloud Build will update the GKE cluster.
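The steps above can be sketched as a Cloud Build configuration file. This is an illustrative sketch only: the image name `banking-api`, the manifest directory `k8s/`, the cluster name, and the zone are assumed placeholders, not values from the scenario.

```yaml
steps:
# Build the container image from the Dockerfile in the repository root.
- name: 'gcr.io/cloud-builders/docker'
  args: ['build', '-t', 'gcr.io/$PROJECT_ID/banking-api:$COMMIT_SHA', '.']
# Push the image to Google Container Registry.
- name: 'gcr.io/cloud-builders/docker'
  args: ['push', 'gcr.io/$PROJECT_ID/banking-api:$COMMIT_SHA']
# Deploy the pushed image to the GKE cluster named in the configuration.
- name: 'gcr.io/cloud-builders/gke-deploy'
  args:
  - run
  - --filename=k8s/
  - --image=gcr.io/$PROJECT_ID/banking-api:$COMMIT_SHA
  - --cluster=banking-cluster
  - --location=us-central1-a
images:
- 'gcr.io/$PROJECT_ID/banking-api:$COMMIT_SHA'
```

Committing this file to the Git repository and pointing a Cloud Build trigger at it completes the automation: each push rebuilds the image and updates the cluster.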

Cymbal Bank uses Compute Engine instances for its APIs, and recently discovered bitcoin mining activities on some instances. The bank wants to detect all future mining attempts and notify the security team. The security team can view the Security Command Center and Cloud Audit Logs. How should you configure the detection and notification?

Enable Anomaly Detection in the Security Command Center. Create and configure a Pub/Sub topic and an email service. Create a Cloud Function to send email notifications for suspect activities. Export findings to a Pub/Sub topic, and use them to invoke the Cloud Function.

You should enable Anomaly Detection in the Security Command Center, use a Pub/Sub topic to export the findings, and use a Cloud Function to send out the notifications.

Cymbal Bank plans to launch a new public website where customers can pay their equated monthly installments (EMI) using credit cards. You need to build a secure payment processing solution on Google Cloud that follows the PCI-DSS isolation requirements. How would you architect a secure payment processing environment with Google Cloud services to follow PCI-DSS? Select the two correct choices.

1. Create a new Google Cloud project with restricted access (separate from the production environment) for the payment processing solution. Configure firewall rules, a VPN tunnel, and an HTTP(S) load balancer for a new Compute Engine instance. You need an isolated Linux-based (Compute Engine) environment, separate from your production environment, per the PCI-DSS isolation requirements for payment solutions.
2. Deploy a Linux base image from the preconfigured operating system images. Install only the libraries you need. Deploy using Terraform. A minimalist operating system with only the libraries your application requires limits the attack surface. Use Terraform to ensure that only the intended deployment happens, without interruption.

An organization adopts Google Cloud Platform (GCP) for application hosting services and needs guidance on setting up password requirements for their Cloud Identity account. The organization has a password policy requirement that corporate employee passwords must have a minimum number of characters. Which Cloud Identity password guidelines can the organization use to inform their new requirements? A. Set the minimum length for passwords to be 8 characters. B. Set the minimum length for passwords to be 10 characters. C. Set the minimum length for passwords to be 12 characters. D. Set the minimum length for passwords to be 6 characters.

A

An organization is starting to move its infrastructure from its on-premises environment to Google Cloud Platform (GCP). The first step the organization wants to take is to migrate its current data backup and disaster recovery solutions to GCP for later analysis. The organization's production environment will remain on-premises for an indefinite time. The organization wants a scalable and cost-efficient solution. Which GCP solution should the organization use? A. BigQuery using a data pipeline job with continuous updates B. Cloud Storage using a scheduled task and gsutil C. Compute Engine Virtual Machines using Persistent Disk D. Cloud Datastore using regularly scheduled batch upload jobs

B. Cloud Storage with scheduled gsutil uploads is the scalable, cost-efficient option for storing backup and disaster recovery data for later analysis.

You are creating an internal App Engine application that needs to access a user's Google Drive on the user's behalf. Your company does not want to rely on the current user's credentials. It also wants to follow Google-recommended practices. What should you do? A. Create a new Service account, and give all application users the role of Service Account User. B. Create a new Service account, and add all application users to a Google Group. Give this group the role of Service Account User. C. Use a dedicated G Suite Admin account, and authenticate the application's operations with these G Suite credentials. D. Create a new service account, and grant it G Suite domain-wide delegation. Have the application use it to impersonate the user.

D. Domain-wide delegation allows the service account to impersonate users in the domain, which is the Google-recommended way to access user data (such as Drive files) without relying on the user's own credentials.

You need to follow Google-recommended practices to leverage envelope encryption and encrypt data at the application layer. What should you do? A. Generate a data encryption key (DEK) locally to encrypt the data, and generate a new key encryption key (KEK) in Cloud KMS to encrypt the DEK. Store both the encrypted data and the encrypted DEK. B. Generate a data encryption key (DEK) locally to encrypt the data, and generate a new key encryption key (KEK) in Cloud KMS to encrypt the DEK. Store both the encrypted data and the KEK. C. Generate a new data encryption key (DEK) in Cloud KMS to encrypt the data, and generate a key encryption key (KEK) locally to encrypt the key. Store both the encrypted data and the encrypted DEK. D. Generate a new data encryption key (DEK) in Cloud KMS to encrypt the data, and generate a key encryption key (KEK) locally to encrypt the key. Store both the encrypted data and the KEK.

A
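The envelope encryption flow in option A can be illustrated with a small, self-contained sketch. The cipher here is a deliberately toy XOR keystream so the example runs with only the standard library; real envelope encryption uses AES locally and Cloud KMS's encrypt call to wrap the DEK, and the function names below are illustrative, not a real API.

```python
import hashlib
import secrets

def _keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy stream cipher: XOR with a SHA-256-derived keystream.
    Illustrative only -- not secure, and not what KMS actually does."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        out.extend(block)
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

def encrypt_envelope(kek: bytes, plaintext: bytes):
    dek = secrets.token_bytes(32)                 # DEK generated locally
    ciphertext = _keystream_xor(dek, plaintext)   # data encrypted with the DEK
    wrapped_dek = _keystream_xor(kek, dek)        # DEK wrapped by the KEK
    # Only the ciphertext and the *wrapped* DEK are stored;
    # the KEK itself never leaves Cloud KMS.
    return ciphertext, wrapped_dek

def decrypt_envelope(kek: bytes, ciphertext: bytes, wrapped_dek: bytes) -> bytes:
    dek = _keystream_xor(kek, wrapped_dek)        # unwrap the DEK
    return _keystream_xor(dek, ciphertext)        # decrypt the data
```

The point the answer hinges on is visible in `encrypt_envelope`: what gets stored is the encrypted data plus the encrypted DEK, never the KEK.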

Your company is using Cloud Dataproc for its Spark and Hadoop jobs. You want to be able to create, rotate, and destroy symmetric encryption keys used for the persistent disks used by Cloud Dataproc. Keys can be stored in the cloud. What should you do? A. Use the Cloud Key Management Service to manage the data encryption key (DEK). B. Use the Cloud Key Management Service to manage the key encryption key (KEK). C. Use customer-supplied encryption keys to manage the data encryption key (DEK). D. Use customer-supplied encryption keys to manage the key encryption key (KEK).

B. With customer-managed encryption keys, Cloud KMS manages the key encryption key (KEK) that wraps the Google-generated DEKs; this lets you create, rotate, and destroy the key while it remains stored in the cloud.

A customer implements Cloud Identity-Aware Proxy for their ERP system hosted on Compute Engine. Their security team wants to add a security layer so that the ERP systems only accept traffic from Cloud Identity-Aware Proxy. What should the customer do to meet these requirements? A. Make sure that the ERP system can validate the JWT assertion in the HTTP requests. B. Make sure that the ERP system can validate the identity headers in the HTTP requests. C. Make sure that the ERP system can validate the x-forwarded-for headers in the HTTP requests. D. Make sure that the ERP system can validate the user's unique identifier headers in the HTTP requests.

A. Use cryptographic verification. If there is a risk of IAP being turned off or bypassed, your app can check that the identity information it receives is valid. IAP adds a third web request header, X-Goog-IAP-JWT-Assertion, whose value is a cryptographically signed object that also contains the user identity data. Your application can verify the digital signature and use the data in this object to be certain that it was provided by IAP without alteration. https://cloud.google.com/iap/docs/signed-headers-howto
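The verification pattern can be sketched as follows. For brevity this toy signs with HMAC-SHA256 instead of the ES256 signature IAP actually uses (in production you would fetch Google's public keys and verify with a library such as google-auth); the claims and function names here are illustrative.

```python
import base64
import hashlib
import hmac
import json

def _b64url(data: bytes) -> bytes:
    return base64.urlsafe_b64encode(data).rstrip(b"=")

def _b64url_decode(data: bytes) -> bytes:
    return base64.urlsafe_b64decode(data + b"=" * (-len(data) % 4))

def sign_assertion(claims: dict, key: bytes) -> bytes:
    """Produce a toy JWT (header.payload.signature) for demonstration."""
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = _b64url(json.dumps(claims).encode())
    signing_input = header + b"." + payload
    sig = _b64url(hmac.new(key, signing_input, hashlib.sha256).digest())
    return signing_input + b"." + sig

def verify_assertion(token: bytes, key: bytes, expected_aud: str) -> dict:
    """Verify the signature, then check the audience before trusting claims."""
    header, payload, sig = token.split(b".")
    signing_input = header + b"." + payload
    expected = _b64url(hmac.new(key, signing_input, hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):
        raise ValueError("bad signature")
    claims = json.loads(_b64url_decode(payload))
    if claims.get("aud") != expected_aud:
        raise ValueError("unexpected audience")
    return claims
```

The essential habit is the same as with the real header: never trust the identity claims until the signature and audience have both been checked.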

A customer needs to launch a 3-tier internal web application on Google Cloud Platform (GCP). The customer's internal compliance requirements dictate that end-user access may only be allowed if the traffic seems to originate from a specific known good CIDR. The customer accepts the risk that their application will only have SYN flood DDoS protection. They want to use GCP's native SYN flood protection. Which product should be used to meet these requirements? A. Cloud Armor B. VPC Firewall Rules C. Cloud Identity and Access Management D. Cloud CDN

B. Cloud Armor requires an external HTTP(S) load balancer, which does not fit an internal application; VPC firewall rules can restrict traffic to the known good CIDR, and GCP's infrastructure provides SYN flood protection natively.

Your team needs to configure their Google Cloud Platform (GCP) environment so they can centralize the control over networking resources like firewall rules, subnets, and routes. They also have an on-premises environment where resources need access back to the GCP resources through a private VPN connection. The networking resources will need to be controlled by the network security team. Which type of networking design should your team use to meet these requirements? A. Shared VPC Network with a host project and service projects B. Grant Compute Admin role to the networking team for each engineering project C. VPC peering between all engineering projects using a hub and spoke model D. Cloud VPN Gateway between all engineering projects using a hub and spoke model

A https://cloud.google.com/docs/enterprise/best-practices-for-enterprise-organizations#centralize_network_control

A customer's company has multiple business units. Each business unit operates independently, and each has their own engineering group. Your team wants visibility into all projects created within the company and wants to organize their Google Cloud Platform (GCP) projects based on different business units. Each business unit also requires separate sets of IAM permissions. Which strategy should you use to meet these needs? A. Create an organization node, and assign folders for each business unit. B. Establish standalone projects for each business unit, using gmail.com accounts. C. Assign GCP resources in a project, with a label identifying which business unit owns the resource. D. Assign GCP resources in a VPC for each business unit to separate network access.

A https://cloud.google.com/resource-manager/docs/listing-all-resources Also: https://wideops.com/mapping-your-organization-with-the-google-cloud-platform-resource-hierarchy/

Your team needs to make sure that a Compute Engine instance does not have access to the internet or to any Google APIs or services. Which two settings must remain disabled to meet these requirements? (Choose two.) A. Public IP B. IP Forwarding C. Private Google Access D. Static routes E. IAM Network User Role

A C

Which two implied firewall rules are defined on a VPC network? (Choose two.) A. A rule that allows all outbound connections B. A rule that denies all inbound connections C. A rule that blocks all inbound port 25 connections D. A rule that blocks all outbound connections E. A rule that allows all inbound port 80 connections

A B

In order to meet PCI DSS requirements, a customer wants to ensure that all outbound traffic is authorized. Which two cloud offerings meet this requirement without additional compensating controls? (Choose two.) A. App Engine B. Cloud Functions C. Compute Engine D. Google Kubernetes Engine E. Cloud Storage

C D. Outbound traffic can be authorized with egress firewall rules on Compute Engine and Google Kubernetes Engine; App Engine and Cloud Functions would require additional compensating controls to restrict outbound traffic. https://cloud.google.com/solutions/pci-dss-compliance-in-gcp

A company is running workloads in a dedicated server room. They must only be accessed from within the private company network. You need to connect to these workloads from Compute Engine instances within a Google Cloud Platform project. Which two approaches can you take to meet the requirements? (Choose two.) A. Configure the project with Cloud VPN. B. Configure the project with Shared VPC. C. Configure the project with Cloud Interconnect. D. Configure the project with VPC peering. E. Configure all Compute Engine instances with Private Access.

AC

The data from Cymbal Bank's loan applicants resides in a shared VPC. A credit analysis team uses a CRM tool hosted in the App Engine standard environment. You need to provide credit analysts with access to this data. You want the charges to be incurred by the credit analysis team. What should you do?

Add ingress firewall rules to allow the NAT and health check ranges for the App Engine standard environment in the Shared VPC network. Create a client-side connector in the service project using the Shared VPC project ID. Verify that the connector is in a READY state. Create an ingress rule on the Shared VPC network to allow the connector using network tags or IP ranges.

App Engine uses a fixed set of NAT and health check IP address ranges that must be permitted into the VPC. Because the charges must be incurred by the credit analysis team, you need to create the connector in their service project.

Cymbal Bank's organizational hierarchy divides the Organization into departments. The Engineering Department has a 'product team' folder. This folder contains folders for each of the bank's products. Each product folder contains one Google Cloud Project, but more may be added. Each project contains an App Engine deployment. Cymbal Bank has hired a new technical product manager and a new web developer. The technical product manager must be able to interact with and manage all services in projects that roll up to the Engineering Department folder. The web developer needs read-only access to App Engine configurations and settings for a specific product. What should you do?

Assign the Project Editor role at the Engineering Department folder level to the technical product manager. Assign the App Engine Deployer role at the specific product's folder level to the web developer.

Because the technical product manager must be able to work with services across all projects, you should provide permissions at the Department folder level. The web developer should only be able to administer App Engine deployments in their product folder.

Cymbal Bank's organizational hierarchy divides the Organization into departments. The Engineering Department has a 'product team' folder. This folder contains folders for each of the bank's products. One folder titled 'analytics' contains a Google Cloud Project that contains an App Engine deployment and a Cloud SQL instance. A team needs specific access to this project. The team lead needs full administrative access to App Engine and Cloud SQL. A developer must be able to configure and manage all aspects of App Engine deployments. There is also a code reviewer who may periodically review the deployed App Engine source code without making any changes. What should you do?

Assign the predefined 'App Engine Admin' and 'Cloud SQL Admin' roles to the team lead. Assign the 'App Engine Admin' role to the developer. Assign the 'App Engine Code Viewer' role to the code reviewer. Assign all these roles at the analytics project level.

The team lead needs full access to the App Engine and Cloud SQL services. The developer needs to administer App Engine deployments. The 'App Engine Code Viewer' role allows the code reviewer to access the deployed source code without making changes.

A customer needs an alternative to storing their plain text secrets in their source-code management (SCM) system. How should the customer achieve this using Google Cloud Platform? A. Use Cloud Source Repositories, and store secrets in Cloud SQL. B. Encrypt the secrets with a Customer-Managed Encryption Key (CMEK), and store them in Cloud Storage. C. Run the Cloud Data Loss Prevention API to scan the secrets, and store them in Cloud SQL. D. Deploy the SCM to a Compute Engine VM with local SSDs, and enable preemptible VMs.

B

A customer wants to move their sensitive workloads to a Compute Engine-based cluster using Managed Instance Groups (MIGs). The jobs are bursty and must be completed quickly. They have a requirement to be able to control the key lifecycle. Which boot disk encryption solution should you use on the cluster to meet this customer's requirements? A. Customer-supplied encryption keys (CSEK) B. Customer-managed encryption keys (CMEK) using Cloud Key Management Service (KMS) C. Encryption by default D. Pre-encrypting files before transferring to Google Cloud Platform (GCP) for analysis

B

An employer wants to track how bonus compensations have changed over time to identify employee outliers and correct earning disparities. This task must be performed without exposing the sensitive compensation data for any individual and must be reversible to identify the outlier. Which Cloud Data Loss Prevention API technique should you use to accomplish this? A. Generalization B. Redaction C. CryptoHashConfig D. CryptoReplaceFfxFpeConfig

D. Redaction is not reversible; CryptoReplaceFfxFpeConfig performs format-preserving encryption, which can be reversed to re-identify the outlier.
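Among the listed techniques, CryptoReplaceFfxFpeConfig is the one that is both format-preserving and reversible. The sketch below imitates that property with a keyed additive shift over digit strings; it is illustrative only, since the real transform is FF1 (FFX-mode AES), and the function names are made up for this example.

```python
import hashlib
import hmac

def tokenize(value: str, key: bytes) -> str:
    """Toy format-preserving, reversible pseudonymization for digit strings.
    The keyed offset depends only on the length here, which is far weaker
    than FF1 -- this just demonstrates the reversible, format-preserving idea."""
    if not value.isdigit():
        raise ValueError("digits only in this sketch")
    modulus = 10 ** len(value)
    offset = int.from_bytes(
        hmac.new(key, str(len(value)).encode(), hashlib.sha256).digest(),
        "big") % modulus
    return str((int(value) + offset) % modulus).zfill(len(value))

def detokenize(token: str, key: bytes) -> str:
    """Reverse the shift with the same key -- the property redaction lacks."""
    modulus = 10 ** len(token)
    offset = int.from_bytes(
        hmac.new(key, str(len(token)).encode(), hashlib.sha256).digest(),
        "big") % modulus
    return str((int(token) - offset) % modulus).zfill(len(token))
```

The output keeps the length and character class of the input (so downstream analytics still work), and anyone holding the key can recover the original value to identify the outlier.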

Your team needs to obtain a unified log view of all development cloud projects in your SIEM. The development projects are under the NONPROD organization folder with the test and pre-production projects. The development projects share the ABC-BILLING billing account with the rest of the organization. Which logging export strategy should you use to meet the requirements? A. 1. Export logs to a Cloud Pub/Sub topic with folders/NONPROD parent and includeChildren property set to True in a dedicated SIEM project. 2. Subscribe SIEM to the topic. B. 1. Create a Cloud Storage sink with billingAccounts/ABC-BILLING parent and includeChildren property set to False in a dedicated SIEM project. 2. Process Cloud Storage objects in SIEM. C. 1. Export logs in each dev project to a Cloud Pub/Sub topic in a dedicated SIEM project. 2. Subscribe SIEM to the topic. D. 1. Create a Cloud Storage sink with a publicly shared Cloud Storage bucket in each project. 2. Process Cloud Storage objects in SIEM.

C. A folder-level sink on NONPROD with includeChildren would also capture the test and pre-production logs, and a billing-account sink captures the whole organization; exporting each development project's logs to a Pub/Sub topic in a dedicated SIEM project captures only the development projects.

Your team wants to centrally manage GCP IAM permissions from their on-premises Active Directory Service. Your team wants to manage permissions by AD group membership. What should your team do to meet these requirements? A. Set up Cloud Directory Sync to sync groups, and set IAM permissions on the groups. B. Set up SAML 2.0 Single Sign-On (SSO), and assign IAM permissions to the groups. C. Use the Cloud Identity and Access Management API to create groups and IAM permissions from Active Directory. D. Use the Admin SDK to create groups and assign IAM permissions from Active Directory.

A. To keep using the existing identity management system, identities need to be synchronized between AD and GCP IAM. Google provides a tool called Cloud Directory Sync that reads the identities and groups in AD and replicates them within GCP. Once the groups have been replicated, IAM permissions can be applied to them. After that, configure SAML so that Google acts as the service provider and AD FS (or a third-party tool like Ping or Okta) acts as the identity provider; this effectively delegates authentication from Google to something under your control.

A company migrated their entire data center to Google Cloud Platform. It is running thousands of instances across multiple projects managed by different departments. You want to have a historical record of what was running in Google Cloud Platform at any point in time. What should you do? A. Use Resource Manager on the organization level. B. Use Forseti Security to automate inventory snapshots. C. Use Stackdriver to create a dashboard across all projects. D. Use Security Command Center to view all assets across the organization.

B. Forseti Security can take scheduled inventory snapshots of resources across the organization, giving you a historical record of what was running at any point in time; Security Command Center shows current assets rather than arbitrary points in history.

An organization's typical network and security review consists of analyzing application transit routes, request handling, and firewall rules. They want to enable their developer teams to deploy new applications without the overhead of this full review. How should you advise this organization? A. Use Forseti with Firewall filters to catch any unwanted configurations in production. B. Mandate use of infrastructure as code and provide static analysis in the CI/CD pipelines to enforce policies. C. Route all VPC traffic through customer-managed routers to detect malicious patterns in production. D. All production applications will run on-premises. Allow developers free rein in GCP as their dev and QA platforms.

B https://www.terraform.io/use-cases/enforce-policy-as-code

When creating a secure container image, which two items should you incorporate into the build if possible? (Choose two.) A. Ensure that the app does not run as PID 1. B. Package a single app as a container. C. Remove any unnecessary tools not needed by the app. D. Use public container images as a base image for the app. E. Use many container image layers to hide sensitive information.

BC

A customer's data science group wants to use Google Cloud Platform (GCP) for their analytics workloads. Company policy dictates that all data must be company-owned and all user authentications must go through their own Security Assertion Markup Language (SAML) 2.0 Identity Provider (IdP). The Infrastructure Operations Systems Engineer was trying to set up Cloud Identity for the customer and realized that their domain was already being used by G Suite. How should you best advise the Systems Engineer to proceed with the least disruption? A. Contact Google Support and initiate the Domain Contestation Process to use the domain name in your new Cloud Identity domain. B. Register a new domain name, and use that for the new Cloud Identity domain. C. Ask Google to provision the data science manager's account as a Super Administrator in the existing domain. D. Ask customer's management to discover any other uses of Google managed services, and work with the existing Super Administrator.

D. The domain is already managed in G Suite, so the path of least disruption is to find any other uses of Google-managed services and work with the existing Super Administrator rather than contest the domain or register a new one.

A website design company recently migrated all customer sites to App Engine. Some sites are still in progress and should only be visible to customers and company employees from any location. Which solution will restrict access to the in-progress sites? A. Upload an .htaccess file containing the customer and employee user accounts to App Engine. B. Create an App Engine firewall rule that allows access from the customer and employee networks and denies all other traffic. C. Enable Cloud Identity-Aware Proxy (IAP), and allow access to a Google Group that contains the customer and employee user accounts. D. Use Cloud VPN to create a VPN connection between the relevant on-premises networks and the company's GCP Virtual Private Cloud (VPC) network.

C

An application running on a Compute Engine instance needs to read data from a Cloud Storage bucket. Your team does not allow Cloud Storage buckets to be globally readable and wants to ensure the principle of least privilege. Which option meets the requirement of your team? A. Create a Cloud Storage ACL that allows read-only access from the Compute Engine instance's IP address and allows the application to read from the bucket without credentials. B. Use a service account with read-only access to the Cloud Storage bucket, and store the credentials to the service account in the config of the application on the Compute Engine instance. C. Use a service account with read-only access to the Cloud Storage bucket to retrieve the credentials from the instance metadata. D. Encrypt the data in the Cloud Storage bucket using Cloud KMS, and allow the application to decrypt the data with the KMS key.

C

When working with agents in the support center via online chat, your organization's customers often share pictures of their documents with personally identifiable information (PII). Your leadership team is concerned that this PII is being stored as part of the regular chat logs, which are reviewed by internal or external analysts for customer service trends. You want to resolve this concern while still maintaining data utility. What should you do? A. Use Cloud Key Management Service to encrypt PII shared by customers before storing it for analysis. B. Use Object Lifecycle Management to make sure that all chat records containing PII are discarded and not saved for analysis. C. Use the image inspection and redaction actions of the DLP API to redact PII from the images before storing them for analysis. D. Use the generalization and bucketing actions of the DLP API solution to redact PII from the texts before storing them for analysis.

C

You are a member of the security team at an organization. Your team has a single GCP project with credit card payment processing systems alongside web applications and data processing systems. You want to reduce the scope of systems subject to PCI audit standards. What should you do? A. Use multi-factor authentication for admin access to the web application. B. Use only applications certified compliant with PA-DSS. C. Move the cardholder data environment into a separate GCP project. D. Use VPN for all connections between your office and cloud environments.

C

An organization is migrating from their current on-premises productivity software systems to G Suite. Some network security controls were in place that were mandated by a regulatory body in their region for their previous on-premises system. The organization's risk team wants to ensure that network security controls are maintained and effective in G Suite. A security architect supporting this migration has been asked to ensure that network security controls are in place as part of the new shared responsibility model between the organization and Google Cloud. What solution would help meet the requirements? A. Ensure that firewall rules are in place to meet the required controls. B. Set up Cloud Armor to ensure that network security controls can be managed for G Suite. C. Network security is a built-in solution and Google's Cloud responsibility for SaaS products like G Suite. D. Set up an array of Virtual Private Cloud (VPC) networks to control network security as mandated by the relevant regulation.

C. G Suite is a SaaS application. Under the shared responsibility model, Google is responsible for "security of the cloud": protecting the infrastructure that runs all the services offered in Google Cloud, which is composed of the hardware, software, networking, and facilities that run those services.

A customer needs to prevent attackers from hijacking their domain/IP and redirecting users to a malicious site through a man-in-the-middle attack. Which solution should this customer use? A. VPC Flow Logs B. Cloud Armor C. DNS Security Extensions D. Cloud Identity-Aware Proxy

C https://cloud.google.com/blog/products/gcp/dnssec-now-available-in-cloud-dns

A business unit at a multinational corporation signs up for GCP and starts moving workloads into GCP. The business unit creates a Cloud Identity domain with an organizational resource that has hundreds of projects. Your team becomes aware of this and wants to take over managing permissions and auditing the domain resources. Which type of access should your team grant to meet this requirement? A. Organization Administrator B. Security Reviewer C. Organization Role Administrator D. Organization Policy Administrator

A. Managing permissions and auditing across the entire domain requires the Organization Administrator role; the Organization Role Administrator role only administers custom roles. https://cloud.google.com/iam/docs/understanding-roles

A company's application is deployed with a user-managed Service Account key. You want to use Google-recommended practices to rotate the key. What should you do? A. Open Cloud Shell and run gcloud iam service-accounts enable-auto-rotate --iam-account=IAM_ACCOUNT. B. Open Cloud Shell and run gcloud iam service-accounts keys rotate --iam-account=IAM_ACCOUNT --key=NEW_KEY. C. Create a new key, and use the new key in the application. Delete the old key from the Service Account. D. Create a new key, and use the new key in the application. Store the old key on the system as a backup key.

C https://cloud.google.com/iam/docs/understanding-service-accounts

A company has been running their application on Compute Engine. A bug in the application allowed a malicious user to repeatedly execute a script that results in the Compute Engine instance crashing. Although the bug has been fixed, you want to get notified in case this hack reoccurs. What should you do? A. Create an Alerting Policy in Stackdriver using a Process Health condition, checking that the number of executions of the script remains below the desired threshold. Enable notifications. B. Create an Alerting Policy in Stackdriver using the CPU usage metric. Set the threshold to 80% to be notified when the CPU usage goes above this 80%. C. Log every execution of the script to Stackdriver Logging. Create a User-defined metric in Stackdriver Logging on the logs, and create a Stackdriver Dashboard displaying the metric. D. Log every execution of the script to Stackdriver Logging. Configure BigQuery as a log sink, and create a BigQuery scheduled query to count the number of executions in a specific timeframe.

A. A Process Health alerting policy with notifications enabled alerts the team when script executions exceed the threshold; a metric on a dashboard (option C) displays the data but does not send notifications. https://cloud.google.com/logging/docs/logs-based-metrics/
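Whichever alerting condition is chosen, the underlying check is a sliding-window count of script executions against a threshold. A minimal sketch of that evaluation logic (illustrative only; this is not Cloud Monitoring code, and the window and threshold values are made up):

```python
from datetime import datetime, timedelta

def breaches_threshold(execution_times, window=timedelta(minutes=5), threshold=10):
    """Return True if any sliding window of `window` length contains more
    than `threshold` executions -- the kind of condition an alerting policy
    evaluates against a metric before firing a notification."""
    times = sorted(execution_times)
    start = 0
    for end in range(len(times)):
        # Shrink the window from the left until it spans at most `window`.
        while times[end] - times[start] > window:
            start += 1
        if end - start + 1 > threshold:
            return True
    return False
```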

A customer deploys an application to App Engine and needs to check for Open Web Application Security Project (OWASP) vulnerabilities. Which service should be used to accomplish this? A. Cloud Armor B. Google Cloud Audit Logs C. Web Security Scanner D. Anomaly Detection

C https://cloud.google.com/security-scanner/

How should a customer reliably deliver Stackdriver logs from GCP to their on-premises SIEM system? A. Send all logs to the SIEM system via an existing protocol such as syslog. B. Configure every project to export all their logs to a common BigQuery DataSet, which will be queried by the SIEM system. C. Configure Organizational Log Sinks to export logs to a Cloud Pub/Sub Topic, which will be sent to the SIEM via Dataflow. D. Build a connector for the SIEM to query for all logs in real time from the GCP RESTful JSON APIs.

C https://cloud.google.com/solutions/exporting-stackdriver-logging-for-splunk

Your team sets up a Shared VPC Network where project co-vpc-prod is the host project. Your team has configured the firewall rules, subnets, and VPN gateway on the host project. They need to enable Engineering Group A to attach a Compute Engine instance to only the 10.1.1.0/24 subnet. What should your team grant to Engineering Group A to meet this requirement? A. Compute Network User Role at the host project level. B. Compute Network User Role at the subnet level. C. Compute Shared VPC Admin Role at the host project level. D. Compute Shared VPC Admin Role at the service project level.

B. Granting the Compute Network User role at the subnet level restricts Engineering Group A to attaching instances to only the 10.1.1.0/24 subnet; granting it at the host project level would allow any subnet. https://cloud.google.com/vpc/docs/shared-vpc

An ecommerce portal uses Google Kubernetes Engine to deploy its recommendation engine in Docker containers. This cluster instance does not have an external IP address. You need to provide internet access to the pods in the Kubernetes cluster. What configuration would you add?

Add a Cloud NAT gateway, the subnet primary IP address range for the nodes, and the subnet secondary IP address ranges for the pods and services in the cluster. Cloud NAT gateways provide outbound internet access without requiring a public IP address.
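The role of the primary and secondary ranges can be illustrated with the standard library; the CIDR blocks below are hypothetical examples, not values from the scenario. The Cloud NAT gateway must be configured to translate both the node (primary) and pod (secondary) ranges so that everything in the cluster gets outbound access.

```python
import ipaddress

# Hypothetical VPC-native GKE subnet layout: nodes draw addresses from the
# primary range; pods and Services draw from secondary ranges.
node_range = ipaddress.ip_network("10.0.0.0/24")      # primary: nodes
pod_range = ipaddress.ip_network("10.4.0.0/14")       # secondary: pods
service_range = ipaddress.ip_network("10.8.0.0/20")   # secondary: services

def owner(ip: str) -> str:
    """Classify an address by the range it falls in."""
    addr = ipaddress.ip_address(ip)
    for name, net in (("node", node_range), ("pod", pod_range),
                      ("service", service_range)):
        if addr in net:
            return name
    return "outside"
```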

Cymbal Bank needs to migrate existing loan processing applications to Google Cloud. These applications transform confidential financial information. All the data should be encrypted at all stages, including while in use in RAM and when shared between sockets. An integrity test should also be performed every time these instances boot. You need to use Cymbal Bank's encryption keys to configure the Compute Engine instances. What should you do?

Create a Confidential VM instance with Customer-Supplied Encryption Keys. In Cloud Logging, collect all logs for sevLaunchAttestationReportEvent. Use Customer-Supplied Encryption Keys because you need to use your own encryption keys. Confidential VMs have a unique launch attestation event that can be read from Cloud Logging.

Cymbal Bank is releasing a new loan management application using a Compute Engine managed instance group. External users will connect to the application using a domain name or IP address protected with TLS 1.2. A load balancer already hosts this application and preserves the source IP address. You are tasked with setting up the SSL certificate for this load balancer. What should you do?

Create a Google-managed SSL certificate. Attach a global static external IP address to the external HTTPS load balancer. Validate that an existing URL map will route the incoming service to your managed instance group backend. Load your certificate and create an HTTPS proxy routing to your URL map. Create a global forwarding rule that routes incoming requests to the proxy. Attaching a global static external IP address will expose your load balancer to internet users. Creating an HTTPS proxy (and global forwarding rules) will help route requests to the existing backend.
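As a sketch, the sequence above maps to the following gcloud commands; the resource names, domain, and existing URL map name are assumptions for illustration.

```shell
# Reserve a global static external IP address (names are placeholders).
gcloud compute addresses create loan-app-ip --global

# Create a Google-managed SSL certificate for the application's domain.
gcloud compute ssl-certificates create loan-app-cert \
    --domains=loans.example.com --global

# Create an HTTPS proxy that routes through the existing URL map.
gcloud compute target-https-proxies create loan-app-proxy \
    --url-map=loan-app-url-map --ssl-certificates=loan-app-cert

# Forward incoming HTTPS traffic on the reserved IP to the proxy.
gcloud compute forwarding-rules create loan-app-fwd-rule \
    --global --address=loan-app-ip \
    --target-https-proxy=loan-app-proxy --ports=443
```

A Google-managed certificate only becomes active once the domain's DNS resolves to the reserved IP.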

Cymbal Bank uses Google Kubernetes Engine (GKE) to deploy its Docker containers. You want to encrypt the boot disk for a cluster running a custom image so that the key rotation is controlled by the Bank. GKE clusters will also generate up to 1024 randomized characters that will be used with the keys with Docker containers. What steps would you take to apply the encryption settings with a dedicated hardware security layer?

Create a new GKE cluster with customer-managed encryption and HSM enabled. Deploy the containers to this cluster. Delete the old GKE cluster. Use Cloud HSM to generate random bytes and provide an additional layer of security. Building a new cluster and deleting the old one is the solution. Cloud HSM provides an additional layer of dedicated hardware security and generates random bytes of up to 1024 characters.
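A minimal sketch of the key and cluster creation, assuming illustrative project, key ring, key, and cluster names:

```shell
# Create an HSM-protected key for boot disk encryption (names are placeholders).
gcloud kms keyrings create gke-keyring --location=us-central1
gcloud kms keys create gke-boot-key \
    --keyring=gke-keyring --location=us-central1 \
    --purpose=encryption --protection-level=hsm

# Create the replacement cluster with customer-managed boot disk encryption.
gcloud container clusters create secure-cluster \
    --region=us-central1 \
    --boot-disk-kms-key=projects/my-project/locations/us-central1/keyRings/gke-keyring/cryptoKeys/gke-boot-key
```

Random bytes (up to 1024 per request) can then be obtained from the HSM-backed location through the Cloud KMS generateRandomBytes API.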

Cymbal Bank leverages Google Cloud storage services, an on-premises Apache Spark Cluster, and a web application hosted on a third-party cloud. The Spark cluster and web application require limited access to Cloud Storage buckets and a Cloud SQL instance for only a few hours per day. You have been tasked with sharing credentials while minimizing the risk that the credentials will be compromised. What should you do?

Create a service account with appropriate permissions. Have the Spark Cluster and the web application authenticate as delegated requests, and share the short-lived service account credential as a JWT. Delegated requests allow a service account to authenticate into a chain of services. Using short-lived service account credentials provides limited access to trusted services.

Cymbal Bank is divided into separate departments. Each department is divided into teams. Each team works on a distinct product that requires Google Cloud resources for development. How would you design a Google Cloud organization hierarchy to best match Cymbal Bank's organization structure and needs?

Create an Organization node. Under the Organization node, create Department folders. Under each Department, create a Teams folder. Under each Team, create Product folders. Add Projects to the Product folders. Departments have teams, which work on products. This hierarchy best fits Cymbal Bank's organization structure.

Cymbal Bank stores customer information in a BigQuery table called 'Information,' which belongs to the dataset 'Customers.' Various departments of Cymbal Bank, including loan, credit card, and trading, access the information table. Although the data source remains the same, each department needs to read and analyze separate customers and customer-attributes. You want a cost-effective way to configure departmental access to BigQuery to provide optimal performance. What should you do?

Create separate datasets for each department. Create views for each dataset separately. Authorize these views to access the source dataset. Share the datasets with departments. Provide the bigquery.dataViewer role to each department's required users. Using authorized views is the right approach. Create a separate dataset for each department, and provide access to views containing filtered rows and columns.

Cymbal Bank needs to connect its employee MongoDB database to a new human resources web application on the same network. Both the database and the application are autoscaled with the help of Instance templates. As the Security Administrator and Project Editor, you have been tasked with allowing the application to read port 27017 on the database. What should you do?

Create service accounts for the application and database. Create a firewall rule using: gcloud compute firewall-rules create ALLOW_MONGO_DB --network network-name --allow TCP:27017 --source-service-accounts web-application-service-account --target-service-accounts database-service-account. Use service accounts to automate the identification, authentication, and authorization process between the n-tier services. Allow the TCP protocol on the port for reading.

A company has redundant mail servers in different Google Cloud Platform regions and wants to route customers to the nearest mail server based on location. How should the company accomplish this?
A. Configure TCP Proxy Load Balancing as a global load balancing service listening on port 995.
B. Create a Network Load Balancer to listen on TCP port 995 with a forwarding rule to forward traffic based on location.
C. Use Cross-Region Load Balancing with an HTTP(S) load balancer to route traffic to the nearest region.
D. Use Cloud CDN to route the mail traffic to the closest origin mail server based on client IP address.

A

Cymbal Bank has a team of developers and administrators working on different sets of Google Cloud resources. The Bank's administrators should be able to access the serial ports on Compute Engine Instances and create service accounts. Developers should only be able to access serial ports. How would you design the organization hierarchy to provide the required access?

Deny Serial Port Access and Service Account Creation at the organization level. Create a 'dev' folder and set enforced: false for constraints/compute.disableSerialPortAccess. Create a new 'admin' folder inside the 'dev' folder, and set enforced: false for constraints/iam.disableServiceAccountCreation. Give developers access to the 'dev' folder, and administrators access to the 'admin' folder. These organizational constraints will prevent all users from accessing serial ports on Compute Engine instances and creating service accounts. You can override these constraints in a new folder by setting the common constraint for serial port access. Creating another folder inside a parent folder will allow you to inherit the constraint and will allow you to add additional constraints to create a service account. Admins and developers are added appropriately.

Cymbal Bank wants to use Cloud Storage and BigQuery to store safe deposit usage data. Cymbal Bank needs a cost-effective approach to auditing only Cloud Storage and BigQuery data access activities. How would you use Cloud Audit Logs to enable this analysis?

Enable Data Access Logs for ADMIN_READ, DATA_READ, and DATA_WRITE for Cloud Storage. All Data Access Logs are enabled for BigQuery by default. Cloud Storage must be configured for Cloud Audit Logging, but BigQuery is enabled by default.

Cymbal Bank needs to statistically predict the days customers delay the payments for loan repayments and credit card repayments. Cymbal Bank does not want to share the exact dates a customer has defaulted or made a payment with data analysts. Additionally, you need to hide the customer name and the customer type, which could be corporate or retail. How do you provide the appropriate information to the data analysts?

Generalize all dates to year and month with date shifting. Use a predefined infoType for customer name. Use a custom infoType for customer type with a custom dictionary. If your data is stored in a valid schema, date shifting will shift all dates logically. Built-in infoTypes allow a range of locale-specific and globally identifiable sensitive information pieces like email IDs and phone numbers. Custom dictionaries can be used with a custom infoType that contains predefined key-value pairs.

Cymbal Bank's lending department stores sensitive information, such as its customers' credit history, addresses, and phone numbers, in parquet files. You need to upload this personally identifiable information (PII) to Cloud Storage so that it's secure and compliant with ISO 27018. How should you protect this sensitive information using Cymbal Bank's encryption keys and using the least amount of computational resources?

Generate an AES-256 key as a 32-byte bytestring. Encode it as a base64 string. Upload the blob to the bucket using this key. You should use a customer-supplied encryption key (CSEK) to protect sensitive information. An AES-256 key is a 32-byte bytestring that must be base64-encoded before use.
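A minimal sketch of the key generation; the bucket and file names in the commented upload step are illustrative, not from the question.

```shell
# Generate a 32-byte (256-bit) AES key and base64-encode it for use as a
# customer-supplied encryption key (CSEK).
CSEK=$(openssl rand 32 | base64 | tr -d '\n')
echo "${#CSEK}"   # a 32-byte key always base64-encodes to 44 characters

# Upload using the key (requires gsutil and an existing bucket; names are placeholders):
# gsutil -o "GSUtil:encryption_key=${CSEK}" cp loans.parquet gs://example-pii-bucket/
```

Cloud Storage never stores a CSEK; it only keeps a hash of the key to validate future requests, so the key must be supplied again on every read.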

You are designing a web application for Cymbal Bank so that customers who have credit card issues can contact dedicated support agents. Customers may enter their complete credit card number when chatting with or emailing support agents. You want to ensure compliance with PCI-DSS and prevent support agents from viewing this information in the most cost-effective way. What should you do?

Implement Cloud Data Loss Prevention using its REST API. Cloud DLP helps with identifying sensitive information along with its INFOTYPE. Cloud DLP can then mask sensitive information programmatically.

You are an administrator for Cymbal Bank's Mobile Development Team. You want to control how long different users can access the Google Cloud console, the Cloud SDK, and any applications that require user authorization for Google Cloud scopes without having to reauthenticate. More specifically, you want users with elevated privileges (project owners and billing administrators) to reauthenticate more frequently than regular users at the organization level. What should you do?

In the Admin console, select Google Cloud session control and set a reauthentication policy that requires reauthentication. Choose the reauthentication frequency from the drop-down list. Session control settings are configured in the Admin console. These settings will be set at the organization level and will include all project owners and billing administrators in the organization.

Cymbal Bank recently discovered service account key misuse in one of the teams during a security audit. As a precaution, going forward you do not want any team in your organization to generate new external service account keys. You also want to restrict every new service account's usage to its associated Project. What should you do?

Navigate to Organizational policies in the Google Cloud Console. Select your organization. Select iam.disableServiceAccountKeyCreation. Customize the applied to property, and set Enforcement to 'On'. Click Save. Repeat the process for iam.disableCrossProjectServiceAccountUsage. Boolean constraints help you limit service account usage. iam.disableServiceAccountKeyCreation will restrict the creation of new external service account keys. iam.disableCrossProjectServiceAccountUsage will prevent service accounts from being attached to resources in other projects.
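The same enforcement can be scripted with gcloud; the organization ID below is a placeholder.

```shell
# Block creation of new external service account keys across the organization.
gcloud resource-manager org-policies enable-enforce \
    iam.disableServiceAccountKeyCreation --organization=123456789

# Prevent service accounts from being attached to resources in other projects.
gcloud resource-manager org-policies enable-enforce \
    iam.disableCrossProjectServiceAccountUsage --organization=123456789
```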

Cymbal Bank has received Docker source files from its third-party developers in an Artifact Registry repository. These Docker files will be part of a CI/CD pipeline to update Cymbal Bank's personal loan offering. The bank wants to prevent the possibility of remote users arbitrarily using the Docker files to run any code. You have been tasked with using Container Analysis' On-Demand scanning to scan the images for a one-time update. What should you do?

Prepare a cloudbuild.yaml file. In this file, add four steps in order—build, scan, severity check, and push—specifying the location of the Artifact Registry repository. Specify the severity level as CRITICAL. Start the build with the command gcloud builds submit. Scanning requires you to build the images, scan them, check for severity, and push them to the registry, in that order. Gating on CRITICAL severity keeps critically vulnerable images, which could let remote users run arbitrary code, out of the registry.
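The four steps can be sketched as a shell sequence using the On-Demand Scanning commands; the image path and project are illustrative placeholders.

```shell
# 1. Build the image locally (path and tag are placeholders).
docker build -t us-central1-docker.pkg.dev/my-project/loans-repo/loan-api:v1 .

# 2. Run an On-Demand scan and capture the scan resource name.
SCAN=$(gcloud artifacts docker images scan \
    us-central1-docker.pkg.dev/my-project/loans-repo/loan-api:v1 \
    --format='value(response.scan)')

# 3. Fail the pipeline if any CRITICAL vulnerability was found.
if gcloud artifacts docker images list-vulnerabilities "$SCAN" \
    --format='value(vulnerability.effectiveSeverity)' | grep -q CRITICAL; then
  echo "Critical vulnerability found; aborting push" && exit 1
fi

# 4. Only a clean image is pushed to Artifact Registry.
docker push us-central1-docker.pkg.dev/my-project/loans-repo/loan-api:v1
```

In the actual cloudbuild.yaml, each of these would be a separate build step so that a failed severity check stops the push.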

Cymbal Bank runs a Node.js application on a Compute Engine instance. Cymbal Bank needs to share this base image with a 'development' Google Group. This base image should support secure boot for the Compute Engine instances deployed from this image. How would you automate the image creation?

Prepare a shell script. Add the command gcloud compute instances stop with the Node.js instance name. Set up certificates for secure boot. Add gcloud compute images create, and specify the Compute Engine instance's persistent disk and zone and the certificate files. Add gcloud compute images add-iam-policy-binding and specify the 'development' group. You need to stop the Compute Engine instance before creating the image. Use the command gcloud compute images create with the instance's disk and zone. Secure boot requires setting up certificates to establish trust between platform, firmware, and OS.
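A sketch of such a script, assuming illustrative instance, image, zone, and group names, and that the secure boot certificate files (PK, KEK, and signature database) have already been prepared:

```shell
#!/bin/bash
# Stop the source instance before imaging (all names are placeholders).
gcloud compute instances stop nodejs-app --zone=us-central1-a

# Create the image from the stopped instance's disk, embedding the
# secure boot certificates.
gcloud compute images create nodejs-base-image \
    --source-disk=nodejs-app --source-disk-zone=us-central1-a \
    --platform-key-file=PK.der \
    --key-exchange-key-file=KEK.der \
    --signature-database-file=db.der \
    --guest-os-features=UEFI_COMPATIBLE

# Share the image with the development group.
gcloud compute images add-iam-policy-binding nodejs-base-image \
    --member='group:development@cymbalbank.example' \
    --role='roles/compute.imageUser'
```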

Cymbal Bank has certain default permissions and access for their analyst, finance, and teller teams. These teams are organized into groups that have a set of role-based IAM permissions assigned to them. After a recent acquisition of a small bank, you find that the small bank directly assigns permissions to their employees in IAM. You have been tasked with applying Cymbal Bank's organizational structure to the small bank. Employees will need access to Google Cloud services.

Reset all user permissions in the small bank's IAM. Use Cloud Identity to create dynamic groups for each of the bank's teams. Use the dynamic groups' metadata field for team type to allocate users to their appropriate group with a Python script. Use Dynamic Groups to create groups based on identity attributes, such as department, and place the users in a flat hierarchy. Dynamic group metadata helps build the structure to identify the users.

You are a cloud engineer at Cymbal Bank. You need to share the auditing and compliance standards with your CTO that cover controls over financial reporting and both public and private controls over security, availability, and confidentiality. Which compliance standard covers this?
A. FIPS 140-2
B. GDPR
C. PCI-DSS
D. SOX

SOX covers controls over financial reporting and both public and private controls over security, availability, and confidentiality

Cymbal Bank's management is concerned about virtual machines being compromised by bad actors. More specifically, they want to receive immediate alerts if there have been changes to the boot sequence of any of their Compute Engine instances. What should you do?

Set an organization-level policy that requires all Compute Engine VMs to be configured as Shielded VMs. Use Measured Boot enabled with Virtual Trusted Platform Module (vTPM). Validate integrity events in Cloud Monitoring and place alerts on late boot validation events. Shielded VMs support Integrity Monitoring, which determines whether a VM instance's boot sequence has changed. The integrity baseline policy is created when you boot a VM for the first time. Every subsequent reboot will generate and compare Early Boot and Late Boot events against the integrity baseline policy.

The loan application from Cymbal Bank's lending department collects credit reports that contain credit payment information from customers. According to bank policy, the PDF reports are stored for six months in Cloud Storage, and access logs for the reports are stored for three years. You need to configure a cost-effective storage solution for the access logs. What should you do?

Set up a logging export bucket in Cloud Storage to collect data from Cloud Audit Logs. Configure object lifecycle management rules to delete logs after three years. Cloud Audit Logs provides the data access logs required in this scenario, and lifecycle rules make the three-year retention cost-effective.
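A sketch of the export and retention setup; the sink, bucket, and filter values are illustrative placeholders.

```shell
# Route data access audit logs to a Cloud Storage bucket (names are placeholders).
gcloud logging sinks create access-log-sink \
    storage.googleapis.com/credit-report-audit-logs \
    --log-filter='logName:"cloudaudit.googleapis.com%2Fdata_access"'

# Delete exported logs after three years (1095 days).
cat > lifecycle.json <<'EOF'
{"rule": [{"action": {"type": "Delete"}, "condition": {"age": 1095}}]}
EOF
gsutil lifecycle set lifecycle.json gs://credit-report-audit-logs
```

After creating the sink, the sink's writer identity must also be granted write access on the destination bucket.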

You have recently joined Cymbal Bank as a cloud security engineer. You want to encrypt a connection from a user on the internet to a VM in your development project. This is at the layer 3/4 (network/transport) level and you want to set up user configurable encryption for the in transit network traffic. What architecture choice best suits this use case?

Set up an IPsec tunnel. This will allow you to create L3/L4 encryption between a user on the internet and a VM instance in your project.

Cymbal Bank calculates employee incentives on a monthly basis for the sales department and on a quarterly basis for the marketing department. The incentives are released with the next month's salary. Employee's performance documents are stored as spreadsheets, which are retained for at least one year for audit. You want to configure the most cost-effective storage for this scenario. What should you do?

Upload the spreadsheets to Cloud Storage. Select the Nearline storage class for the sales department and Coldline storage for the marketing department. Use object lifecycle management rules to set the storage class to Archive after 365 days. Process the data in BigQuery using jobs that run monthly for Sales and quarterly for Marketing. Cloud Storage storage classes let you lower the storage cost for data that you access less frequently and don't require for real-time applications. Use object lifecycle rules to change storage classes and expire data. For processing, use BigQuery, which has a free daily quota.

Cymbal Bank has suffered a remote botnet attack on Compute Engine instances in an isolated project. The affected project now requires investigation by an external agency. An external agency requests that you provide all admin and system events to analyze in their local forensics tool. You want to use the most cost-effective solution to enable the external analysis. What should you do?

Use Cloud Audit Logs. Filter Admin Activity audit logs for only the affected project. Use a Pub/Sub topic to stream the logs from Cloud Audit Logs to the external agency's forensics tool. Cloud Audit Logs by default has admin and system activity logged. Filter the logs and use a Pub/Sub topic to send the logs to the external agency's forensics tool.
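The export can be sketched as follows; the topic, sink, and project names are illustrative placeholders.

```shell
# Create a topic the external agency's tool can subscribe to (names are placeholders).
gcloud pubsub topics create forensics-export

# Stream only Admin Activity audit logs for the affected project to the topic.
gcloud logging sinks create forensics-sink \
    pubsub.googleapis.com/projects/affected-project/topics/forensics-export \
    --log-filter='logName:"cloudaudit.googleapis.com%2Factivity"'
```

The agency's forensics tool would then pull from a subscription on that topic, so no extra storage is provisioned on Cymbal Bank's side.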

Cymbal Bank has acquired a non-banking financial company (NBFC). This NBFC uses Active Directory as their central directory on an on-premises Windows Server. You have been tasked with migrating all the NBFC users and employee information to Cloud Identity.

Use Cloud VPN to connect the on-premises network to your Google Cloud environment. Select an on-premises domain-joined Windows Server. On the domain-joined Windows Server, run Configuration Manager and Google Cloud Directory Sync. Use Cloud VPN's encrypted channel to transfer users from the on-premises Active Directory to Cloud Identity. If you are in an on-premises environment, you can access Active Directory using LDAP. Google Cloud Directory Sync to Cloud Identity communication will be over an HTTPS channel using Cloud VPN.

Cymbal Bank experienced a recent security issue. A rogue employee with admin permissions for Compute Engine assigned existing Compute Engine users some arbitrary permissions. You are tasked with finding all these arbitrary permissions. What should you do to find these permissions most efficiently?

Use Event Threat Detection and configure Continuous Exports to filter and write only Firewall logs to the Security Command Center. In the Security Command Center, select Event Threat Detection as the source, filter by category: anomalies, and sort to find the attack time window. Click on Persistence: IAM Anomalous Grant to display Finding Details. View the Source property of the Finding Details section. Event Threat Detection has triggers that detect anomalies on specified filters. Event Threat Detection can publish results to the Security Command Center. From the Security Command Center, filter and navigate to find all anomalies for the affected users.

Your organization has a website running on Compute Engine. This instance only has a private IP address. You need to provide SSH access to an on-premises developer who will debug the website from the authorized on-premises location only. How do you enable this?

Use Identity-Aware Proxy (IAP). Set up IAP TCP forwarding by creating ingress firewall rules on port 22 for TCP using the gcloud command. IAP TCP forwarding establishes an encrypted tunnel that supports both SSH and RDP requests.
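A sketch of the IAP setup; the network and instance names are illustrative placeholders, while 35.235.240.0/20 is IAP's documented source range.

```shell
# Allow ingress on port 22 only from IAP's fixed source range.
gcloud compute firewall-rules create allow-iap-ssh \
    --network=website-vpc --direction=INGRESS \
    --action=allow --rules=tcp:22 \
    --source-ranges=35.235.240.0/20

# The developer then connects through an encrypted IAP tunnel.
gcloud compute ssh web-instance --zone=us-central1-a --tunnel-through-iap
```

An IAP access policy can additionally restrict which identities, and from which IP ranges, may open the tunnel, satisfying the "authorized on-premises location only" requirement.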

Cymbal Bank's Customer Details API runs on a Compute Engine instance with only an internal IP address. Cymbal Bank's new branch is co-located outside the Google Cloud points-of-presence (PoPs) and requires a low-latency way for its on-premises apps to consume the API without exposing the requests to the public internet. Which solution would you recommend?

Use Partner Interconnect. Use a service provider to access their enterprise-grade infrastructure to connect to the Google Cloud environment. When you are co-located in one of the Google Cloud PoPs, use Dedicated Interconnect. Otherwise, use Partner Interconnect to connect to Google Cloud with a private IP address.

Cymbal Bank has a Cloud SQL instance that must be shared with an external agency. The agency's developers will be assigned roles and permissions through a Google Group in Identity and Access Management (IAM). The external agency is on an annual contract and will require a connection string, username, and password to connect to the database. How would you configure the group's access?

Use Secret Manager. Use the duration attribute to set the expiry period to one year. Add the secretmanager.secretAccessor role for the group that contains the external developers. Secret Manager supports time-based expiry, such as an absolute time or a duration, to invoke and revoke access. The Secret Accessor role is required to read the stored secrets in Secret Manager.

Cymbal Bank wants to deploy an n-tier web application. The frontend must be supported by an App Engine deployment, an API with a Compute Engine instance, and Cloud SQL for a MySQL database. This application is only supported during working hours, App Engine is disabled, and Compute Engine is stopped. How would you enable the infrastructure to access the database?

Use VM metadata to read the current machine's IP address and use a startup script to add access to Cloud SQL. Store Cloud SQL's connection string, username, and password in Secret Manager. Using the VM's metadata is the correct approach because the current machine will be able to provide its IP address, and requests will stay inside the host VM. A startup script can read a VM session and fetch the IP address. It can additionally configure the IP address to access Cloud SQL. Storing all data in Secret Manager lets you share sensitive information without revealing its text.

An external audit agency needs to perform a one-time review of Cymbal Bank's Google Cloud usage. The auditors should be able to access a Default VPC containing BigQuery, Cloud Storage, and Compute Engine instances where all the usage information is stored. You have been tasked with enabling the access from their on-premises environment, which already has a configured VPN. What should you do?

Use a Cloud VPN tunnel. Use Cloud DNS to create DNS zones and records for *.googleapis.com. Set up on-premises routing with Cloud Router. Use Cloud Router custom route advertisements to announce routes for Google Cloud destinations. Cloud VPN provides a cost-effective, easy-to-set-up environment for on-premises users to access Google Cloud privately. Using *.googleapis.com enables requests for both private.googleapis.com and restricted.googleapis.com. Use Cloud Router to set up and announce Google Cloud routes on-premises.

Cymbal Bank publishes its APIs through Apigee. Cymbal Bank has recently acquired ABC Corp, which uses a third-party identity provider. You have been tasked with connecting ABC Corp's identity provider to Apigee for single sign-on (SSO). You need to set up SSO so that Google is the service provider. You also want to monitor and log high-risk activities. Which two choices would you select to enable SSO?

Use openssl to generate public and private keys. Store the public key in an X.509 certificate, and encrypt using RSA or DSA for SAML. Sign in to the Google Admin console, and under Security, upload the certificate The first step is to generate a set of public and private keys. The public key is then stored in an X.509 certificate encrypted with RSA or DSA. Navigate to the Google Admin console to upload the certificate. The generated private key will be used to sign the SAML messages and responses.

Cymbal Bank has designed an application to detect credit card fraud that will analyze sensitive information. The application that's running on a Compute Engine instance is hosted in a new subnet on an existing VPC. Multiple teams who have access to other VMs in the same VPC must access the VM. You want to configure the access so that unauthorized VMs or users from the internet can't access the fraud detection VM. What should you do?

Use subnet isolation. Create a service account for the fraud detection engine. Create service accounts for each of the teams' Compute Engine instances that will access the engine. Add a firewall rule using: gcloud compute firewall-rules create ACCESS_FRAUD_ENGINE --network network-name --allow TCP:80 --source-service-accounts team-service-accounts --target-service-accounts fraud-engine-service-account. With subnet isolation, every request entering your subnet must be authorized. The recommended solution is to create a firewall rule that allows only a limited set of service accounts to access the shared target.

Cymbal Bank has hired a data analyst team to analyze scanned copies of loan applications. Because this is an external team, Cymbal Bank does not want to share the name, gender, phone number, or credit card numbers listed in the scanned copies. You have been tasked with hiding this PII information while minimizing latency. What should you do?

Use the Cloud Data Loss Prevention (DLP) API to make redact image requests. Provide your project ID, built-in infoTypes, and the scanned copies when you make the requests. The DLP API can be directly used for image redaction. Built-in infoTypes already include name, gender, phone number, and credit card numbers.

Cymbal Bank's Insurance Analyst needs to collect and store anonymous protected health information of patients from various hospitals. The information is currently stored in Cloud Storage, where each hospital has its own bucket. You have been tasked with collecting and storing the healthcare data from these buckets into Cymbal Bank's Cloud Storage bucket while maintaining HIPAA compliance. What should you do?

Use the Cloud Healthcare API to read the data from the hospital buckets and use de-identification to redact the sensitive information. Use Dataflow to ingest the Cloud Healthcare API feed and write data in a new Project that contains the Cloud Storage bucket. Give the Insurance Analyst the 'Editor' role on this Project. The Cloud Healthcare API has a de-identification module to redact patient information and is already HIPAA-compliant. You can then use Dataflow to read the information from source and write into a target bucket with anonymization for further analysis.

Cymbal Bank has Docker applications deployed in Google Kubernetes Engine. The bank has no offline containers. This GKE cluster is exposed to the public internet and has recently recovered from an attack. Cymbal Bank suspects that someone in the organization changed the firewall rules and has tasked you to analyze and find all details related to the firewall for the cluster. You want the most cost-effective solution for this task. What should you do?

View the GKE logs in Cloud Logging. Use the log scoping tool to filter the Firewall Rules log. Create a dataset in BigQuery to accept the logs. Export the logs to BigQuery using the command gcloud logging sinks create. Query this dataset. Cloud Logging is enabled by default for GKE clusters. The Firewall Rules log captures all changes made to the firewall. BigQuery can be used to capture the data and analyze it using the free quota.
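The export step can be sketched as follows; the project, dataset, and filter values are illustrative placeholders.

```shell
# Create a dataset to receive the logs (names are placeholders).
bq mk --dataset my-project:gke_firewall_logs

# Route Firewall Rules logs to the BigQuery dataset for analysis.
gcloud logging sinks create gke-firewall-sink \
    bigquery.googleapis.com/projects/my-project/datasets/gke_firewall_logs \
    --log-filter='resource.type="gce_subnetwork" AND logName:"compute.googleapis.com%2Ffirewall"'
```

Once the sink's writer identity has write access on the dataset, new log entries land in BigQuery tables that can be queried within the free daily quota.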

You have recently joined Cymbal Bank as a cloud engineer. You created a custom VPC network, selecting to use the automatic subnet creation mode and nothing else. The default network still exists in your project. You create a new Linux VM instance and select the custom VPC as the network interface. You try to SSH into your instance, but you are getting a "connection failed" error. What answer best explains why you cannot SSH into the instance?

You did not set up any firewall rules on your custom VPC network. The default VPC comes with a predefined firewall rule that allows SSH traffic, but such rules must be added explicitly to any custom VPC.
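A minimal sketch of the missing rule; the network name is an illustrative placeholder, and in practice the source range should be narrowed to trusted addresses rather than left open.

```shell
# Custom VPCs start with no ingress allowed; add an SSH rule explicitly.
gcloud compute firewall-rules create custom-vpc-allow-ssh \
    --network=custom-vpc --direction=INGRESS \
    --action=allow --rules=tcp:22 \
    --source-ranges=0.0.0.0/0
```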

Your colleague at Cymbal Bank is a cloud security engineer. She sketches out the following solution to manage her team's access to application security keys: 1 - Create 2 projects ● Project A: Cloud Storage to store secrets ● Project B: Cloud KMS to manage encryption keys 2 - Store each secret individually in Cloud Storage 3 - Rotate secrets and encryption keys regularly 4 - Protect each bucket by using encryption with Cloud KMS What (if any) step does not follow Google Cloud's best practices for secret management?

Your colleague's proposal follows Google Cloud's best practices for managing access to application security keys.

Cymbal Bank has published an API that internal teams will use through the HTTPS load balancer. You need to limit the API usage to 200 calls every hour. Any exceeding usage should inform the users that servers are busy. Which gcloud command would you run to throttle the load balancing for the given specification?

gcloud compute security-policies rules create priority --security-policy sec-policy --src-ip-ranges=source-range --action=throttle --rate-limit-threshold-count=200 --rate-limit-threshold-interval-sec=3600 --conform-action=allow --exceed-action=deny-429 --enforce-on-key=HTTP-HEADER. The action should be set to throttle, rate-limit-threshold-count must be 200, and rate-limit-threshold-interval-sec for one hour must be 60 seconds × 60 = 3600 seconds. A 429 error code conveys to users that they have placed too many requests, and the conform action should be allow, not deny. Note: 404 indicates that a resource was not found, 403 indicates invalid authorization, and 500 indicates an internal server error; none of these conveys a busy server.

