DP-900 Past Questions Dumps


Descriptive Analytics tells you 1. What is most likely to occur in the future 2. What occurred in the past 3. Which actions you can perform to affect outcomes 4. Why something happened in the past

2. What occurred in the past

You are reviewing the data model shown in the following exhibit:
Customer (Key: CustomerID; Foreign Keys: None) -> Sales
Product (Key: ProductID; Foreign Keys: None) -> Sales
SalesPerson (Key: SalesPersonID; Foreign Keys: None) -> Sales
Warehouse (Key: WarehouseID; Foreign Keys: None) -> Sales
Sales (Key: SalesID; Foreign Keys: CustomerID, ProductID, SalesPersonID, WarehouseID)
Customer is a ...... table 1. fact 2. dimension 3. bridge

2. dimension

Which of the following operating system platforms is Azure Data Studio supported on? ​ Windows, Mac OS, and Linux ​ Windows ​ Windows, Mac OS, Linux, iOS and Android ​ Windows and Linux

Windows, Mac OS, and Linux Azure Data Studio is downloadable software, open source, and runs on Windows, Mac and Linux

Transcribing videos is an example of _______________________________________ ​ cognitive ​ descriptive ​ predictive ​ prescriptive

cognitive Transcribing comes under cognitive. https://azure.microsoft.com/en-us/services/cognitive-services/speech-services/

If you set up your SQL Database networking with "no access", which type of user can connect to the database? ​ No users ​ Private users ​ Public users with the correct access username and password ​ Admin users

​ No users Setting No Access means that the database cannot be connected to. You'd need to configure access for someone to be able to use the database for data storage

Which of the following Azure databases would be considered as "Infrastructure as a service"? Choose one. ​ SQL Server in a VM ​ Azure SQL Database ​ Cosmos DB ​ SQL Managed Instance

​ SQL Server in a VM With SQL Server in a VM, you can choose the operating system, and specific software versions for SQL Server. You are also responsible for operating system and database software upgrades.

A database object that holds data 1. Index 2. View 3. Table

3. Table

Support Azure Active Directory (Azure AD) sign-ins to an Azure SQL database. 1. Authentication 2. Firewall 3. Encryption

1. Authentication

Match the Azure services to appropriate requirements. Output: Data to Parquet format 1. Azure Data Factory 2. Azure Data Lake Storage 3. Azure SQL Database 4. Azure Synapse Analytics

1. Azure Data Factory

You have the following SQL query. INSERT INTO dbo.Products (ProductID, ProductName, Price, ProductDescription) VALUES (1, 'Clamp', 12.48, 'Workbench clamp'); What is ProductName? 1. A column 2. A database 3. A table 4. An index

1. A column
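As a minimal sketch of why ProductName is a column, the statement can be replayed with Python's built-in sqlite3 module standing in for Azure SQL (SQLite has no dbo schema, so the table is just Products; the table definition here is assumed from the column list in the question):

```python
import sqlite3

# In-memory database as a stand-in for an Azure SQL database.
conn = sqlite3.connect(":memory:")

# ProductID, ProductName, Price, ProductDescription are columns;
# Products is the table that holds them.
conn.execute("""
    CREATE TABLE Products (
        ProductID INTEGER PRIMARY KEY,
        ProductName TEXT,
        Price REAL,
        ProductDescription TEXT
    )
""")
conn.execute(
    "INSERT INTO Products (ProductID, ProductName, Price, ProductDescription) "
    "VALUES (1, 'Clamp', 12.48, 'Workbench clamp')"
)

row = conn.execute("SELECT ProductName, Price FROM Products").fetchone()
print(row)  # ('Clamp', 12.48)
```

The column list in the INSERT names the columns being filled; dbo.Products is the table receiving the row.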

JSON documents 1. Cassandra API 2. Gremlin API 3. MongoDB API 4. Table API

3. MongoDB API

A company wants to load data from a customer relationship management system into a data warehouse by using an extract, load, and transform (ELT) process. In which stage of the ELT process would the Transform step be performed? 1. An in-memory data integration tool 2. The CRM system 3. The data warehouse

3. The data warehouse In ELT, the transformation occurs in the destination data store itself, here the data warehouse, after the data has been loaded. Using a separate in-memory data integration tool to transform data before loading is the ETL pattern, not ELT.

...is an object associated with a table that sorts and stores the data rows in the table based on their key values 1. A clustered index 2. A Filetable 3. A Foreign Key 4. A stored procedure

1. A clustered index

You are developing the data platform for a global retail company. The company operates during normal working hours in each region. The analytical database is used once a week for building sales projections. Each region maintains its own private virtual network. Building the sales projections is very resource intensive and generates upwards of 20 terabytes (TB) of data. Microsoft Azure SQL Databases must be provisioned. - Database provisioning must maximize performance and minimize cost - The daily sales for each region must be stored in an Azure SQL Database instance - Once a day, the data for all regions must be loaded in an analytical Azure SQL Database instance You need to provision Azure SQL database instances. How should you provision the database instances? 1. Azure SQL Database elastic pools 2. Azure SQL Database Hyperscale 3. Azure SQL Database Premium 4. Azure SQL Database Managed Instance DATABASE: Weekly Analysis - _________BOX 1_____________ Daily Sales __________BOX 2____________ What is the appropriate answer for Box 1?

2. Azure SQL Database Hyperscale The weekly analytical workload generates upwards of 20 TB of data, which exceeds the storage limits of the other service tiers. A Hyperscale database supports up to 100 TB of data and provides high throughput and performance, as well as rapid scaling to adapt to the workload requirements, which makes it the right choice for the analytical (Weekly Analysis) database.

Transcribing audio files is an example of: 1. Cognitive 2. Diagnostic 3. Descriptive 4. Predictive 5. Prescriptive

1. Cognitive

Match the types of analytics that can be used to answer the business questions. Which people are mentioned in a company's business documents? 1. Cognitive 2. Diagnostic 3. Descriptive 4. Predictive 5. Prescriptive

1. Cognitive Cognitive analytics attempt to draw inferences from existing data and patterns...

Match the types of activities to the appropriate Azure Data Factory activities. Until 1. Control 2. Data Movement 3. Data transformation

1. Control

You have to work with the Azure Data Factory service. You need to map the Azure Data Factory components to the right descriptions. Which of the following components should be mapped to the following description? "A representation of data structures within data stores" 1. Dataset 2. Linked Service 3. Mapping Data flow 4. Pipeline

1. Dataset

When using the Azure Cosmos DB Gremlin API, the container resource type is projected as a 1. Graph 2. Table 3. Partition key 4. Document

1. Graph

A database object that helps improve the speed of data retrieval 1. Index 2. View 3. Table

1. Index
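As an illustration (not Azure-specific), Python's built-in sqlite3 shows how an index speeds retrieval: with an index in place, the query planner can seek the index rather than scan the whole table. The table and index names here are invented for the sketch.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Products (ProductID INTEGER, ProductName TEXT)")
conn.execute("CREATE INDEX idx_product_name ON Products (ProductName)")

# EXPLAIN QUERY PLAN reveals that the lookup is served by the index,
# avoiding a full table scan.
plan = conn.execute(
    "EXPLAIN QUERY PLAN "
    "SELECT * FROM Products WHERE ProductName = 'Clamp'"
).fetchall()
print(plan[0][3])  # e.g. SEARCH Products USING INDEX idx_product_name (ProductName=?)
```

Without the index, the same query would show a full SCAN of the table.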

Which of the following needs to be used to build a Microsoft Power BI paginated report? 1. Power BI Report Builder 2. Power BI Desktop 3. Charticulator 4. The Power BI Service

1. Power BI Report Builder To build a paginated report, you need to use Power BI Report Builder.

Which of the following are activities that are performed in the Microsoft Power BI service? Choose 2 answers from the options given below 1. Report and dashboard creation 2. Report sharing and distribution 3. Data modelling 4. Data acquisition and preparation

1. Report and dashboard creation 2. Report sharing and distribution

Your company currently has a transactional application that stores data in an Azure SQL Managed instance. In which of the following circumstances would you need to implement a read-only database replica? 1. You need to generate reports without affecting the transactional workload 2. You need to audit the transactional application 3. You need to implement high availability in the event of a regional outage 4. You need to improve the recovery point objective (RPO)

1. You need to generate reports without affecting the transactional workload A read-only instance can be used to reduce the workload on your OLTP database. You can generate reports from the read-only copy of the database.

In Azure Data Factory, you can use ... to orchestrate pipeline activities that depend on the output of other pipeline activities. 1. a control flow 2. a dataset 3. a linked service 4. an integration runtime

1. a control flow

You have the following JSON document. Use the drop-down menus to select the answer choice that completes each statement based on the information presented in the JSON document. "customer": { "first name" : "Ben", "last name" : "Smith", "address": { "line 1" : "161 Azure Ln", "line 2" : "Palo Alto", "ZIP code" : "54762" }, "social media": [ { "service" : "twitter", "handle" : "@bensmith" }, { "service" : "linkedin", "handle" : "bensmith" } ], "phone numbers": [ { "type" : "mobile", "number" : "555-555-555" } ] } Social Media is: 1. a nested array 2. a nested object 3. a root object

1. a nested array

You can query a graph database in Azure Cosmos DB 1. as a JSON document by using a SQL-like language 2. as a partitioned row store by using Cassandra Query Language (CQL) 3. as a partitioned row store by using Language-Integrated query (LINQ) 4. as nodes and edges using the Gremlin language.

4. as nodes and edges using the Gremlin language. A graph database in Azure Cosmos DB is queried as nodes and edges through the Gremlin API.

At which of the following levels can you set the throughput for an Azure Cosmos DB account? Choose 2 answers from the options given below 1. container 2. item 3. database 4. partition

1. container and 3. database

The massively parallel processing (MPP) engine of Azure Synapse Analytics: 1. distributes processing across compute nodes 2. distributes processing across control nodes 3. redirects client connections across compute nodes 4. redirects client connections across control nodes

1. distributes processing across compute nodes

To configure an Azure Storage account to support both security at folder level and atomic directory manipulation, 1. enable the hierarchical namespace 2. set Account kind to BlobStorage 3. set Performance to Premium 4. set Replication to Read-access geo-redundant storage (RA-GRS)

1. enable the hierarchical namespace

Graph Data 1. Cassandra API 2. Gremlin API 3. MongoDB API 4. Table API

2. Gremlin API

Match the Azure services to appropriate requirements. Store data that is in Parquet format 1. Azure Data Factory 2. Azure Data Lake Storage 3. Azure SQL Database 4. Azure Synapse Analytics

2. Azure Data Lake Storage

Which of the following is an Azure storage solution that provides native support for POSIX-compliant access control lists? 1. Azure Queue Storage 2. Azure Data Lake Storage 3. Azure table storage 4. Azure files

2. Azure Data Lake Storage

You need to select the appropriate an Azure service that would be used for a particular requirement. Which of the following would you use for the following requirement? "Store data in Parquet format" 1. Azure Data Factory 2. Azure Data Lake Storage 3. Azure SQL Database 4. Azure Synapse Analytics

2. Azure Data Lake Storage You can store files in virtually any data format in Azure Data Lake Storage.

You have to write a set of queries that will help administrators to troubleshoot an Azure SQL database. You have to be able to embed the documents and query results into a SQL notebook. Which of the following would you use for this requirement? ​ 1. Microsoft SQL Server Management Studio (SSMS) 2. Azure Data Studio 3. Azure CLI 4. Azure Powershell

2. Azure Data Studio You can use SQL Notebooks in Azure Data Studio.

You are developing the data platform for a global retail company. The company operates during normal working hours in each region. The analytical database is used once a week for building sales projections. Each region maintains its own private virtual network. Building the sales projections is very resource intensive and generates upwards of 20 terabytes (TB) of data. Microsoft Azure SQL Databases must be provisioned. - Database provisioning must maximize performance and minimize cost - The daily sales for each region must be stored in an Azure SQL Database instance - Once a day, the data for all regions must be loaded in an analytical Azure SQL Database instance You need to provision Azure SQL database instances. How should you provision the database instances? 1. Azure SQL Database elastic pools 2. Azure SQL Database Hyperscale 3. Azure SQL Database Premium 4. Azure SQL Database Managed Instance DATABASE: Weekly Analysis - _________BOX 1_____________ Daily Sales __________BOX 2____________ What is the appropriate answer for Box 2?

1. Azure SQL Database elastic pools SQL Database elastic pools are a simple, cost-effective solution for managing and scaling multiple databases that have varying and unpredictable usage demands. The databases in an elastic pool are on a single Azure SQL Database server and share a set number of resources at a set price. Since each region's daily sales database is used only during that region's normal working hours, pooling the regional databases optimizes price performance within a prescribed budget while delivering performance elasticity for each database, which makes elastic pools the right choice for the Daily Sales (Box 2) databases.

Match the types of activities to the appropriate Azure Data Factory activities. Copy 1. Control 2. Data Movement 3. Data transformation

2. Data Movement

You need to match the right type of analytics for the following business scenario: "Which people are mentioned in a company's business documents?" 1. Cognitive 2. Descriptive 3. Diagnostic 4. Predictive 5. Prescriptive

1. Cognitive Identifying which people are mentioned in business documents requires drawing inferences from unstructured text (entity recognition), which is cognitive analytics.

Match the types of analytics that can be used to answer the business questions. Why did sales increase last month? 1. Cognitive 2. Diagnostic 3. Descriptive 4. Predictive 5. Prescriptive

2. Diagnostic Diagnostic analytics help answer questions about why events happened

Prevent Access to an Azure SQL database from another network 1. Authentication 2. Firewall 3. Encryption

2. Firewall

Your company needs to design a database that shows how traffic changes in one area of a network affect other components on the network. Which of the following is a data store type you would use for this requirement? 1. Key/value 2. Graph 3. Document 4. Columnar

2. Graph It would be ideal to define the network as nodes in a graph-based database. You can define the relationship between the networks as edges between the nodes.

Which of the following components should be mapped to the following description? "The information used to connect to external data sources" 1. Dataset 2. Linked Service 3. Mapping Data flow 4. Pipeline

2. Linked Service

A database object whose content is defined by a query 1. Index 2. View 3. Table

2. View

You have the following JSON document. Use the drop-down menus to select the answer choice that completes each statement based on the information presented in the JSON document. "customer": { "first name" : "Ben", "last name" : "Smith", "address": { "line 1" : "161 Azure Ln", "line 2" : "Palo Alto", "ZIP code" : "54762" }, "social media": [ { "service" : "twitter", "handle" : "@bensmith" }, { "service" : "linkedin", "handle" : "bensmith" } ], "phone numbers": [ { "type" : "mobile", "number" : "555-555-555" } ] } Address is: 1. a nested array 2. a nested object 3. a root object

2. a nested object

You have the following SQL query. INSERT INTO dbo.Products (ProductID, ProductName, Price, ProductDescription) VALUES (1, 'Clamp', 12.48, 'Workbench clamp'); What is dbo.Products? 1. A column 2. A database 3. A table 4. An index

3. A table

.....is a virtual table that contains content defined by a query 1. A heap 2. A stored procedure 3. A view 4. An index

3. A view
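A quick sketch of the idea using Python's sqlite3 (table and view names invented for the example): the view stores no rows of its own; querying it simply runs the query that defines it.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE Sales (SalesID INTEGER PRIMARY KEY, Amount REAL);
    INSERT INTO Sales VALUES (1, 100.0), (2, 250.0);
    -- The view's content is defined entirely by its SELECT query.
    CREATE VIEW BigSales AS
        SELECT SalesID, Amount FROM Sales WHERE Amount > 150;
""")
rows = conn.execute("SELECT * FROM BigSales").fetchall()
print(rows)  # [(2, 250.0)]
```

If rows are later inserted into Sales, the view reflects them automatically, since it is evaluated at query time.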

By default, each Azure SQL database is protected by 1. a Network Security Group 2. a server level firewall 3. Azure Firewall 4. Azure Front Door

2. a server level firewall By default, connections to an Azure SQL database must pass the logical server's built-in firewall; you must add server-level (or database-level) firewall rules before clients can connect.

Match the types of activities to the appropriate Azure Data Factory activities. Mapping data flow 1. Control 2. Data Movement 3. Data transformation

3. Data transformation

Ensure that sensitive data never appears as plain text in an Azure SQL Database 1. Authentication 2. Firewall 3. Encryption

3. Encryption

You currently manage an application that stores data within a shared folder on a Windows server. You have to move the shared folder to Azure Storage. Which of the following service within an Azure storage account could you use for this purpose? 1. Queue 2. Blob 3. File 4. Table

3. File. You can use the File service to create file shares. These file shares can be mounted on Windows Servers.

Your company needs a non-relational data store that is optimized for storing and retrieving files, videos, audio streams and virtual disk images. The data store must be able to store data, metadata and a unique ID for each file. Which of the following is a data store type you would choose for this requirement? ​ 1. Document 2. Key/value 3. Object 4. Columnar

3. Object. For this you can use an Object data store like Azure Blob Storage. Here you can have metadata for each object and also a unique URL.

A company wants to load data from a customer relationship management system into a data warehouse by using an extract, load, and transform (ELT) process. In which stage of the ELT process would the Load step be performed? 1. An in-memory data integration tool 2. The CRM system 3. The data warehouse

3. The data warehouse Here since the data needs to be loaded into the data warehouse, this will be the stage for the Load process.

You have the following JSON document. Use the drop-down menus to select the answer choice that completes each statement based on the information presented in the JSON document. "customer": { "first name" : "Ben", "last name" : "Smith", "address": { "line 1" : "161 Azure Ln", "line 2" : "Palo Alto", "ZIP code" : "54762" }, "social media": [ { "service" : "twitter", "handle" : "@bensmith" }, { "service" : "linkedin", "handle" : "bensmith" } ], "phone numbers": [ { "type" : "mobile", "number" : "555-555-555" } ] } Customer is: 1. a nested array 2. a nested object 3. a root object

3. a root object
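The three answers about this document can be checked directly with Python's json module (the exhibit is reproduced here without the trailing commas that strict JSON parsers reject): the customer value is the root object, address is a nested object, and social media is a nested array.

```python
import json

doc = """
{
  "customer": {
    "first name": "Ben",
    "last name": "Smith",
    "address": {"line 1": "161 Azure Ln", "line 2": "Palo Alto", "ZIP code": "54762"},
    "social media": [
      {"service": "twitter", "handle": "@bensmith"},
      {"service": "linkedin", "handle": "bensmith"}
    ],
    "phone numbers": [{"type": "mobile", "number": "555-555-555"}]
  }
}
"""
parsed = json.loads(doc)

print(type(parsed["customer"]).__name__)                  # dict -> customer is the root object
print(type(parsed["customer"]["address"]).__name__)       # dict -> address is a nested object
print(type(parsed["customer"]["social media"]).__name__)  # list -> social media is a nested array
```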

Relational data uses ... to enforce relationships between different tables 1. collections 2. columns 3. keys 4. partitions

3. keys
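A minimal sketch of key enforcement, again with Python's sqlite3 (table names borrowed from the data-model question earlier; SQLite only enforces foreign keys when the pragma is enabled): a foreign key in Sales rejects rows that reference a non-existent Customer.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces foreign keys only when enabled
conn.executescript("""
    CREATE TABLE Customer (CustomerID INTEGER PRIMARY KEY);
    CREATE TABLE Sales (
        SalesID INTEGER PRIMARY KEY,
        CustomerID INTEGER REFERENCES Customer (CustomerID)  -- foreign key
    );
    INSERT INTO Customer VALUES (1);
    INSERT INTO Sales VALUES (10, 1);  -- accepted: customer 1 exists
""")

error = None
try:
    conn.execute("INSERT INTO Sales VALUES (11, 99)")  # rejected: no customer 99
except sqlite3.IntegrityError as exc:
    error = str(exc)
print(error)
```

The primary key identifies each row uniquely; the foreign key ties Sales rows to existing Customer rows, which is how relational databases enforce relationships.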

In batch processing: 1. data is always inserted one row at a time 2. data is processed in real-time 3. latency is expected 4. processing can only execute serially

3. latency is expected

A Microsoft Power BI ... enables users to create highly formatted, fixed-layout documents for archiving 1. dashboard 2. interactive report 3. paginated report 4. subscription

3. paginated report

You are reviewing the data model shown in the following exhibit:
Customer (Key: CustomerID; Foreign Keys: None) -> Sales
Product (Key: ProductID; Foreign Keys: None) -> Sales
SalesPerson (Key: SalesPersonID; Foreign Keys: None) -> Sales
Warehouse (Key: WarehouseID; Foreign Keys: None) -> Sales
Sales (Key: SalesID; Foreign Keys: CustomerID, ProductID, SalesPersonID, WarehouseID)
The data model is 1. transactional model 2. star schema 3. snowflake schema

2. star schema A single fact table (Sales) references each dimension table directly, and none of the dimensions are normalized into further tables, so this is a star schema. A snowflake schema would split the dimensions into additional related tables.

Transparent Data Encryption (TDE) encrypts 1. A column to protect data at rest and in transit 2. queries and their results in order to protect data in transit 3. the database to protect data at rest 4. the server to protect data at rest

3. the database to protect data at rest

Key/Value data 1. Cassandra API 2. Gremlin API 3. MongoDB API 4. Table API

4. Table API

Match the Azure services to appropriate requirements. Persist a tabular representation of data that is stored in Parquet format 1. Azure Data Factory 2. Azure Data Lake Storage 3. Azure SQL Database 4. Azure Synapse Analytics

4. Azure Synapse Analytics

You need to select the appropriate an Azure service that would be used for a particular requirement. Which of the following would you use for the following requirement? "Persist a tabular representation of data that is stored in Parquet format" 1. Azure Data Factory 2. Azure Data Lake Storage 3. Azure SQL Database 4. Azure Synapse Analytics

4. Azure Synapse Analytics The tabular representation of the data is normally stored in Azure Synapse Analytics.

Which of the following components should be mapped to the following description? "A logical grouping of activities that performs a unit of work and can be scheduled" 1. Dataset 2. Linked Service 3. Mapping Data flow 4. Pipeline

4. Pipeline

Match the types of analytics that can be used to answer the business questions. How do I allocate my budget to buy different inventory items? 1. Cognitive 2. Diagnostic 3. Descriptive 4. Predictive 5. Prescriptive

5. Prescriptive Prescriptive analytics help answer questions about which actions to take to achieve a goal, such as how to allocate a budget across inventory items

At which of the following levels in Azure Cosmos DB can you configure multiple write regions and read regions? 1. database 2. partition 3. collection 4. account

4. account. This is done at the account level.

Batch workloads 1. process data in memory, row-by-row 2. collect and process data at most once a day 3. process data as new data is received in near real time 4. collect data and then process the data when a condition is met

4. collect data and then process the data when a condition is met

A relational database must be used when ... 1. a dynamic schema is required 2. data will be stored as key/value pairs 3. storing large images and videos 4. strong consistency guarantees are required

4. strong consistency guarantees are required

An ETL process requires: 1. A matching schema in the data source and the data target 2. A target data store powerful enough to transform data 3. data that is fully processed before being loaded to the target data store 4. that the data target be a relational database

3. data that is fully processed before being loaded to the target data store In ETL, transformation happens in the pipeline before loading, so the data arrives at the target fully processed; the target need not be a relational database or powerful enough to transform the data itself.

How much data can be stored in a single Table Storage account? ​ 100 TB ​ 500 TB ​ 5 PB ​ Unlimited

5 PB Storage accounts can hold up to 5 PB of data including table storage.

You need to match the right type of analytics for the following business scenario: "How do I allocate my budget to buy different inventory items?" 1. Cognitive 2. Descriptive 3. Diagnostic 4. Predictive 5. Prescriptive

5. Prescriptive This falls under prescriptive analytics. Here you want to know which actions to take to meet a target, based on the data that is available.

How does a relational database enforce data integrity? ​ Prevents records from being deleted if other records rely on them ​ Prevents records from being inserted if records they rely on don't exist ​ Ensures every column follows the data type of its schema definition ​ Prevents null values on columns which do not allow null values

All of the above. Relational databases ensure the data being inserted matches the schema exactly as it's defined, and all foreign keys exist.

What is Locally Redundant Storage? A) It's where data is copied in roughly the same physical area. B) It's where data is copied in multiple regions. C) It's where data is copied in another region.

A) It's where data is copied in roughly the same physical area.

When writing an SQL statement, what is the order of the six principal SQL clauses? A) SELECT, FROM, WHERE, GROUP BY, HAVING, ORDER BY B) SELECT, FROM, GROUP BY, WHERE, HAVING, ORDER BY C) FROM, WHERE, GROUP BY, HAVING, ORDER BY, SELECT D) FROM, ORDER BY, SELECT, WHERE, GROUP BY, HAVING

A) SELECT, FROM, WHERE, GROUP BY, HAVING, ORDER BY
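A runnable illustration of the written order from answer A, again using Python's sqlite3 purely for demonstration (table and data invented for the sketch), with all six clauses in sequence:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE Sales (Region TEXT, Amount REAL);
    INSERT INTO Sales VALUES
        ('East', 100), ('East', 200), ('West', 50), ('West', 20), ('North', 500);
""")

# Written order: SELECT, FROM, WHERE, GROUP BY, HAVING, ORDER BY
rows = conn.execute("""
    SELECT Region, SUM(Amount) AS Total
    FROM Sales
    WHERE Amount > 30          -- filters rows before grouping (drops West 20)
    GROUP BY Region
    HAVING SUM(Amount) > 100   -- filters groups (drops West, total 50)
    ORDER BY Total DESC
""").fetchall()
print(rows)  # [('North', 500.0), ('East', 300.0)]
```

Note that this is the order in which the clauses are written; the database engine evaluates them in a different logical order (starting from FROM).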

You need to store data in Azure Blob storage for seven years to meet your company's compliance requirements. The retrieval time of the data is unimportant. The solution must minimize storage costs. Which storage tier should you use? A. Archive B. Hot C. Cool

A. Archive

You need to design and model a database by using a graphical tool that supports project-oriented offline database development. What should you use? A. Microsoft SQL Server Data Tools (SSDT) B. Microsoft SQL Server Management Studio (SSMS) C. Azure Databricks D. Azure Data Studio

A. Microsoft SQL Server Data Tools (SSDT)

You need to query a table named Products in an Azure SQL database. Which three requirements must be met to query the table from the internet? Each correct answer presents part of the solution. (Choose three.) A. You must be assigned the Reader role for the resource group that contains the database. B. You must have SELECT access to the Products table. C. You must have a user in the database. D. You must be assigned the Contributor role for the resource group that contains the database. E. Your IP address must be allowed to connect to the database.

A. You must be assigned the Reader role for the resource group that contains the database. C. You must have a user in the database. E. Your IP address must be allowed to connect to the database.

At which two levels can you set the throughput for an Azure Cosmos DB account? Each correct answer presents a complete solution. (Choose two.) NOTE: Each correct selection is worth one point. A. database B. item C. container D. partition

A. database C. container

Which type of non-relational data store supports a flexible schema, stores data as JSON files, and stores all the data for an entity in the same document? A. document B. columnar C. graph D. time series

A. document

Which two activities can be performed entirely by using the Microsoft Power BI service? Each correct answer presents a complete solution. (Choose two.) NOTE: Each correct selection is worth one point. A. report and dashboard creation B. report sharing and distribution C. data modeling D. data acquisition and preparation

A. report and dashboard creation B. report sharing and distribution

You have an Azure Cosmos DB account that uses the Core (SQL) API. Which two settings can you configure at the container level? Each correct answer presents a complete solution. (Choose two.) NOTE: Each correct selection is worth one point. A. the throughput B. the read region C. the partition key D. the API

A. the throughput C. the partition key


Which statement would you use to include another column in the table schema in a relational database? ​ INSERT ​ TRUNCATE ​ ALTER ​ UPDATE

ALTER To add a new column to a table, you use the ALTER TABLE statement
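A minimal sketch of ALTER TABLE adding a column, using Python's built-in sqlite3 (table and column names invented for the example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Products (ProductID INTEGER PRIMARY KEY, ProductName TEXT)")

# ALTER TABLE ... ADD COLUMN changes the table schema in place.
conn.execute("ALTER TABLE Products ADD COLUMN Price REAL")

# PRAGMA table_info lists the columns, confirming the new one exists.
cols = [row[1] for row in conn.execute("PRAGMA table_info(Products)")]
print(cols)  # ['ProductID', 'ProductName', 'Price']
```

INSERT and UPDATE change rows, and TRUNCATE removes them; only ALTER changes the schema itself.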

You have a custom application that uses the Azure SDK to access Cosmos DB. You are receiving an authentication error when you do so, and you realize you haven't passed any type of credential to it. What type of credential does Cosmos DB expect in order to grant you access? ​ Anonymous login ​ Username and password ​ Access Key ​ Client certificate

Access Key You must provide an access key (or SAS token) in order to access the database.

Which of the following are benefits of using the Azure SQL Database service? Choose 2 answers from the options given below ​ Complete control over backup and restore processes ​ Access to the latest features ​ In-database machine learning services ​ Reduced administrative effort for managing the server infrastructure

Access to the latest features ​ Reduced administrative effort for managing the server infrastructure

What is a good example of a paginated report? ​ A dashboard showing all the key metrics of the business, with live updates ​ A line chart showing CPU utilization of the virtual machine, at 6 second intervals. ​ An invoice ​ A drill-down report that allows you to explore the data from many angles

An invoice A paginated report is one that is designed to be printed. It is static. An invoice is a good example of that.

Which of the following is an advantage of using multi-region replication with Cosmos DB? ​ Availability is increased. ​ Data will always be consistent in every region. ​ Increased security for your data.

Availability is increased. Azure Cosmos DB transparently replicates your data to all regions you've associated with your Cosmos DB account, enabling you to develop applications that require global access to data while providing tradeoffs between consistency, availability, and performance, all with corresponding guarantees.

You are developing a solution that will stream data to Azure Stream Analytics. The solution will have both streaming data and reference data. Which input type should you use for the reference data? ​ Azure Cosmos DB ​ Azure Event Hubs ​ Azure Blob storage ​ Azure IoT Hub

Azure Blob storage Stream Analytics supports Azure Blob storage and Azure SQL Database as the storage layer for Reference Data.

Which of the following is a benefit of the Azure Cosmos DB Table API when compared to Azure Table storage? ​ Azure Cosmos DB Table API supports partitioning ​ Azure Cosmos DB Table API provides resiliency if an Azure region fails ​ Azure Cosmos DB Table API provides a higher storage capacity ​ Azure Cosmos DB Table API supports a multi-master model

Azure Cosmos DB Table API supports a multi-master model Only Azure Cosmos DB supports the multi-master model.

A company plans to use Platform-as-a-Service (PaaS) to create the new data pipeline process. The process must meet the following requirements: Ingest: Access multiple data sources. Provide the ability to orchestrate workflow. Provide the capability to run SQL Server Integration Services packages. Store: Optimize storage for big data workloads. Provide encryption of data at rest. Operate with no size limits. Prepare and Train: Provide a fully-managed and interactive workspace for exploration and visualization. Provide the ability to program in R, SQL, Python, Scala, and Java. Provide seamless user authentication with Azure Active Directory. Model & Serve: Implement native columnar storage. Support for the SQL language. Provide support for structured streaming. You need to build the data integration pipeline. Requirements: 1. Ingest 2. Store 3. Prepare and Train 4. Model & Serve Which technologies should you use for requirement "Ingest"? ​ Excel ​ Power BI ​ Azure Data Factory ​ Azure Databricks

Azure Data Factory Azure Data Factory pipelines can execute SSIS packages. In Azure, the following services and tools will meet the core requirements for pipeline orchestration, control flow, and data movement: Azure Data Factory, Oozie on HDInsight, and SQL Server Integration Services (SSIS).

Which of the following Azure services uses a hierarchical namespace to store data? ​ Azure Synapse Analytics ​ Azure Data Lake Storage ​ Azure Data Factory ​ Azure Cosmos DB

Azure Data Lake Storage A fundamental part of Azure Data Lake Storage Gen2 is the addition of a hierarchical namespace, which provides file system performance at object storage scale and prices.

A set of developers have computers that run Windows 10 and Ubuntu Desktop. They need to connect and query an Azure SQL database from their computer. The developers require code assistance features such as IntelliSense. Which of the following can be used for this requirement? ​ Azure Data Studio ​ Microsoft SQL Server Management Studio ​ Azure Data Explorer

Azure Data Studio

You are writing a set of SQL queries that administrators will use to troubleshoot an Azure SQL database. You need to embed documents and query results into a SQL notebook. What should you use? ​ Microsoft SQL Server Management Studio (SSMS) ​ Azure Data Studio ​ Azure PowerShell ​ Azure CLI

Azure Data Studio Azure Data Studio is used for SQL notebooks.

A company plans to use Platform-as-a-Service (PaaS) to create the new data pipeline process. The process must meet the following requirements: Ingest: Access multiple data sources. Provide the ability to orchestrate workflow. Provide the capability to run SQL Server Integration Services packages. Store: Optimize storage for big data workloads. Provide encryption of data at rest. Operate with no size limits. Prepare and Train: Provide a fully-managed and interactive workspace for exploration and visualization. Provide the ability to program in R, SQL, Python, Scala, and Java. Provide seamless user authentication with Azure Active Directory. Model & Serve: Implement native columnar storage. Support for the SQL language. Provide support for structured streaming. You need to build the data integration pipeline. Requirements: 1. Ingest 2. Store 3. Prepare and Train 4. Model & Serve Which technologies should you use for requirement "Prepare and Train"? ​ HDInsight Apache Spark ​ Azure Databricks ​ HDInsight Apache Storm cluster ​ Azure Data Factory

Azure Databricks With Azure Databricks, you can set up your Apache Spark environment in minutes, autoscale, and collaborate on shared projects in an interactive workspace. Azure Databricks supports Python, Scala, R, Java and SQL, as well as data science frameworks and libraries including TensorFlow, PyTorch and scikit-learn.

A company plans to use Apache Spark Analytics to analyze intrusion detection data You need to recommend a solution to monitor network and system activities for malicious activities and policy violations. Reports must be produced in an electronic format and sent to management. The solution must minimize administrative efforts. What should you recommend? ​ Azure Data Factory ​ Azure Data Lake ​ Azure Databricks ​ Azure HDInsight

Azure Databricks "Recommendation engines, churn analysis, and intrusion detection are common scenarios that many organizations are solving across multiple industries. They require machine learning, streaming analytics, and utilize massive amounts of data processing that can be difficult to scale without the right tools. Companies like Lennox International, E.ON, and renewables.AI are just a few examples of organizations that have deployed Apache Spark™ to solve these challenges using Microsoft Azure Databricks." https://azure.microsoft.com/es-es/blog/three-critical-analytics-use-cases-with-microsoft-azure-databricks

You are developing a solution on Microsoft Azure. The data at rest layer must meet the following requirements: A) Storage: Serve as a repository for high volumes of large files in various formats. Implement optimized storage for big data analytics workloads. Ensure that data can be organized using a hierarchical structure. B) Batch processing: Use a managed solution for in-memory computation processing. Natively support Scala, Python, and R programming languages. Provide the ability to resize and terminate the cluster automatically. C) Analytical data store: Support parallel processing. Use columnar storage. Support SQL-based languages. You need to identify the correct technologies to build the architecture. Which technologies should you use for Batch Processing? ​ HDInsights Spark ​ Azure Databricks ​ HDInsights Hadoop ​ Azure Data Lake Store

Azure Databricks Big data solutions often use long-running batch jobs to filter, aggregate, and otherwise prepare the data for analysis. Usually these jobs involve reading source files from scalable storage (like HDFS, Azure Data Lake Store, and Azure Storage), processing them, and writing the output to new files in scalable storage. The key requirement of such batch processing engines is the ability to scale out computations, in order to handle a large volume of data. Unlike real-time processing, however, batch processing is expected to have latencies (the time between data ingestion and computing a result) that measure in minutes to hours.

Which of the following Azure Service uses the SMB 3.0 protocol? ​ Azure Synapse Analytics ​ Azure Data Factory ​ Azure File Storage ​ Azure Cosmos DB

Azure File Storage

The data engineering team manages Azure HDInsight clusters. The team spends a large amount of time creating and destroying clusters daily because most of the data pipeline process runs in minutes. You need to implement a solution that deploys multiple HDInsight clusters with minimal effort. What should you implement? ​ Azure Databricks ​ Azure Traffic Manager ​ Azure Resource Manager templates ​ Azure Pipelines ​ Azure Data Factory

Azure Resource Manager templates A Resource Manager template makes it easy to create the following resources for your application in a single, coordinated operation: HDInsight clusters and their dependent resources (such as the default storage account). In the template, you define the resources that are needed for the application. You also specify deployment parameters to input values for different environments. The template consists of JSON and expressions that you use to construct values for your deployment. Azure Pipelines could also fulfil this requirement, but it would still need an ARM template or a similar task to deploy the clusters, so Azure Resource Manager templates is the closer answer.

A company plans to use Platform-as-a-Service (PaaS) to create the new data pipeline process. The process must meet the following requirements: Ingest: Access multiple data sources. Provide the ability to orchestrate workflow. Provide the capability to run SQL Server Integration Services packages. Store: Optimize storage for big data workloads. Provide encryption of data at rest. Operate with no size limits. Prepare and Train: Provide a fully-managed and interactive workspace for exploration and visualization. Provide the ability to program in R, SQL, Python, Scala, and Java. Provide seamless user authentication with Azure Active Directory. Model & Serve: Implement native columnar storage. Support for the SQL language. Provide support for structured streaming. You need to build the data integration pipeline. Requirements: 1. Ingest 2. Store 3. Prepare and Train 4. Model & Serve. Which technologies should you use for requirement "Model & Serve"? ​ Azure Databricks ​ HDInsights Apache Storm cluster ​ Azure Data Factory ​ Azure SQL DataWarehouse

Azure SQL DataWarehouse SQL Data Warehouse stores data in relational tables with columnar storage. The Azure SQL Data Warehouse connector offers efficient and scalable structured streaming write support for SQL Data Warehouse. Access SQL Data Warehouse from Azure Databricks using the SQL Data Warehouse connector.

Your company needs to implement a relational database in Azure. The solution must minimize ongoing maintenance. Which Azure service should you use? ​ Azure HDInsight ​ Azure Cosmos DB ​ SQL Server on Azure virtual machines ​ Azure SQL Database ​ Azure SQL Database installed on VM

Azure SQL Database

You have to deploy a data store for an application. The data store needs to be a relational database that supports Online Transaction Processing (OLTP). Which of the following can be used for this requirement? ​ Azure Cosmos DB ​ Azure Synapse Analytics ​ Azure SQL Database

Azure SQL Database Azure SQL Database is a relational database system

Which of the following Azure Services use the SQL Server database engine? ​ Azure SQL Database ​ Azure Database for MySQL ​ SQL Server in a VM ​ Cosmos DB ​ SQL Managed Instance

Azure SQL Database ​ SQL Server in a VM ​ SQL Managed Instance SQL Server is an iconic database server that has been leading in the enterprise space for over 20 years. The SQL Server engine powers SQL Database, SQL Managed Instance, and SQL Server in a VM.

A company manages several on-premises Microsoft SQL Server databases. You need to migrate the databases to Microsoft Azure by using a backup process of Microsoft SQL Server. Which data technology should you use? ​ Azure SQL Database single database ​ Azure SQL Data Warehouse ​ Azure Cosmos DB ​ Azure SQL Database Managed Instance

Azure SQL Database Managed Instance Managed instance is a new deployment option of Azure SQL Database, providing near 100% compatibility with the latest SQL Server on-premises Database Engine, providing a native virtual network (VNet) implementation that addresses common security concerns, and a business model favorable for on-premises SQL Server customers. The managed instance deployment model allows existing SQL Server customers to lift and shift their on-premises applications to the cloud with minimal application and database changes.

Which Azure SQL offering supports automatic database scaling and automatic pausing of the database during inactive periods? ​ Azure SQL Database serverless ​ Azure SQL Database Hyperscale ​ Azure SQL managed instance

Azure SQL Database serverless

Which of the following is an Azure SQL database offering that supports automatic database scaling and automatic pausing of the database during inactive periods? ​ Azure SQL Database Hyperscale ​ Azure SQL managed instance ​ Azure SQL Database serverless ​ Azure SQL Database elastic pool

Azure SQL Database serverless

You are developing a solution on Microsoft Azure. The data at rest layer must meet the following requirements: A) Storage: Serve as a repository for high volumes of large files in various formats. Implement optimized storage for big data analytics workloads. Ensure that data can be organized using a hierarchical structure. B) Batch processing: Use a managed solution for in-memory computation processing. Natively support Scala, Python, and R programming languages. Provide the ability to resize and terminate the cluster automatically. C) Analytical data store: Support parallel processing. Use columnar storage. Support SQL-based languages. You need to identify the correct technologies to build the architecture. Which technologies should you use for the Analytical Data Store? ​ HDInsights Spark ​ Azure Databricks ​ HDInsights Hadoop ​ Azure SQL Datawarehouse

Azure SQL Datawarehouse SQL Data Warehouse is a cloud-based Enterprise Data Warehouse (EDW) that uses Massively Parallel Processing (MPP). SQL Data Warehouse stores data in relational tables with columnar storage.

Which of the following tools can be used to access data stored in an Azure Storage Account? ​ Azure Storage Explorer ​ SQL Server Management Studio ​ SQL Server Data Tools ​ Azure Data Studio

Azure Storage Explorer Microsoft Azure Storage Explorer is a standalone app that makes it easy to work with Azure Storage data on Windows, macOS, and Linux, and offers several ways of connecting to and managing your Azure storage accounts.

You have big data stored in a data warehouse. You need to process the data in Azure Synapse Analytics. Which technology would you use? ​ CPP ​ APP ​ RPP ​ MPP

Azure Synapse uses massively parallel processing (MPP) database technology, which allows it to manage analytical workloads and aggregate and process large volumes of data efficiently.

You need to copy data from an Azure Storage account to an Azure SQL data warehouse by using Azure Data Factory. The solution must meet the following requirements: 1. Ensure that the data remains in the UK West region at all times. 2. Minimize administrative effort. Which type of integration runtime should you use? ​ Azure integration runtime ​ Self-hosted integration runtime

Azure integration runtime

I have some data that I want to store as a blob. Because I access this blob multiple times a day, I want the cheapest cost for this blob. Which access tier should I use? A) Cool B) Hot C) Archive

B) Hot Hot is best for frequent use. Cool is best for less frequent use. Archive is best for rarely used data.

What is OLTP? A) It is a networking protocol. B) It is a database used to save transactions. C) It is the querying language of SQL Server, using the SELECT statement. D) It is an example of a data warehouse

B) It is a database used to save transactions. OLTP stands for Online Transactional Processing. The querying language of SQL Server is T-SQL. A data warehouse is closer to OLAP (Online Analytical Processing).
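To make the idea concrete, here is a minimal sketch of an OLTP-style transaction, using SQLite purely as a stand-in database (the table and account names are invented for the demo):

```python
import sqlite3

# A minimal illustration of a transaction, the unit of work that OLTP
# systems are built around. The table and values are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
conn.execute("INSERT INTO accounts VALUES ('alice', 100), ('bob', 50)")

# Transfer 30 from alice to bob; both writes commit together or not at all.
with conn:  # the context manager commits on success, rolls back on error
    conn.execute("UPDATE accounts SET balance = balance - 30 WHERE name = 'alice'")
    conn.execute("UPDATE accounts SET balance = balance + 30 WHERE name = 'bob'")

balances = dict(conn.execute("SELECT name, balance FROM accounts"))
print(balances)  # {'alice': 70, 'bob': 80}
```

If either UPDATE failed, the context manager would roll the whole transfer back, which is exactly the all-or-nothing behaviour a transactional workload relies on.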

When provisioning an Azure Cosmos DB account, which feature provides redundancy within an Azure region? A. multi-master replication B. Availability Zones C. automatic failover D. the strong consistency level

B. Availability Zones

You need to ensure that users use multi-factor authentication (MFA) when connecting to an Azure SQL database. Which type of authentication should you use? A. service principal authentication B. Azure Active Directory (Azure AD) authentication C. SQL authentication D. certificate authentication

B. Azure Active Directory (Azure AD) authentication

Which storage solution supports role-based access control (RBAC) at the file and folder level? A. Azure Disk Storage B. Azure Data Lake Storage C. Azure Blob storage D. Azure Queue storage

B. Azure Data Lake Storage

You are writing a set of SQL queries that administrators will use to troubleshoot an Azure SQL database. You need to embed documents and query results into a SQL notebook. What should you use? A. Microsoft SQL Server Management Studio (SSMS) B. Azure Data Studio C. Azure CLI D. Azure PowerShell

B. Azure Data Studio

Which two Azure services can be used to provision Apache Spark clusters? Each correct answer presents a complete solution. (Choose two.)NOTE: Each correct selection is worth one point. A. Azure Time Series Insights B. Azure HDInsight C. Azure Databricks D. Azure Log Analytics

B. Azure HDInsight C. Azure Databricks

Your company needs to implement a relational database in Azure. The solution must minimize ongoing maintenance. Which Azure service should you use? A. Azure HDInsight B. Azure SQL Database C. Azure Cosmos DB D. SQL Server on Azure virtual machines

B. Azure SQL Database

Your company uses several Azure HDInsight clusters. The data engineering team reports several errors with some applications using these clusters. You need to recommend a solution to review the health of the clusters. What should you include in your recommendation? A. Azure Automation B. Log Analytics C. Application Insights D. Azure Diagnostics

B. Log Analytics Azure Monitor logs integration. Azure Monitor logs enables data generated by multiple resources, such as HDInsight clusters, to be collected and aggregated in one place to achieve a unified monitoring experience. As a prerequisite, you will need a Log Analytics workspace to store the collected data. If you have not already created one, you can follow the instructions for creating a Log Analytics workspace. You can then easily configure an HDInsight cluster to send many workload-specific metrics to Log Analytics.

You have a SQL pool in Azure Synapse Analytics that is only used actively every night for eight hours. You need to minimize the cost of the SQL pool during idle times. The solution must ensure that the data remains intact. What should you do on the SQL pool? A. Scale down the data warehouse units (DWUs). B. Pause the pool. C. Create a user-defined restore point. D. Delete the pool

B. Pause the pool.

Your company plans to load data from a customer relationship management (CRM) system to a data warehouse by using an extract, load, and transform (ELT) process. Where does data processing occur for each stage of the ELT process? Please match the answer with the component correctly. I. Extract II. Load III. Transform 1. The Data Warehouse 2. The CRM System 3. An in-memory data integration tool 4. Service Connection You need to match I, II, III correctly with 1, 2, 3. What will be the correct match for Transform (III)? A. The Data Warehouse B. The CRM System C. An in-memory data integration tool D. Service Connection

A. The Data Warehouse In an extract, load, and transform (ELT) process, the data is loaded into the target data store first, and the transformation then takes place inside the data warehouse itself.

You need to develop a solution to provide data to executives. The solution must provide an interactive graphical interface, depict various key performance indicators, and support data exploration by using drill down. What should you use in Microsoft Power BI? A. a dashboard B. a report C. a dataflow D. Microsoft Power Apps

B. a report

Which Azure Data Factory component initiates the execution of a pipeline? A. a control flow B. a trigger C. a parameter D. an activity

B. a trigger

You manage an application that stores data in a shared folder on a Windows server. You need to move the shared folder to Azure Storage. Which type of Azure Storage should you use? A. queue B. blob C. file D. table

C. file Azure Files offers fully managed file shares in the cloud that are accessible over the SMB protocol, making it the natural replacement for a shared folder on a Windows server.

What are three characteristics of an Online Transaction Processing (OLTP) workload? Each correct answer presents a complete solution. (Choose three.)NOTE: Each correct selection is worth one point. A. denormalized data B. heavy writes and moderate reads C. light writes and heavy reads D. schema on write E. schema on read F. normalized data

B. heavy writes and moderate reads D. schema on write F. normalized data OLTP databases are typically normalized, enforce the schema when data is written, and are optimized for heavy write workloads.

What is a benefit of hosting a database on Azure SQL managed instance as compared to an Azure SQL database? A. built-in high availability B. native support for cross-database queries and transactions C. system-initiated automatic backups D. support for encryption at rest

B. native support for cross-database queries and transactions

Which of the following is TRUE when it comes to batch workloads? ​ Batch workloads process data in memory, row-by-row ​ Batch workloads collect and process data at most once a day ​ Batch workloads process data as new data is received in real-time ​ Batch workloads collect data and then process the data when a condition is met

Batch workloads collect data and then process the data when a condition is met Batch processing buffers incoming data and processes it as a group, typically triggered by a schedule, a data-volume threshold, or another condition. Option A is incorrect because in-memory, row-by-row processing describes stream processing. Option B is incorrect because you can have batch jobs running multiple times a day. Option C is incorrect since processing data as it is received in real time is an example of processing streaming data.

Which of the following statement is correct about Azure SQL database? ​ By default , each Azure SQL database is protected by a network security group ​ By default , each Azure SQL database is protected by a server level firewall ​ By default , each Azure SQL database is protected by Azure Firewall ​ By default , each Azure SQL database is protected by Azure Front Door

By default , each Azure SQL database is protected by a server level firewall An Azure SQL Database is protected by a server-level firewall.

You develop data engineering solutions for a company. You must integrate the company's on-premises Microsoft SQL Server data with Microsoft Azure SQL Database. Data must be transformed incrementally. You need to implement the data integration solution. Which tool should you use to configure a pipeline to copy data? A. Use the Copy Data tool with Blob storage linked service as the source B. Use Azure PowerShell with SQL Server linked service as a source C. Use Azure Data Factory UI with Blob storage linked service as a source D. Use the .NET Data Factory API with Blob storage linked service as the source

C. Use Azure Data Factory UI with Blob storage linked service as a source The Azure Data Factory UI can be used to configure a pipeline with a Copy activity. A linked service defines the information needed for Azure Data Factory to connect to a data resource, and the Integration Runtime is a customer-managed data integration infrastructure used by Azure Data Factory to provide data integration capabilities across different network environments.

A company stores data in multiple types of cloud-based databases.You need to design a solution to consolidate data into a single relational database. Ingestion of data will occur at set times each day. What should you recommend? A. SQL Server Migration Assistant B. SQL Data Sync C. Azure Data Factory D. Azure Database Migration Service E. Data Migration Assistant

C. Azure Data Factory The source data is stored in different cloud databases and needs to be ingested into a relational database at set times each day, which is a scheduled orchestration task that Azure Data Factory is designed for.

Your company has a reporting solution that has paginated reports. The reports query a dimensional model in a data warehouse. Which type of processing does the reporting solution use? A. stream processing B. batch processing C. Online Analytical Processing (OLAP) D. Online Transaction Processing (OLTP)

C. Online Analytical Processing (OLAP)

You have an inventory management database that contains the following table. ProductName Quantity Product1 100 Product2 129 Product3 176 Which statement should you use in a SQL query to change the inventory quantity of Product1 to 270? A. INSERT B. MERGE C. UPDATE D. CREATE

C. UPDATE
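As a sketch of the UPDATE statement in action, the following runs against an in-memory SQLite table mirroring the inventory table from the question (the table name is an assumption for the demo):

```python
import sqlite3

# Recreate the inventory table from the question in an in-memory SQLite
# database (the table name "Inventory" is chosen for the demo).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Inventory (ProductName TEXT, Quantity INTEGER)")
conn.executemany("INSERT INTO Inventory VALUES (?, ?)",
                 [("Product1", 100), ("Product2", 129), ("Product3", 176)])

# UPDATE modifies existing rows in place; INSERT would add a new row instead.
conn.execute("UPDATE Inventory SET Quantity = 270 WHERE ProductName = 'Product1'")

qty = conn.execute(
    "SELECT Quantity FROM Inventory WHERE ProductName = 'Product1'").fetchone()[0]
print(qty)  # 270
```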

You have a transactional application that stores data in an Azure SQL managed instance. When should you implement a read-only database replica? A. You need to generate reports without affecting the transactional workload. B. You need to audit the transactional application. C. You need to implement high availability in the event of a regional outage. D. You need to improve the recovery point objective (RPO).

A. You need to generate reports without affecting the transactional workload. A read-only replica lets you offload read-only workloads, such as reporting, so that they do not compete with the transactional workload on the primary replica.

Which scenario is an example of a streaming workload? A. sending transactions that are older than a month to an archive B. sending transactions daily from point of sale (POS) devices C. sending telemetry data from edge devices D. sending cloud infrastructure metadata every 30 minutes

C. sending telemetry data from edge devices

At which level in Azure Cosmos DB can you configure multiple write regions and read regions? ​ Partition ​ Database ​ Collection ​ Dataset ​ Account

Account Geo-replication and multi-region writes are configured at the account level. For lower latency, you can enable geo-replication on Cosmos DB, allowing the data to replicate to different Azure regions. By default, there will be one write region, which will accept all the write requests, and the rest of the Azure regions will accept read requests. But if your write-enabled Azure region is West US, then write requests coming from the South East Asia region or Central India region might have high latency. In order to reduce the write latency, you can enable writes on multiple regions. Please note that if you enable multi-region writes, conflicts may arise because the same data was updated in two different regions; for handling such instances, you may have to apply a conflict resolution policy.

You are designing an application that will store petabytes of medical imaging data. When the data is first created, the data will be accessed frequently during the first week. After one month, the data must be accessible within 30 seconds, but files will be accessed infrequently. After one year, the data will be accessed infrequently but must be accessible within five minutes. You need to select a storage strategy for the data. The solution must minimize costs. Which storage tier should you use for each time frame? If you would like to access the data after one year, which storage tier would you use? ​ Archive ​ Hot ​ Cool

Cool

Which of the following is considered as DDL SQL Commands ? Select one ​ Insert ​ Create ​ Select ​ Update

Create "Create" is the correct answer. DDL, or Data Definition Language, consists of the SQL commands that can be used to define the database schema. It deals with descriptions of the database schema and is used to create and modify the structure of database objects in the database. Examples of DDL commands: CREATE - used to create the database or its objects (like table, index, function, views, stored procedures and triggers). DROP - used to delete objects from the database. ALTER - used to alter the structure of the database. TRUNCATE - used to remove all records from a table, including all spaces allocated for the records. COMMENT - used to add comments to the data dictionary. RENAME - used to rename an object existing in the database. The SQL commands that deal with the manipulation of data present in the database belong to DML, or Data Manipulation Language, and this includes most of the SQL statements. Examples of DML: INSERT - used to insert data into a table. UPDATE - used to update existing data within a table. DELETE - used to delete records from a database table.
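The DDL/DML contrast above can be seen in a few lines, using SQLite as a convenient sandbox (the table and column names are made up for the demo):

```python
import sqlite3

# Quick contrast of DDL (defines schema) and DML (manipulates rows),
# using an in-memory SQLite database purely as a sandbox.
conn = sqlite3.connect(":memory:")

# DDL: CREATE defines a new object in the schema.
conn.execute("CREATE TABLE Customers (Id INTEGER PRIMARY KEY, Name TEXT)")
# DDL: ALTER changes the structure of an existing object.
conn.execute("ALTER TABLE Customers ADD COLUMN City TEXT")

# DML: INSERT, UPDATE, DELETE operate on the data inside the table.
conn.execute("INSERT INTO Customers (Name, City) VALUES ('Contoso', 'Seattle')")
conn.execute("UPDATE Customers SET City = 'Redmond' WHERE Name = 'Contoso'")

row = conn.execute("SELECT Name, City FROM Customers").fetchone()
print(row)  # ('Contoso', 'Redmond')
```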

Your manager asked you to create an Azure Data Lake Store so that the data can be used for further querying. How will you create an Azure Data Lake Store in the Azure portal? ​ Add a resource and then search for Azure Data Lake Store, then create it. ​ Create an Azure Data Factory and then enable the Azure Data Lake option while configuring Azure Data Factory. ​ Create an Azure Storage Account and then enable the hierarchical namespace option while configuring the Storage Account. ​ Add a resource and then search for Azure Data Lake Analytics and then create it.

Create an Azure Storage Account and then enable the hierarchical namespace option while configuring the Storage Account. Azure Data Lake Storage Gen2 is created as a standard Azure Storage Account with the hierarchical namespace feature enabled.

Which command-line tool can you use to query Azure SQL databases? A. sqlcmd B. bcp C. azdata D. Azure CLI

A. sqlcmd The sqlcmd utility lets you enter Transact-SQL statements at the command prompt and can connect to and query Azure SQL databases. The Azure CLI is used to manage Azure resources, not to query databases.

Which statement is an example of Data Definition Language (DDL)? A. SELECT B. JOIN C. MERGE D. CREATE

D. CREATE

What is the correct hierarchy structure for an Azure Storage Account containing blobs, containers, and files, where the Azure Storage Account is placed in an Azure Resource Group? The resources involved are: Resource Group, Azure Storage Account, Blob, Container, Data Lake Storage. What will be the correct hierarchy among the Azure resources? A. Azure Storage Account -> Data Lake Storage -> Blob -> Container B. Resource Group -> Azure Storage Account -> Container -> Blob C. Data Lake Storage -> Azure Storage Account -> Blob -> Container D. Resource Group -> Azure Storage Account -> Container -> Blob

D. Resource Group -> Azure Storage Account -> Container -> Blob

You need to gather real-time telemetry data from a mobile application. Which type of workload describes this scenario? A. Online Transaction Processing (OLTP) B. batch C. massively parallel processing (MPP) D. streaming

D. streaming

Which type of SQL commands deal with the management of the database, including creating or altering table schema? ​ DML ​ DCL ​ DDL

DDL is Data Definition Language and is the category of commands used to define the database schema.

Which of the following are common characteristics of a data warehouse? Choose all that apply. ​ Database optimized for reading by being denormalized ​ Database optimized for writing ​ Supports massive amounts of data ​ Significant pre-processing happens on the data before it can be used for reporting

Database optimized for reading by being denormalized ​ Supports massive amounts of data A SQL data warehouse is a database specifically designed for reporting; data is typically written once and then read many times rather than updated row by row. It supports massive amounts of data, and the warehouse has the ability to scale to multiple servers if a complex query is run against it needing a lot of computing power.

What type of analytics answers the question "what happened", such as a sales report for yesterday? ​ Cognitive ​ Predictive ​ Prescriptive ​ Diagnostic ​ Descriptive

Descriptive Descriptive Analytics is entirely based on the data in the database, and tells you the current state of your business. Sales, orders, inventory, visits, etc. See https://www.microsoft.com/en-ca/microsoft-365/business-insights-ideas/resources/benefits-of-business-analytics for more.

Which of the following is the right type of analytics when it comes to transcribing audio files? ​ Cognitive ​ Descriptive ​ Predictive ​ Prescriptive

Cognitive Transcribing audio falls under cognitive analytics: AI services such as speech-to-text are used to derive information from the audio files.

What type of analytics answers the question "why did it happen", such as comparing sales of first time customers to sales of long-term customers? ​ Predictive ​ Descriptive ​ Diagnostic ​ Prescriptive ​ Cognitive

Diagnostic Diagnostics analytics tries to view the data from multiple angles to try to figure out what is going on. So you may know that sales were up 20% last month, but WHY were they up 20%? Once you start to ask which products are up, which locations were up, which colors were the most popular, then you're starting to diagnose the problem. See https://azure.microsoft.com/en-us/blog/answering-whats-happening-whys-happening-and-what-will-happen-with-iot-analytics/ for more.

Which among the following statements is true with respect to the ETL process? ​ ETL processes require target systems to transform the data being loaded. ​ ETL processes have low load times ​ ETL processes reduce the resource contention on the source systems. ​ ETL processes have very high load times

ETL processes reduce the resource contention on the source systems. Extract, transform, and load (ETL) is a data pipeline used to collect data from various sources, transform the data according to business rules, and load it into a destination data store. The transformation work in ETL takes place in a specialized engine, often using staging tables to temporarily hold data as it is being transformed and ultimately loaded to its destination, so neither the source system nor the target system has to carry out the transformation. Requiring the target system to transform the data describes ELT, not ETL.
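The flow can be sketched in a few lines: in ETL, the transformation step runs in the pipeline before the load, so the destination receives data that is already clean (the source records and table name below are made up for illustration):

```python
import sqlite3

# Toy extract-transform-load flow: transformation happens in the pipeline
# (here, plain Python) before the data ever reaches the target store.
raw = [{"name": " Alice ", "amount": "10"},   # extract: rows from a
       {"name": "Bob", "amount": "25"}]       # hypothetical source system

# Transform: clean strings and convert types outside the target database.
rows = [(r["name"].strip(), int(r["amount"])) for r in raw]

# Load: write the already-transformed rows to the destination.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (name TEXT, amount INTEGER)")
conn.executemany("INSERT INTO sales VALUES (?, ?)", rows)

total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
print(total)  # 35
```

In an ELT variant, the raw rows would be loaded first and the cleanup SQL would run inside the destination store instead.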

A company is designing a solution that uses Azure Databricks. The solution must be resilient to regional Azure datacenter outages. You need to recommend the redundancy type for the solution. What should you recommend? A. Read-access geo-redundant storage B. Locally-redundant storage C. Geo-redundant storage D. Zone-redundant storage ​

C. Geo-redundant storage If your storage account has GRS enabled, then your data is durable even in the case of a complete regional outage or a disaster in which the primary region isn't recoverable.

What type of analytics uses advanced data, such as a camera sensor embedded in a store mannequin, to offer customers who looked at an item but didn't buy it an extra discount by email? ​ Predictive ​ Cognitive ​ Prescriptive ​ Diagnostic ​ Descriptive

Cognitive It sounds scary, right? Cameras embedded in malls and shops. But they are already there. A smart programmer is needed to tie the camera input into Azure Cognitive Services to recognize the shopper looking at an item, monitor their purchases based on their store "loyalty points" card, and email them a coupon after a couple of days to get them to buy the item. That's cognitive analytics.

You have data stored in ADLS in text format. You need to load the data into a table in one of the databases in Azure Synapse Analytics. Which are the two components that you should define in order to use PolyBase? ​ Internal source ​ External source ​ External format ​ Internal format

External source ​ External format 1. Create an external data source: CREATE EXTERNAL DATA SOURCE blob1 WITH (TYPE = Hadoop, CREDENTIAL = cred1, LOCATION = 'wasbs://[email protected]'); 2. Create an external file format (the format name and delimiter here are illustrative): CREATE EXTERNAL FILE FORMAT textformat1 WITH (FORMAT_TYPE = DELIMITEDTEXT, FORMAT_OPTIONS (FIELD_TERMINATOR = ','));

You have to state whether the following is TRUE or FALSE. "Items contained in the same Azure Cosmos DB logical partition can have different partition keys"

False. All items in the same logical partition have the same partition key.

You have to state whether the following is TRUE or FALSE. "A pipeline is a representation of a data structure within Azure Data Factory"

False. The dataset in Azure Data Factory is used to represent the data structure.

Which famous company invented the database known as "Cassandra"? ​ Google ​ Amazon ​ Facebook ​ Microsoft

Facebook A Facebook employee invented Cassandra to power their inbox search feature. It was released as an open source project in 2008, and eventually was managed by the Apache Foundation.

A company is developing a mission-critical line of business app that uses Azure SQL Database Managed Instance. You must design a disaster recovery strategy for the solution. You need to ensure that the database automatically recovers when full or partial loss of the Azure SQL Database service occurs in the primary region. What should you recommend? ​ Failover group ​ Azure SQL Data Sync ​ SQL Replication ​ Active geo-replication

Failover group An auto-failover group allows a group of databases on a managed instance to fail over automatically to a secondary region when an outage affects the primary region.

Normalisation involves eliminating relationships between database tables True/False

False Normalisation reduces data duplication by splitting data into related tables; it creates relationships between tables rather than eliminating them.

You have to state whether the following is TRUE or FALSE. "Azure Data Studio can be used to restore a database" ​ TRUE ​ FALSE

False Azure Data Studio does not provide a way to restore an Azure SQL Database; restores for the PaaS offering are performed through the Azure portal or command-line tools.

You have to state whether the following is TRUE or FALSE. "Azure SQL Managed Instance can be restored to SQL Server on an Azure virtual machine" ​ TRUE ​ FALSE

False Backups from a managed instance can only be restored to another managed instance. They cannot be restored to a SQL Server instance or to Azure SQL Database.

You have to state whether the following is TRUE or FALSE. "Platform as a service database offerings in Azure provide administrators with the ability to control and update the operating system version" ​ TRUE ​ FALSE

False In Platform as a service, the infrastructure is managed by Azure. Hence administrators don't have control over the underlying infrastructure.

You have to state whether the following is TRUE or FALSE. "Microsoft SQL Server Management Studio enables users to create and use SQL notebooks" ​ TRUE ​ FALSE

False SQL Notebooks are available in Azure Data Studio

You have to state whether the following is TRUE or FALSE. "All platform as a service database offerings in Azure can be paused to reduce costs" ​ TRUE ​ FALSE

False This is not true for all platform as a service offerings. There are serverless offerings that allow you to pause the database; this is part of the serverless compute tier.

You have to state whether the following is TRUE or FALSE. "Azure Table storage supports multiple write regions" ​ TRUE ​ FALSE

False You can only have one write region when it comes to Azure table storage

You have to state whether the following is TRUE or FALSE. "The Azure Cosmos DB API is configured separately for each database in an Azure Cosmos DB account"

False. The Cosmos DB API is configured at the account level and not at the database level.

Which of the following are common characteristics of a relational database? Choose all that apply. ​ Stored in JSON format ​ Data stored in CSV files in a blob storage account ​ Foreign Keys ​ Data is stored in a table format of rows and columns ​ Database prevents deletion of a parent record if a child record exists

Foreign Keys ​ Data is stored in a table format of rows and columns ​ Database prevents deletion of a parent record if a child record exists Relational databases have features that commonly include tables, views, a primary key, foreign keys, indexes, relationships between the tables, integrity enforced by the database. See https://docs.microsoft.com/en-us/azure/architecture/guide/technology-choices/data-store-overview#relational-database-management-systems for more.
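As a hedged sketch of these characteristics (all table and column names here are hypothetical), the foreign key constraint below is what makes the database reject deletion of a parent Customer row while child Order rows still reference it:

```sql
-- Hypothetical schema illustrating relational traits
CREATE TABLE Customer (
    CustomerID INT PRIMARY KEY,
    Name       NVARCHAR(100) NOT NULL
);

CREATE TABLE [Order] (
    OrderID    INT PRIMARY KEY,
    CustomerID INT NOT NULL,
    -- Foreign key: the database prevents deleting a Customer
    -- that still has Orders referencing it
    CONSTRAINT FK_Order_Customer FOREIGN KEY (CustomerID)
        REFERENCES Customer (CustomerID)
);
```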

Employee data that shows the relationships between employees ​ Graph ​ Object ​ Key/Value

Graph

Which API of cosmos DB can be used to work with data which has many different entities that share a relationship? ​ CassandraDB ​ MongoDB ​ Gremlin ​ Table

Gremlin Choosing Gremlin as the API provides a graph-based view of the data. Remember that at the lowest level, all data in any Azure Cosmos DB is stored in an ARS format. A graph-based view on the database means data is either a vertex (which is an individual item in the database) or an edge (which is a relationship between items in the database).

Cosmos DB has a multi-API programming model. Which of the following is suitable for contact tracing of COVID-19 patients? ​ Gremlin API ​ Cassandra API ​ Table API ​ SQL API

Gremlin API Azure Cosmos DB provides a graph database service via the Gremlin API on a fully managed database service designed for any scale. The Gremlin API can be used to store massive graphs with billions of vertices and edges.

What format is the ARM template stored in? ​ XML ​ JSON ​ CSV ​ VHD

JSON ARM templates are text files that contain the definitions of resources to deploy, and they are in JSON (JavaScript Object Notation) format.
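For illustration, here is a minimal ARM template sketch; the specific resource, parameter name, and apiVersion shown are assumptions for the example, not part of the question:

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "storageAccountName": { "type": "string" }
  },
  "resources": [
    {
      "type": "Microsoft.Storage/storageAccounts",
      "apiVersion": "2022-09-01",
      "name": "[parameters('storageAccountName')]",
      "location": "[resourceGroup().location]",
      "sku": { "name": "Standard_LRS" },
      "kind": "StorageV2"
    }
  ]
}
```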

Application users and their default language ​ Graph ​ Object ​ Key/Value

Key/Value

Which of the following are examples of unstructured data? ​ Log files ​ CSV files ​ SQL Server tables ​ An external Oracle DB ​ Blob storage

Log files CSV files Blob storage Unstructured data (or unstructured information) is information that either does not have a pre-defined data model or is not organized in a pre-defined manner. Unstructured information is typically text-heavy, but may contain data such as dates, numbers, and facts as well.

Which of the following would you use for the following requirement? "A development tool for building Azure SQL Databases, Microsoft SQL Server relational databases, SQL Server Analysis Services data models, SQL Server Integration Services packages and SQL Server Reporting Services reports" ​ Azure Data Studio ​ Microsoft SQL Server Data Tools ​ Microsoft SQL Server Management Studio ​ Microsoft Visual Studio Code

Microsoft SQL Server Data Tools

You need to design and model a database by using a graphical tool that supports project-oriented offline database development. What should you use? ​ Azure Data Studio ​ Microsoft SQL Server Management Studio (SSMS) ​ Microsoft SQL Server Data Tools (SSDT) ​ Azure Databricks

Microsoft SQL Server Data Tools (SSDT) Using SSDT, you can create an offline database project and implement schema changes by adding, modifying, or deleting the definitions of objects (represented by scripts) in the project, without a connection to a server instance.

You have to design and model a database by using a graphical tool that supports object-oriented offline database development. Which of the following could you use for this requirement? ​ Microsoft SQL Server Data Tools (SSDT)​ Microsoft SQL Server Management Studio (SSMS) ​ Azure Databricks ​ Azure Data Studio

Microsoft SQL Server Data Tools (SSDT) The SQL Server Data Tools can be used for offline database development

Which of the following would you use for the following requirement? "A graphical tool for managing Azure SQL databases and viewing SQL execution plans" ​ Azure Data Studio ​ Microsoft SQL Server Data Tools ​ Microsoft SQL Server Management Studio ​ Microsoft Visual Studio Code

Microsoft SQL Server Management Studio Microsoft SQL Server Management Studio is used for managing Azure SQL databases

Which of the following would you use for the following requirement? "A Microsoft SQL Server extension that supports connections to SQL Server and provides a rich editing experience for T-SQL" ​Azure Data Studio Microsoft SQL Server Data Tools ​Microsoft SQL Server Management Studio Microsoft Visual Studio Code

Microsoft Visual Studio Code You can use Microsoft Visual Studio Code that has a SQL Server extension for working with SQL Server databases.

Given below is JSON. You need to identify the JSON object type that will be used in one of the Azure Cosmos DB APIs: { "emp1": { "EmpName": "Chris Jackman", "EmpAge": "34", "Company Code": { "Code": "10" } }, "onetype": [ { "id": 1, "name": "John Doe" }, { "id": 2, "name": "Don Joeh" } ] } What will be the correct object type for onetype? ​ Nested object ​ Nested array ​ Root object

Nested array

Azure Table Storage supports multiple write regions Yes/No

No

Azure SQL Managed Instance backups can be restored to SQL Server running on a VM in Azure Yes/No

No

You must apply patches to Azure SQL databases regularly Yes/No

No

You need a Microsoft 365 subscription to create an Azure SQL database Yes/No

No

You have an Azure SQL Database server named "server1", and an elastic database pool on it called "pool1". There is a database in the pool named "db1". Your development team wants to add a second database to the same pool called "db2". What is the effect of adding a second database to an existing elastic database pool on costs? ​ You pay for the database, and so adding a second database will increase the monthly cost. ​ No effect. You pay for the pool, and can run 100+ databases on it for the same price.

No effect. You pay for the pool, and can run 100+ databases on it for the same price. For elastic pool databases, you pay per pool. So adding an additional database will not increase the cost

Imagine that you're part of a team that is analyzing house price data. The dataset that you receive contains house price information for several regions. Your team needs to report on how the house prices in each region have varied over the last few months. To achieve this, you need to ingest the data into Azure Synapse Analytics. You've decided to use Azure Data Warehouse to perform this task. Yes/No?

No. The correct statement is: to ingest the data into Azure Synapse Analytics, you need to use Azure Data Factory to perform this task.

If you set up your SQL Database networking with "public endpoint", without any other actions, which type of user can connect to the database? ​ Admin users ​ Public users with the correct access username and password ​ No users ​ Private users

No users Even with a public endpoint, SQL Database needs to have its firewall configured to allow anyone in. By default, all access attempts are denied unless explicitly added to the firewall access list.

You have a transactional workload running on a relational database system. You need to remove all DML anomalies which hamper the integrity of the databases. What would you do in such a scenario? ​ Block all DML queries ​ Normalize the tables as much as possible ​ Remove relationships in the tables ​ De-normalize the tables as much as possible

Normalize the tables as much as possible

Which of the following Azure database services has guaranteed 100% compatibility with SQL Server running in your own environment? ​ SQL Managed Instance ​ Azure SQL Database Elastic Pool ​ SQL Server in a VM ​ Azure SQL Database

SQL Server in a VM The version of SQL Server software that you'd install yourself in a VM is identical to the software you'd install in your own environment. And has 100% compatibility.

Medical images and their associated metadata ​ Graph ​ Object ​ Key/Value

Object

Which Power BI tool helps build paginated reports? ​ Power BI Server ​ Power BI Desktop ​ Power BI Report Builder ​ Power BI Apps

Power BI Report Builder Power BI Report Builder is the tool that allows you to build paginated reports.

What type of analytics answers the question "what will happen", such as a forecast sales report for next month? ​ Cognitive ​ Prescriptive ​ Predictive ​ Diagnostic ​ Descriptive

Predictive Predictive analytics tries to predict the future. Based on some historical trend, it tries to follow that trend and guess at what will happen. Like knowing that sales are usually 20% higher in October, based on previous years, it will predict the coming October sales assuming that increase. See https://azure.microsoft.com/en-us/blog/answering-whats-happening-whys-happening-and-what-will-happen-with-iot-analytics/ for more.

The production workload is facing the issue of surplus goods. You need to analyze the historical data to determine the requirement for the next month. What type of analysis does this come under? ​ Cognitive analysis ​ Prescriptive analysis ​ Predictive analysis ​ Diagnostic analysis

Predictive analysis Predictive analytics describes the use of statistics and modeling to determine future performance based on current and historical data.

What type of analytics answers the question "what should I do", such as recommending what movie to watch next? ​ Prescriptive ​ Diagnostic ​ Descriptive ​ Predictive ​ Cognitive

Prescriptive This is an evolving field, obviously. But any report that recommends an action to be taken is prescriptive. For instance, if your application could produce a report of customers that should be called by a customer service rep based on the machine learning model believing they are high value and at risk of going to a competitor would be quite advanced and valuable. See https://azure.microsoft.com/en-us/blog/answering-whats-happening-whys-happening-and-what-will-happen-with-iot-analytics/.

What is the main benefit of normalization? ​ SELECT queries are faster ​ Forces users to select from a fixed list of choices instead of using a free-form field ​ Reduces common typos and errors on inputs ​ Reduces data duplication

Reduces common typos and errors on inputs ​ Reduces data duplication Normalization is the process of breaking a data field down to its composite parts. For instance, instead of having a single column called "address", you could have five columns called "street address", "city", "state", "country", and "zip code". Then, for columns that can only contain a limited number of different values, you convert that into a "lookup table" and replace the value with an ID field, thus reducing duplication. Finally, for certain text fields, if you use a lookup table instead of making it a free-form text field, you could force users to select from predefined values (such as state names) instead of having 15 different spellings of "New York State".
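A hedged sketch of that lookup-table step, using hypothetical table and column names: the free-form state text field is replaced by a StateID column referencing a small lookup table, so users pick from predefined values instead of typing them:

```sql
-- Hypothetical lookup table replacing a free-form "state" text field
CREATE TABLE State (
    StateID INT PRIMARY KEY,
    Name    NVARCHAR(50) NOT NULL UNIQUE   -- e.g. 'New York'
);

CREATE TABLE Customer (
    CustomerID    INT PRIMARY KEY,
    StreetAddress NVARCHAR(200),
    City          NVARCHAR(100),
    StateID       INT REFERENCES State (StateID),  -- no free-form spellings
    ZipCode       NVARCHAR(10)
);
```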

Given below is JSON. You need to identify the JSON object type that will be used in one of the Azure Cosmos DB APIs: { "emp1": { "EmpName": "Chris Jackman", "EmpAge": "34", "Company Code": { "Code": "10" } }, "onetype": [ { "id": 1, "name": "John Doe" }, { "id": 2, "name": "Don Joeh" } ] } What will be the correct object type for emp1? ​ Nested object ​ Nested array ​ Root object

Root object Root object is the correct answer. This applies to the document-style Cosmos DB APIs, such as the SQL API.

Which of the following Azure database services has almost 100% compatibility with SQL Server running in your own environment? ​ Table Storage ​ Azure SQL Database ​ SQL Managed Instance ​ Azure SQL Database Elastic Pool

SQL Managed Instance SQL Server Managed Instance is a fully-managed database product that Azure offers that has very close compatibility to SQL Server running in your own environment. There are a few things that are not supported, but those are relatively rare.

You need to select the online transaction processing (OLTP) properties for your solution. Given below are some typical traits of transactional data, and you need to select the correct ones for OLTP. ​ Schema on write ​ Highly normalized ​ Light write ​ Heavy write ​ Denormalized ​ Schema on read

Schema on write ​ Highly normalized ​ Heavy write

Which service in Azure can be used to process the data in real-time having three components: input, query, and output? ​ Event hub ​ Synapse Analytics ​ Azure Databricks ​ Stream Analytics Job ​ IoT hub

Stream Analytics Job An Azure Stream Analytics job consists of an input, a query and an output. Stream Analytics ingests data from Azure Event Hubs (including Azure Event Hubs with Apache Kafka), Azure IoT Hub, or Azure Blob Storage.

You have a large amount of data held in files in Azure Data Lake storage. You want to retrieve the data in these files and use it to populate tables held in Azure Synapse Analytics. Which processing option is most appropriate? ​ Synapse Spark pool ​ Synapse SQL pool ​ Use Azure Synapse Link to connect to Azure Data Lake storage and download the data ​ Synapse Spark

Synapse SQL pool Use PolyBase from a SQL pool to expose the files in Azure Data Lake as external tables, and then ingest the data.
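As a hedged sketch of that ingestion step (all object names here are hypothetical, and it assumes an external data source named blob1 and a file format named fmt1 have already been created), the files are exposed as an external table and then copied into a regular table:

```sql
-- Hypothetical external table over text files in Data Lake storage
CREATE EXTERNAL TABLE dbo.SalesRaw_Ext (
    SaleID INT,
    Amount DECIMAL(10, 2)
)
WITH (
    LOCATION = '/sales/',   -- folder within the external data source
    DATA_SOURCE = blob1,    -- external data source defined earlier
    FILE_FORMAT = fmt1      -- external file format defined earlier
);

-- Ingest the external data into a regular Synapse table
INSERT INTO dbo.Sales (SaleID, Amount)
SELECT SaleID, Amount
FROM dbo.SalesRaw_Ext;
```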

Which of the components of Azure Synapse Analytics allows you to train AI models using AzureML? ​ Synapse Studio ​ Synapse Spark ​ Synapse Pipelines

Synapse Spark We can use a notebook to ingest and shape data, and then use SparkML and AzureML to train models with it.

What is the name of the SQL used by SQL Server? ​ SQL/PSM ​ PL/SQL ​ T-SQL ​ PL/pgSQL

T-SQL SQL Server uses T-SQL, which stands for Transact-SQL.

Your company plans to load data from a customer relationship management (CRM) system to a data warehouse by using an extract, load, and transform (ELT) process. Where does data processing occur for each stage of the ELT process? Please match the answers with the components correctly. I. Extract II. Load III. Transform 1. The Data Warehouse 2. The CRM System 3. An in-memory data integration tool 4. Service Connection You need to match I, II, III correctly with 1, 2, 3. What will be the correct match for Extract (I)? ​ The Data Warehouse ​ The CRM System ​ An in-memory data integration tool ​ Service Connection

The CRM System In an ELT process, data is extracted from the source system (here, the CRM system), loaded into the data warehouse, and then transformed inside the data warehouse.

Normalisation improves data integrity True/False

True

Normalising a database reduces data redundancy True/False

True

You need to map the appropriate Azure Cosmos DB API's to the appropriate data structures Which of the following would you map to the JSON data structure? 1. Cassandra API 2. Gremlin API 3. MongoDB API 4. Table API

The MongoDB API supports the usage of documents that can be used to store JSON data.

Which of the following is a command line tool that can be used to query Azure SQL databases? 1. sqlcmd 2. bcp 3. azdata 4. Azure CLI

The sqlcmd tool can be used to query Azure SQL databases.

Your company has to design a data store that will be used to store data from Internet connected temperature sensors. The collected data needs to be used to analyze temperature trends. Which of the following would you choose as the data store type? ​ Relational ​ Columnar ​ Graph ​ Time series

Time series Here you would be recording temperature over time. Hence it's best to store the data in a time series format.

What is the primary purpose of an index on a relational data table? ​ It is a simplified view of a table, returning the same data but fewer columns for instance. ​ To speed up INSERT statements so data is written to the table faster. ​ To speed up SELECT queries so that they return faster. ​ It is a child table that uses a foreign key to refer to the primary table.

To speed up SELECT queries so that they return faster. Indexes speed up "read" operations, allowing the database to avoid having to search the entire table when looking for something.
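A hedged sketch (hypothetical table and column names) of creating an index so that reads on a frequently searched column can seek the index instead of scanning the whole table:

```sql
-- Nonclustered index on a column that SELECT queries filter on often
CREATE NONCLUSTERED INDEX IX_Customer_LastName
    ON dbo.Customer (LastName);

-- This query can now use the index instead of a full table scan
SELECT CustomerID, LastName, FirstName
FROM dbo.Customer
WHERE LastName = 'Jackman';
```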

Which of the following is correct when it comes to Transparent Data Encryption? ​ Transparent Data Encryption encrypts a column to protect data at rest and in transit ​ Transparent Data Encryption encrypts queries and their results in order to protect data in transit ​ Transparent Data Encryption encrypts the database to protect data at rest ​ Transparent Data Encryption encrypts the server to protect data at rest

Transparent Data Encryption encrypts the database to protect data at rest ​ Transparent Data Encryption is used at the database level to encrypt the data at rest.

You have to state whether the following is TRUE or FALSE. "You can use existing Microsoft SQL Server licenses to reduce the cost of Azure SQL databases " ​ TRUE ​ FALSE

True Yes, you can use an existing SQL Server license to get a discount on Azure SQL databases.

You have to state whether the following is TRUE or FALSE. "A Microsoft Power BI dashboard is associated with a single workspace"

True Yes, the workspace is a logical boundary for your reports and data

You have to state whether the following is TRUE or FALSE. "If you have a platform as a service database in Azure, backups are performed automatically" ​ TRUE ​ FALSE

True. Yes, the backups are taken automatically by the service

Azure Data Factory supports a trigger that is scheduled at a predetermined time, but can pretend it is running at another time. For instance, a job runs every day at NOON, but only processes the data received until midnight last night. What type of trigger is this called? ​ Scheduled trigger ​ Time-series interval ​ Tumbling window trigger ​ Manual trigger

Tumbling window trigger A tumbling window can run a job using data for a specific period of time, and not before or after that.

Azure SQL Database can use Azure Advanced Threat Protection (ATP) Yes/No

Yes

Azure SQL Database has a built-in high availability Yes/No

Yes

Azure SQL Database includes a fully managed backup service Yes/No

Yes

Azure Table storage supports multiple read replicas Yes/No

Yes

PaaS database offerings in Azure provide built-in high availability Yes/No

Yes

PaaS database offerings in Azure provide configurable scaling options Yes/No

Yes

PaaS database offerings in Azure reduce the administrative overheard for managing hardware Yes/No

Yes

Select Yes if the statement is true. Otherwise, Select No Does Azure SQL database use Azure Advanced Threat Protection? ​ Yes ​ No

Yes

The Azure Cosmos DB Table API supports multiple read replicas Yes/No

Yes

The Azure Cosmos DB table API supports multiple write regions Yes/No

Yes

When ingesting data from Azure Data Lake Storage across Azure regions, you will incur costs for bandwidth Yes/No

Yes

You can use Azure Data Studio to query a Microsoft SQL Server big data cluster Yes/No

Yes

You can use Microsoft SQL Server Management Studio (SSMS) to query an Azure Synapse Analytics data warehouse Yes/No

Yes

You can use MySQL Workbench to query Azure Database for MariaDB databases. Yes/No

Yes

You can use blob, table and file storage in the same Azure Storage account Yes/No

Yes

You can use existing Microsoft SQL Server licenses to reduce the cost of Azure SQL databases Yes/No

Yes

You implement Azure Data Lake Storage by creating an Azure Storage account Yes/No

Yes

Azure SQL Managed Instance supports cross-database queries Yes/No

Yes

Select Yes if the statement is true. Otherwise, Select No. 1. Azure Databricks is an apache spark based collaborative analytics platform? (Yes/No) 2. Azure Analysis Services is used for transactional workloads? (Yes/No) 3. Azure Data Factory is a data ingestion tool (Yes/No) ​

Yes No Yes Azure Databricks provides a fast, easy, and collaborative Apache Spark-based analytics platform to accelerate and simplify the process of building Big Data and AI solutions. Azure Analysis Services is a fully managed platform as a service (PaaS) that provides enterprise-grade data models in the cloud. Use advanced mashup and modeling features to combine data from multiple data sources, define metrics, and secure your data in a single, trusted tabular semantic data model. Azure Data Factory is very useful on the Azure Platform if you are planning to ingest your data.

Select Yes if the statement is true. Otherwise, select No. 1. You can use MySQL Workbench to query Azure Database for MariaDB databases? (Yes/No) 2. You can use Microsoft SQL Server Management Studio (SSMS) to query an Azure Synapse Analytics data warehouse? (Yes/No) 3. You can use Azure Data Studio to query a Microsoft SQL Server big data cluster? (Yes/No)

Yes Yes Yes MariaDB: https://docs.microsoft.com/en-us/azure/mariadb/connect-workbench Azure Data Studio: https://docs.microsoft.com/en-us/sql/azure-data-studio/what-is?view=sql-server-ver15 SSMS: https://docs.microsoft.com/en-us/learn/modules/query-azure-sql-data-warehouse/4-query-dw-using-ssms

You have to state whether the following is TRUE or FALSE. "An Azure Data Factory pipeline can pass parameters to a notebook" Yes/No?

Yes. This fulfills the requirement. Pipelines in Azure Data Factory can be parameterized, and parameter values can be passed to a notebook activity.

Which among the following statement(s) is/are true with respect to the Azure SQL database? ​ You must apply patches to the Azure SQL Databases regularly ​ You can use existing Microsoft SQL Server licenses to reduce the cost of Azure SQL databases. ​ You need a Microsoft 365 subscription to create an Azure SQL Database ​ None of the above

You can use existing Microsoft SQL Server licenses to reduce the cost of Azure SQL databases. We can use SQL Server licenses with Software Assurance and save up to 55 percent over pay-as-you-go pricing on SQL Database.

You need to query a table named Products in an Azure SQL database. Which three requirements must be met to query the table from the internet? Each correct answer presents part of the solution. (Choose three.) NOTE: Each correct selection is worth one point. ​ You must be assigned the Reader role for the resource group that contains the database ​ You must have SELECT access to the Products table ​ Your IP address must be allowed to connect to the database ​ You must be assigned the Contributor role for the resource group that contains the database ​ You must have a user in the database

You must have SELECT access to the Products table ​ Your IP address must be allowed to connect to the database ​ You must have a user in the database

You have an Azure SQL Database. You need to query a table named Products. Which of the following are requirements that need to be fulfilled to access the table? Choose 3 answers from the options given below ​ You must be assigned the Reader role for the resource group that contains the database. ​ You must have SELECT access to the Products table.​ You must have a user in the database. ​ You must be assigned the Contributor role for the resource group that contains the database. ​ Your IP address must be allowed to connect to the database.

You must have SELECT access to the Products table. ​ You must have a user in the database. ​ Your IP address must be allowed to connect to the database. Firstly, your workstation IP address must be added to the firewall rules for the Azure SQL database server. Then there must be a user in the database that would actually be able to connect to the database itself. The user must also be granted the privileges required to use the SELECT statement.
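A hedged sketch of the user and permission requirements (the login, user, and password values are hypothetical placeholders):

```sql
-- In the master database: create a server-level login
CREATE LOGIN app_login WITH PASSWORD = 'P@ssw0rd-Example!';

-- In the user database: create a database user for that login
CREATE USER app_user FOR LOGIN app_login;

-- Grant the privilege needed to query the table
GRANT SELECT ON dbo.Products TO app_user;
```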

You have a transactional application that stores data in an Azure SQL managed instance.When should you implement a read-only database replica? ​ You need to improve the recovery point objective (RPO). ​ You need to audit the transactional application. ​ You need to implement high availability in the event of a regional outage. ​ You need to generate reports without affecting the transactional workload.

You need to generate reports without affecting the transactional workload. The read-only replica can run the large SELECT, GROUP BY and PARTITION BY queries that generate reports on the data without impacting the primary transactional workload.

You have an Azure SQL Database server named "server1", and a single database on it called "db1". Your development team wants to add a second single database to the same server called "db2". What is the effect of adding a second database to an existing server on costs? ​ No effect. You pay for the server, and can run an unlimited number of databases on it for the same price. ​ You pay for the database, and so adding a second database will increase the monthly cost.

You pay for the database, and so adding a second database will increase the monthly cost. For single databases, you pay per database. So adding an additional database will increase the cost, even if it's the same "server".

By default, each Azure SQL database is protected by ______________________ ​ NSG (Network Security Group) ​ a server-level firewall ​ Azure Firewall

a server-level firewall Azure Firewall is a different service that you provision separately. When you need to update firewall rules for a SQL database on Azure, you go to the server level, not the database level.
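A hedged sketch of adding a server-level firewall rule with T-SQL (run in the master database; the rule name and IP addresses are hypothetical):

```sql
-- Allow a single client IP address through the server-level firewall
EXECUTE sp_set_firewall_rule
    @name = N'OfficeClient',
    @start_ip_address = '203.0.113.10',
    @end_ip_address   = '203.0.113.10';
```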

Which three objects can be added to a Microsoft Power BI dashboard? Each correct answer presents a complete solution. (Choose three) NOTE: Each correct selection is worth one point. ​ an image ​ a visualization from a report ​ a Microsoft PowerPoint slide ​ a dataflow ​ a report page

an image ​ a visualization from a report a report page

An ETL (Extract Transform Load) process requires ___________________________ ​ that the data target be a relational database ​ a matching schema in the data source and the data target ​ a target data store powerful enough to transform data ​ data that is fully processed before being loaded to the target data store

data that is fully processed before being loaded to the target data store

Which type of non-relational data store supports a flexible schema, stores data as JSON files, and stores the all the data for an entity in the same document? ​ document ​ graph ​ columnar ​ time series

document A document database is a type of nonrelational database that is designed to store and query data as JSON-like documents.

Which of the following are characteristics of an Online Transaction Processing (OLTP) workload? Choose 3 answers from the options given below ​ denormalized data ​ heavy writes and moderate reads ​ light writes and heavy reads ​ schema on write ​ schema on read ​ normalized data

heavy writes and moderate reads ​ schema on write ​ normalized data For OLTP workloads: 1) There are a lot of writes on the data and moderate reads; you will have more reads in OLAP workloads. 2) The schema is enforced when data is written. 3) The data is normalized.

What is the benefit of hosting a database on an Azure SQL managed instance as compared to an Azure SQL database? ​ native support for cross-database queries and transactions ​ built-in high availability ​ system-initiated automatic backups ​ support for encryption at rest

native support for cross-database queries and transactions

What is the benefit of hosting a database on an Azure SQL managed instance as compared to an Azure SQL database? ​ system-initiated automatic backups ​ native support for cross-database queries and transactions ​ built-in high availability ​ support for encryption at rest

native support for cross-database queries and transactions

An application will use Microsoft Azure Cosmos DB as its data solution. The application will use the Cassandra API to support a column-based database type that uses containers to store items. You need to provision Azure Cosmos DB. Which container name and item name should you use? ​ collection and row ​ rows and table ​ graph and rows ​ entities and collection

rows and table Depending on the choice of the API, an Azure Cosmos item can represent either a document in a collection, a row in a table or a node/edge in a graph.

You need to recommend a security solution for containers in Azure Blob storage. The solution must ensure that only read permissions are granted to a specific user for a specific container. What should you include in the recommendation? ​ shared access signatures (SAS) ​ an RBAC role in Azure Active Directory (Azure AD) ​ public read access for blobs only ​ access key

shared access signatures (SAS) You can delegate access to read, write, and delete operations on blob containers, tables, queues, and file shares that are not permitted with a service SAS. Note: A shared access signature (SAS) provides secure delegated access to resources in your storage account without compromising the security of your data. With a SAS, you have granular control over how a client can access your data. You can control what resources the client may access, what permissions they have on those resources, and how long the SAS is valid, among other parameters.

Which command-line tool can you use to query Azure SQL databases? ​ sqlcmd ​ bcp ​ azdata ​ Azure CLI

sqlcmd The Azure CLI is used to configure Azure SQL Database and managed instances; you can't execute queries with it. The sqlcmd utility lets you enter Transact-SQL statements, system procedures, and script files.

A relational database must be used when __________________________________ ​ a dynamic schema is required ​ storing large images and videos ​ data will be stored as key/value pairs ​ strong consistency guarantees are required

strong consistency guarantees are required

Transparent Data Encryption (TDE) encrypts ________________________________ ​ the database to protect data at rest ​ the server to protect data at rest ​ a column to protect data at rest and in-transit ​ queries and their resultset in order to protect data at rest and in-transit

the database to protect data at rest

You have an Azure Cosmos DB account that uses the Core (SQL) API. Which two settings can you configure at the container level? Each correct answer presents a complete solution. (Choose two.) NOTE: Each correct selection is worth one point. ​ the API ​ the partition key ​ the read region ​ the throughput

the partition key the throughput

Which of the following is based on a column family database? ​ Gremlin ​ Apache Cassandra ​ SQL ​ Table API

​ Apache Cassandra The most widely used column-family database management system is Apache Cassandra, and Azure Cosmos DB supports the column-family approach through the Cassandra API. Azure Cosmos DB supports graph databases using the Gremlin API; Gremlin is a standard language for creating and querying graphs. Azure Table storage is an example of a key-value store, and Cosmos DB also implements a key-value store through the Table API.

A company needs to ensure that an application hosted on an Azure virtual machine can connect to an Azure SQL database without exposing the database to the internet. Which of the following can be used for this requirement? ​ Azure DNS ​ Azure Application Gateway ​ Azure Private Link ​ Azure Traffic Manager

​ Azure Private Link Azure Private Link allows your virtual machines to securely connect to Azure SQL databases via private IP addresses.

Which is the absolute cheapest way to store data in Azure? ​ SQL Database ​ Blob Storage ​ Table Storage ​ Cosmos DB

​ Blob Storage​ Blob Storage costs around 2 cents per GB, which is as cheap as you can find.
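Using the roughly 2-cents-per-GB figure quoted above, a quick back-of-the-envelope calculation (real pricing varies by access tier, redundancy option, and region, so treat the number as illustrative):

```python
# Rough blob storage cost estimate using the ~$0.02/GB/month figure
# from the answer above; actual Azure pricing differs by tier and region.
price_per_gb_month = 0.02
data_gb = 500  # hypothetical data set size

monthly_cost = data_gb * price_per_gb_month
print(f"${monthly_cost:.2f} per month")  # $10.00 per month
```

Storing the same 500 GB in a managed relational or NoSQL database service would cost far more, since those prices include compute and indexing, not just raw storage.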

SQL Database supports two purchasing models. What are they? ​ Pay per hour and pay per month ​ ACUs and CPUs ​ DTUs and vCore ​ Standard and Premium

​ DTUs and vCore SQL Database supports both DTU-based and vCore-based pricing models. DTU stands for Database Transaction Unit and is a relative measure of performance. The vCore model lets you select CPU and storage separately from each other.

Which type of database workload is best for data that is historical and not used in your day-to-day business, going back 10 or 15 years? ​ OLAP ​ OLTP ​ Data warehouse

​ Data warehouse Data warehouses have several uses, and one of them is serving as a massive store for all of your historical data. You might trim (or archive) OLTP data for day-to-day operations but leave the old historical data in the data warehouse.

You need to identify the correct DML (Data Manipulation Language) commands from the list below. Select three. ​ Create ​ Select ​ Insert ​ Update ​ Delete ​ Drop

​ Insert ​ Update ​ Delete DML (Data Manipulation Language): the SQL commands that deal with the manipulation of data present in the database; this covers most SQL statements. Examples of DML: INSERT is used to insert data into a table; UPDATE is used to update existing data within a table; DELETE is used to delete records from a database table. DQL (Data Query Language): statements used to perform queries on the data within schema objects; the purpose of a DQL command is to retrieve data based on the query passed to it. Example of DQL: SELECT is used to retrieve data from a database. DDL (Data Definition Language): the SQL commands that define the database schema; DDL deals with descriptions of the schema and is used to create and modify the structure of database objects. Examples of DDL: CREATE is used to create the database or its objects (such as tables, indexes, functions, views, stored procedures, and triggers); DROP is used to delete objects from the database; ALTER is used to alter the structure of the database; TRUNCATE is used to remove all records from a table, including the space allocated for them; COMMENT is used to add comments to the data dictionary; RENAME is used to rename an existing database object.
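The categories above can be exercised end to end with Python's stdlib `sqlite3` module, standing in here for Azure SQL Database (SQLite's SQL dialect differs slightly from Transact-SQL, but these core statements behave the same way):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# DDL: CREATE defines a schema object.
conn.execute(
    "CREATE TABLE products (id INTEGER PRIMARY KEY, name TEXT, price REAL)"
)

# DML: INSERT, UPDATE, and DELETE manipulate the data itself.
conn.execute("INSERT INTO products (name, price) VALUES ('widget', 9.99)")
conn.execute("INSERT INTO products (name, price) VALUES ('gadget', 19.99)")
conn.execute("UPDATE products SET price = 8.99 WHERE name = 'widget'")
conn.execute("DELETE FROM products WHERE name = 'gadget'")

# DQL: SELECT queries the data without changing it.
rows = conn.execute("SELECT name, price FROM products").fetchall()
print(rows)  # [('widget', 8.99)]

# DDL: DROP removes the object entirely.
conn.execute("DROP TABLE products")
```

Note how the exam options split along these lines: CREATE and DROP change the schema (DDL), SELECT only reads (DQL), and only INSERT, UPDATE, and DELETE change the data (DML).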

When might you use PolyBase? ​ To query data from external data sources from Azure Synapse Analytics ​ To orchestrate activities in Azure Data Factory ​ To ingest streaming data using Azure Databricks

​ To query data from external data sources from Azure Synapse Analytics ADF further enriches the PolyBase integration to support: loading data from Azure Data Lake Storage Gen2 with account key or managed identity authentication; and loading data from Azure Blob storage configured with a VNet service endpoint, either as the original source or as a staging store, in which case ADF automatically switches under the hood to the abfss:// scheme to create the external data source required by PolyBase.
