
10 Sample Questions for the Microsoft Associate-Level Certification Exam DP-200: Implementing an Azure Data Solution

Hello Readers! Here are some sample questions for the Microsoft associate-level certification exam DP-200: Implementing an Azure Data Solution.
Answer the following questions to test your knowledge after you have read the course material. The correct answers can be found at the end.
Here's the quiz:
1. Which cloud technical requirement refers to duplicating customer content in Azure for redundancy or to meet Service Level Agreements?
a. Maintainability
b. High availability
c. Multilingual support
d. Scalability
2. Which data platform technology can be used to create a globally distributed, multi-model database that can provide sub-second query performance?
a. Azure SQL Database
b. Azure Cosmos DB
c. Azure Synapse Analytics
d. Blob Storage
3. Which Azure data platform technology can be used to process data within an ELT framework?
a. Azure Data Factory
b. Azure Data Lake
c. Azure Databricks
d. Azure Cosmos DB
4. Let's say you have two video files stored as blobs. One of the videos is business-critical and requires a replication policy that creates multiple copies across geographically distributed data centers. The other video is not critical, and local replication is sufficient. True or False: these constraints can be satisfied by storing the two blobs in separate storage accounts.
a. True
b. False
5. Dave is creating an Azure Data Lake Storage Gen2 account. The account must be configured to let him process analytical data workloads at the highest performance. Which option should he choose when creating the storage account?
a. On the Basics tab, set the performance to On
b. On the Advanced tab, set Hierarchical Namespace to Enabled
c. On the Basics tab, set the performance option to Standard
d. On the Advanced tab, set the performance to On
6. How does Spark connect to databases like MySQL, Hive, and other data stores?
a. JDBC
b. ODBC
c. Using the REST API layer
7. By default, how are corrupt records handled by spark.read.json()?
a. They are placed in a column called "_corrupt_record"
b. They are automatically deleted
c. They throw an exception, aborting the read operation
8. Why do you chain methods (operations), as in myDataFrameDF.select().filter().groupBy()?
a. To obtain accurate results
b. To avoid temporary DataFrames being held as local variables
c. It is the only way to do this
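To see what that chained style looks like outside of Spark, here is a plain-Python sketch of a fluent interface (illustrative only — this is not Spark's API): each method returns a new object, so the calls compose without naming any of the intermediate results.

```python
# Minimal fluent interface: every transformation returns a new Query,
# so calls can be chained instead of binding temporaries.
class Query:
    def __init__(self, rows):
        self.rows = rows

    def select(self, *cols):
        # Keep only the requested columns.
        return Query([{c: r[c] for c in cols} for r in self.rows])

    def filter(self, pred):
        # Keep only rows matching the predicate.
        return Query([r for r in self.rows if pred(r)])

    def count(self):
        return len(self.rows)

data = [{"name": "a", "score": 10}, {"name": "b", "score": 3}]

# Chained: the intermediate Query objects never get local variable names.
n = Query(data).select("name", "score").filter(lambda r: r["score"] > 5).count()
print(n)  # 1
```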
9. Authentication for an Event Hub is defined using a combination of an Event Publisher and which other component?
a. Transport Layer Security v1.2
b. Shared Access Signature
c. The storage account key
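For background on one of the options above: an Event Hubs-style Shared Access Signature is an HMAC-SHA256 signature over the URL-encoded resource URI and an expiry timestamp. A minimal sketch using only the standard library (the namespace, key name, and key below are made up for illustration):

```python
import base64
import hashlib
import hmac
import time
import urllib.parse

def make_sas_token(resource_uri, key_name, key, ttl_seconds=3600):
    """Build a Service Bus / Event Hubs style SAS token:
    sig = base64(HMAC-SHA256(key, '<encoded-uri>\n<expiry>'))."""
    expiry = str(int(time.time()) + ttl_seconds)
    encoded_uri = urllib.parse.quote_plus(resource_uri)
    to_sign = (encoded_uri + "\n" + expiry).encode("utf-8")
    sig = base64.b64encode(
        hmac.new(key.encode("utf-8"), to_sign, hashlib.sha256).digest()
    )
    return "SharedAccessSignature sr={}&sig={}&se={}&skn={}".format(
        encoded_uri, urllib.parse.quote_plus(sig), expiry, key_name
    )

# Hypothetical namespace and key, for illustration only.
token = make_sas_token(
    "https://my-namespace.servicebus.windows.net/my-hub",
    "RootManageSharedAccessKey",
    "not-a-real-key",
)
print(token[:25])  # SharedAccessSignature sr=
```

A client presents this token in the `Authorization` header; the service recomputes the HMAC with the named key to verify it.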
10. Azure Synapse Analytics is showing an error message, and you want to see information about the service to help solve the problem. What can you use to quickly check its availability?
a. Troubleshoot and diagnose problems
b. Azure Monitor
c. Monitor network performance
Here are the correct answers:
1. b. Explanation: High availability duplicates customer content for redundancy and to meet Service Level Agreements in Azure.
2. b. Explanation: Azure Cosmos DB is a globally distributed, multi-model database that can provide sub-second query performance.
3. a. Explanation: Azure Data Factory is a cloud integration tool that orchestrates data movement between different data stores.
4. a. True. Explanation: A replication policy is a characteristic of a storage account and applies to everything in that account. Two storage accounts are therefore required when one set of data needs a geo-replication strategy and another set needs a local replication strategy.
5. b. Explanation: Enabling the Hierarchical Namespace on the Advanced tab is what turns the storage account into a Data Lake Storage Gen2 account optimized for analytical workloads.
6. a. Explanation: JDBC stands for Java Database Connectivity, a Java API that Spark uses to connect to databases like MySQL, Hive, or other data stores. Neither ODBC nor the REST API layer is how Spark makes these connections.
7. a. Explanation: In Spark's default PERMISSIVE read mode, malformed records are not dropped; their raw text is kept in a column called "_corrupt_record".
8. b. Explanation: Chaining transformations avoids binding each intermediate DataFrame to a temporary local variable.
9. b. Explanation: Event Hubs authenticates publishers with Shared Access Signature (SAS) tokens.
10. a. Explanation: The "Diagnose and solve problems" blade in the Azure portal is the quickest way to surface service health and availability information.
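As a companion to the spark.read.json() question: by default Spark reads JSON in PERMISSIVE mode, keeping malformed lines rather than failing the whole read. A pure-Python sketch of that behavior (illustrative only — this is not Spark's implementation):

```python
import json

def read_json_permissive(lines, corrupt_col="_corrupt_record"):
    """Mimic Spark's default PERMISSIVE mode for spark.read.json():
    lines that fail to parse are kept as rows, with the raw text
    stored under the corrupt-record column, instead of aborting."""
    rows = []
    for line in lines:
        try:
            rows.append(json.loads(line))
        except json.JSONDecodeError:
            rows.append({corrupt_col: line})
    return rows

lines = ['{"id": 1}', '{"id": 2', '{"id": 3}']  # middle line is malformed
rows = read_json_permissive(lines)
print(rows[1])  # {'_corrupt_record': '{"id": 2'}
```

Spark also offers DROPMALFORMED (silently discard bad rows) and FAILFAST (raise on the first bad row) modes via the `mode` read option.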