New 70-776 Exam Dumps Collection from PassLeader in VCE and PDF Files (Question 31 – Question 35)

Valid 70-776 Dumps shared by PassLeader for Helping Passing 70-776 Exam! PassLeader now offers the newest 70-776 VCE dumps and 70-776 PDF dumps; the PassLeader 70-776 exam questions have been updated and the answers have been corrected. Get the newest PassLeader 70-776 dumps with VCE and PDF here: https://www.passleader.com/70-776.html (75 Q&As Dumps)

BTW, DOWNLOAD part of PassLeader 70-776 dumps from Cloud Storage: https://drive.google.com/open?id=1R1iVzArLOI8VIIsY-pYYTq5ktJzHBkJR

QUESTION 31
Note: This question is part of a series of questions that present the same scenario. For your convenience, the scenario is repeated in each question. Each question presents a different goal and answer choices, but the text of the scenario is exactly the same in each question in this series.
Start of repeated scenario.
You are migrating an existing on-premises data warehouse named LocalDW to Microsoft Azure. You will use an Azure SQL data warehouse named AzureDW for data storage and an Azure Data Factory named AzureDF for extract, transform, and load (ETL) functions. For each table in LocalDW, you create a table in AzureDW. On the on-premises network, you have a Data Management Gateway. Some source data is stored in Azure Blob storage. Some source data is stored on an on-premises Microsoft SQL Server instance. The instance has a table named Table1. After data is processed by using AzureDF, the data must be archived and accessible forever. The archived data must meet a Service Level Agreement (SLA) for availability of 99 percent. If an Azure region fails, the archived data must always be available for reading.
End of repeated scenario.
You need to connect AzureDF to the storage account. What should you create?

A.    a gateway
B.    a dataset
C.    a linked service
D.    a pipeline

Answer: C
Explanation:
https://docs.microsoft.com/en-us/azure/data-factory/v1/data-factory-azure-blob-connector
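
For context on what option C looks like in practice: in Azure Data Factory v1, a storage account is connected through an AzureStorage linked service that carries the connection string. A minimal sketch (the linked-service name, account name, and key are placeholders):

```json
{
  "name": "StorageLinkedService",
  "properties": {
    "type": "AzureStorage",
    "typeProperties": {
      "connectionString": "DefaultEndpointsProtocol=https;AccountName=<accountname>;AccountKey=<accountkey>"
    }
  }
}
```

Datasets and pipelines in AzureDF then reference this linked service by name; the linked service is what actually holds the connection to the storage account.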

QUESTION 32
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution. After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You are monitoring user queries to a Microsoft Azure SQL data warehouse that has six compute nodes. You discover that compute node utilization is uneven. The rows_processed column from sys.dm_pdw_dms_workers shows a significant variation in the number of rows being moved among the distributions for the same table for the same query. You need to ensure that the load is distributed evenly across the compute nodes.
Solution: You add a clustered columnstore index.
Does this meet the goal?

A.    Yes
B.    No

Answer: B
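
A clustered columnstore index changes how rows are stored within each distribution, not how rows are assigned to distributions, so it cannot fix skew; the uneven row movement points to a poorly chosen hash-distribution column. A hedged sketch of how you might diagnose the skew and redistribute the table with CTAS (table and column names are hypothetical):

```sql
-- Inspect row counts per distribution to confirm the skew
DBCC PDW_SHOWSPACEUSED('dbo.FactSales');

-- Rebuild the table hash-distributed on a higher-cardinality column
CREATE TABLE dbo.FactSales_New
WITH
(
    DISTRIBUTION = HASH(CustomerKey),
    CLUSTERED COLUMNSTORE INDEX
)
AS
SELECT * FROM dbo.FactSales;
```

After verifying the new distribution is even, the old table can be dropped and the new one renamed in its place.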

QUESTION 33
You have a Microsoft Azure subscription that contains an Azure Data Factory pipeline. You have an RSS feed that is published on a public website. You need to configure the RSS feed as a data source for the pipeline. Which type of linked service should you use?

A.    Web
B.    OData
C.    Azure Search
D.    Azure Data Lake Store

Answer: A
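
For reference, an Azure Data Factory v1 Web linked service is defined by a base URL and an authentication type. A minimal sketch for an anonymous public feed (the name and URL are placeholders):

```json
{
  "name": "WebLinkedService",
  "properties": {
    "type": "Web",
    "typeProperties": {
      "url": "https://example.com/feed/",
      "authenticationType": "Anonymous"
    }
  }
}
```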

QUESTION 34
You have sensor devices that report data to Microsoft Azure Stream Analytics. Each sensor reports data several times per second. You need to create a live dashboard in Microsoft Power BI that shows the performance of the sensor devices. The solution must minimize lag when visualizing the data. Which function should you use for the time-series data element?

A.    LAG
B.    SlidingWindow
C.    System.TimeStamp
D.    TumblingWindow

Answer: D
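
A tumbling window emits one aggregate per device at a fixed, non-overlapping interval, which keeps the volume pushed to the Power BI streaming dataset low and the dashboard responsive. A minimal sketch of such a Stream Analytics query (the input, output, and field names are assumptions):

```sql
-- Aggregate sensor readings into 5-second non-overlapping windows
SELECT
    deviceId,
    AVG(reading) AS avgReading,
    System.TimeStamp AS windowEnd
INTO
    PowerBIOutput
FROM
    SensorInput TIMESTAMP BY eventTime
GROUP BY
    deviceId,
    TumblingWindow(second, 5)
```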

QUESTION 35
You have a Microsoft Azure SQL data warehouse that has 10 compute nodes. You need to export 10 TB of data from a data warehouse table to several new flat files in Azure Blob storage. The solution must maximize the use of the available compute nodes. What should you do?

A.    Use the bcp utility.
B.    Execute the CREATE EXTERNAL TABLE AS SELECT statement.
C.    Create a Microsoft SQL Server Integration Services (SSIS) package that has a data flow task.
D.    Create a Microsoft SQL Server Integration Services (SSIS) package that has an SSIS Azure Blob Storage task.

Answer: B
Explanation:
CREATE EXTERNAL TABLE AS SELECT (CETAS) exports data through PolyBase, which runs in parallel across all compute nodes; bcp and SSIS data flows move data through a single client connection.
https://docs.microsoft.com/en-us/sql/t-sql/statements/create-external-table-as-select-transact-sql
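
CETAS writes the result of a SELECT directly to external storage as flat files, parallelized across the compute nodes. A hedged sketch of the pattern (data source, credential, container, path, and table names are placeholders):

```sql
-- External data source pointing at the target Blob storage container
CREATE EXTERNAL DATA SOURCE AzureBlobStore
WITH
(
    TYPE = HADOOP,
    LOCATION = 'wasbs://<container>@<account>.blob.core.windows.net',
    CREDENTIAL = BlobStorageCredential
);

-- Delimited-text format for the exported files
CREATE EXTERNAL FILE FORMAT TextFileFormat
WITH
(
    FORMAT_TYPE = DELIMITEDTEXT,
    FORMAT_OPTIONS (FIELD_TERMINATOR = '|')
);

-- Export the table contents to flat files in parallel via PolyBase
CREATE EXTERNAL TABLE dbo.FactSales_Export
WITH
(
    LOCATION = '/export/factsales/',
    DATA_SOURCE = AzureBlobStore,
    FILE_FORMAT = TextFileFormat
)
AS
SELECT * FROM dbo.FactSales;
```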
