
Microsoft 70-475 Exam Questions

Status: RETIRED
Exam Name: Designing and Implementing Big Data Analytics Solutions
Exam Code: 70-475
Related Certification(s): Microsoft MCSE: Cloud Platform and Infrastructure Certification
Certification Provider: Microsoft
Number of 70-475 practice questions in our database: 122 (updated: 25-06-2019)
Expected 70-475 Exam Topics, as suggested by Microsoft:
  • Topic 1: Design and provision compute clusters; Design for batch processing; Design for data security; Design and provision compute resources; Design for real-time processing; Orchestrate data processing activities in a data-driven workflow; Design a deployme
Discuss Microsoft 70-475 Topics, Questions or Ask Anything Related

Currently there are no comments in this discussion. Be the first to comment!

Free Microsoft 70-475 Exam Actual Questions

Note: Premium Questions for 70-475 were last updated on 25-06-2019 (see below)

Question #1

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You plan to deploy a Microsoft Azure SQL data warehouse and a web application.

The data warehouse will ingest 5 TB of data from an on-premises Microsoft SQL Server database daily. The web application will query the data warehouse.

You need to design a solution to ingest data into the data warehouse.

Solution: You use AzCopy to transfer the data as text files from SQL Server to Azure Blob storage, and then you use PolyBase to run Transact-SQL statements that refresh the data warehouse database.

Does this meet the goal?

Correct Answer: A

If you need the best performance, use PolyBase to import data into Azure SQL Data Warehouse.

Note: The speed of migration is often an overriding concern compared to ease of setup and maintainability, particularly when there is a large amount of data to move. When optimizing purely for speed, an approach that relies on bcp to export data to flat files, efficiently moves the files to Azure Blob storage, and uses the PolyBase engine to import from Blob storage works best.
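
For illustration only, here is a minimal T-SQL sketch of that PolyBase load pattern, assuming the exported text files already sit in a Blob container; every name used below (credential, data source, file format, table, columns) is hypothetical and not part of the question.

-- Hypothetical sketch: PolyBase load from Blob storage into Azure SQL Data Warehouse.
-- Assumes the exported text files are already in the container referenced below.

CREATE MASTER KEY;  -- run once per database if no master key exists yet

CREATE DATABASE SCOPED CREDENTIAL BlobStorageCredential
WITH IDENTITY = 'user', SECRET = '<storage-account-access-key>';

CREATE EXTERNAL DATA SOURCE DailyExtractBlob
WITH (
    TYPE = HADOOP,
    LOCATION = 'wasbs://daily-extract@examplestorage.blob.core.windows.net',
    CREDENTIAL = BlobStorageCredential
);

CREATE EXTERNAL FILE FORMAT PipeDelimitedText
WITH (
    FORMAT_TYPE = DELIMITEDTEXT,
    FORMAT_OPTIONS (FIELD_TERMINATOR = '|')
);

-- External table over the staged files; PolyBase reads them in parallel.
CREATE EXTERNAL TABLE dbo.FactSales_Ext
(
    SaleId   BIGINT,
    SaleDate DATE,
    Amount   DECIMAL(18, 2)
)
WITH (
    LOCATION = '/factsales/',
    DATA_SOURCE = DailyExtractBlob,
    FILE_FORMAT = PipeDelimitedText
);

-- CTAS performs the actual parallel load into the data warehouse.
CREATE TABLE dbo.FactSales
WITH (DISTRIBUTION = HASH(SaleId), CLUSTERED COLUMNSTORE INDEX)
AS
SELECT SaleId, SaleDate, Amount
FROM dbo.FactSales_Ext;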

References:

https://docs.microsoft.com/en-us/azure/sql-data-warehouse/sql-data-warehouse-migrate-data


Question #2

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You have a Microsoft Azure deployment that contains the following services:

Azure Data Lake

Azure Cosmos DB

Azure Data Factory

Azure SQL Database

You load several types of data to Azure Data Lake.

You need to load data from Azure SQL Database to Azure Data Lake.

Solution: You use a stored procedure.

Does this meet the goal?

Correct Answer: B

Question #3

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You have a Microsoft Azure deployment that contains the following services:

Azure Data Lake

Azure Cosmos DB

Azure Data Factory

Azure SQL Database

You load several types of data to Azure Data Lake.

You need to load data from Azure SQL Database to Azure Data Lake.

Solution: You use the AzCopy utility.

Does this meet the goal?

Correct Answer: B

Question #4

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You have a Microsoft Azure deployment that contains the following services:

Azure Data Lake

Azure Cosmos DB

Azure Data Factory

Azure SQL Database

You load several types of data to Azure Data Lake.

You need to load data from Azure SQL Database to Azure Data Lake.

Solution: You use the Azure Import/Export service.

Does this meet the goal?

Correct Answer: A

Question #5

Which technology should you recommend to meet the technical requirement for analyzing the social media data?

Correct Answer: A

Azure Stream Analytics is a fully managed event-processing engine that lets you set up real-time analytic computations on streaming data.

Scalability

Stream Analytics can handle up to 1 GB of incoming data per second. Integration with Azure Event Hubs and Azure IoT Hub allows jobs to ingest millions of events per second coming from connected devices, clickstreams, and log files, to name a few. Using the partition feature of event hubs, you can partition computations into logical steps, each with the ability to be further partitioned to increase scalability.
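
As a rough illustration of such a real-time computation, here is a minimal Stream Analytics query sketch (the query language is a T-SQL subset); the input, output, and field names are assumptions for this example, not part of the case study.

-- Hypothetical Stream Analytics query: count social media events per language in 30-second windows.
SELECT
    TweetLanguage,
    COUNT(*) AS TweetCount
INTO
    SocialMediaOutput                          -- an output defined on the job, e.g. Power BI or Blob storage
FROM
    SocialMediaInput TIMESTAMP BY CreatedAt    -- a streaming input, e.g. an Event Hub fed by the social media feed
GROUP BY
    TweetLanguage,
    TumblingWindow(second, 30)                 -- aggregate over fixed, non-overlapping 30-second windows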


