Microsoft Exam DP-300 Topic 1 Question 44 Discussion

Actual exam question for Microsoft's DP-300 exam
Question #: 44
Topic #: 1

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You have an Azure Data Lake Storage account that contains a staging zone.

You need to design a daily process to ingest incremental data from the staging zone, transform the data by executing an R script, and then insert the transformed data into a data warehouse in Azure Synapse Analytics.

Solution: You use an Azure Data Factory schedule trigger to execute a pipeline that executes a mapping data flow, and then inserts the data into the data warehouse.

Does this meet the goal?

Suggested Answer: B (No, this does not meet the goal)

If you need to transform data in a way that is not supported by Data Factory, you can create a custom activity (not a mapping data flow) with your own data processing logic and use that activity in the pipeline. For example, you can create a custom activity to run R scripts on an HDInsight cluster with R installed.


Reference: https://docs.microsoft.com/en-US/azure/data-factory/transform-data
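
For readers who want to see what the recommended pattern looks like, here is a minimal, hypothetical sketch: a pipeline whose transformation step is a Custom activity that runs an R script (rather than a mapping data flow), plus a daily schedule trigger that fires it. The definitions are written as Python dicts mirroring the JSON you would author in Data Factory; every name, path, command, and linked service below is an illustrative placeholder and is not taken from the exam question.

```python
# Illustrative only: JSON-style definitions (as Python dicts) for a pipeline that
# transforms data with a Custom activity running an R script, triggered daily.
# All names, commands, and linked services are hypothetical placeholders.
import json

# Pipeline with a Custom activity. linkedServiceName points at the compute that
# runs the command (the explanation above mentions an HDInsight cluster with R
# installed; Custom activities are also commonly backed by Azure Batch), and
# resourceLinkedService points at the storage account holding the script.
pipeline_definition = {
    "name": "DailyStagingToSynapse",                        # placeholder name
    "properties": {
        "activities": [
            {
                "name": "TransformWithRScript",
                "type": "Custom",
                "linkedServiceName": {
                    "referenceName": "ComputeLinkedService",            # placeholder
                    "type": "LinkedServiceReference",
                },
                "typeProperties": {
                    "command": "Rscript transform_incremental.R",       # placeholder
                    "resourceLinkedService": {
                        "referenceName": "StagingStorageLinkedService", # placeholder
                        "type": "LinkedServiceReference",
                    },
                    "folderPath": "scripts",
                },
            }
            # A follow-on activity (not shown) would load the transformed output
            # into the data warehouse in Azure Synapse Analytics.
        ]
    },
}

# Schedule trigger that runs the pipeline once per day.
trigger_definition = {
    "name": "DailyIngestTrigger",                           # placeholder name
    "properties": {
        "type": "ScheduleTrigger",
        "typeProperties": {
            "recurrence": {
                "frequency": "Day",
                "interval": 1,
                "startTime": "2024-01-01T00:00:00Z",
                "timeZone": "UTC",
            }
        },
        "pipelines": [
            {
                "pipelineReference": {
                    "referenceName": "DailyStagingToSynapse",
                    "type": "PipelineReference",
                }
            }
        ],
    },
}

# Print the definitions; in practice they would be authored in ADF Studio or
# deployed through ARM templates or the Data Factory REST API / SDKs.
print(json.dumps(pipeline_definition, indent=2))
print(json.dumps(trigger_definition, indent=2))
```

In a real pipeline, a downstream Copy activity (or a PolyBase/COPY-based load) would typically move the transformed output from the staging zone into the dedicated SQL pool in Azure Synapse Analytics.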
