Welcome to Pass4Success


Microsoft Exam DP-203 Topic 1 Question 103 Discussion

Actual exam question for Microsoft's DP-203 exam
Question #: 103
Topic #: 1

You have an Azure subscription that contains an Azure Data Factory data pipeline named Pipeline1, a Log Analytics workspace named LA1, and a storage account named account1.

You need to retain pipeline-run data for 90 days. The solution must meet the following requirements:

* The pipeline-run data must be removed automatically after 90 days.

* Ongoing costs must be minimized.

Which two actions should you perform? Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point.

Suggested Answer: A, B
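
The suggested answer can be scripted rather than clicked through the portal. A minimal Azure CLI sketch, assuming hypothetical resource group and factory names (`rg1`, `adf1`) and a placeholder subscription ID, none of which come from the question itself:

```shell
# Action A: configure the Data Factory hosting Pipeline1 to stream its
# pipeline-run logs (the "PipelineRuns" category) to the Log Analytics
# workspace LA1. The resource ID below is a placeholder.
az monitor diagnostic-settings create \
  --name "send-to-la1" \
  --resource "/subscriptions/<sub-id>/resourceGroups/rg1/providers/Microsoft.DataFactory/factories/adf1" \
  --workspace LA1 \
  --logs '[{"category": "PipelineRuns", "enabled": true}]'

# Action B is set on account1 through its Diagnostic settings (classic)
# blade; in the classic settings the per-category retention was expressed
# as a retention policy of the form:
#   "retentionPolicy": {"enabled": true, "days": 90}
# which deletes the log data automatically after 90 days.
```

This is a sketch of the configuration the answer describes, not a verified deployment script; check the current `az monitor diagnostic-settings` reference before relying on the exact flags.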

Contribute your Thoughts:

Cheryll
2 months ago
Wait, we're supposed to *minimize* costs? In that case, I'm going with C and D. Let the storage account handle the logs and keep those costs down, my friends!
upvoted 0 times
Haydee
18 days ago
Definitely. Plus, setting the data retention period to 90 days in the Log Analytics workspace will help manage costs.
upvoted 0 times
Levi
19 days ago
Yeah, that makes sense. Sending logs to the storage account is a cost-effective solution.
upvoted 0 times
Cyril
24 days ago
I think C and D are the way to go. Let the storage account handle the logs and keep costs down.
upvoted 0 times
Ellsworth
2 months ago
Ha! Easy peasy, just gotta remember that Azure loves its storage accounts. I'd say B and D are the way to go. Minimal cost, maximum efficiency!
upvoted 0 times
Quentin
2 months ago
Ah, this one's tricky! I'd go with C and D. Sending the logs directly to the storage account and setting the retention in Log Analytics seems like the best way to meet the requirements.
upvoted 0 times
Dell
1 month ago
C and D seem to be the correct actions to take. Storing the logs in the storage account and setting the retention period in Log Analytics is the way to go.
upvoted 0 times
Odette
2 months ago
I agree, C and D make the most sense. It's important to have the logs stored in the storage account and set the retention period in Log Analytics.
upvoted 0 times
Vanna
2 months ago
I think C and D are the right choices. That way the logs are stored in the storage account and the retention period is set in Log Analytics.
upvoted 0 times
Jani
2 months ago
But shouldn't we also set the retention period of account1 to 90 days from the Diagnostic settings?
upvoted 0 times
Lelia
2 months ago
I agree with Dominga. That way we can retain the pipeline-run data for 90 days.
upvoted 0 times
Roosevelt
3 months ago
I think the answer is B and D. Sending the logs to the storage account and setting the retention period in Log Analytics seems like the way to go to minimize ongoing costs.
upvoted 0 times
Lachelle
2 months ago
C) Configure Pipeline1 to send logs to account1.
upvoted 0 times
Brett
2 months ago
B) From the Diagnostic settings (classic) settings of account1, set the retention period to 90 days.
upvoted 0 times
Adelle
2 months ago
A) Configure Pipeline1 to send logs to LA1.
upvoted 0 times
Dominga
3 months ago
I think we should configure Pipeline1 to send logs to LA1.
upvoted 0 times
