Microsoft Exam DP-203 Topic 1 Question 94 Discussion

Actual exam question for Microsoft's DP-203 exam
Question #: 94
Topic #: 1

You have two Azure Blob Storage accounts named account1 and account2.

You plan to create an Azure Data Factory pipeline that will run at scheduled intervals to replicate newly created or modified blobs from account1 to account2.

You need to recommend a solution to implement the pipeline. The solution must meet the following requirements:

* Ensure that the pipeline only copies blobs that were created or modified since the most recent replication event.

* Minimize the effort to create the pipeline.

What should you recommend?

Suggested Answer: A
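
For readers who want to see what "copy only blobs created or modified since the most recent replication event" means in practice, here is a minimal Python sketch of that incremental-copy logic using the azure-storage-blob SDK. It is only an illustration of the behavior the pipeline must provide, not the Data Factory solution the question asks about; the account URLs, container name, credential, and watermark handling are all assumptions.

```python
# Minimal sketch (not the Data Factory solution) of incremental blob replication:
# copy only blobs whose last-modified time is newer than the previous run's watermark.
# Account URLs, the container name, and watermark persistence are illustrative assumptions.
from datetime import datetime
from azure.storage.blob import ContainerClient

SOURCE_URL = "https://account1.blob.core.windows.net"  # assumed source account URL
DEST_URL = "https://account2.blob.core.windows.net"    # assumed destination account URL
CONTAINER = "data"                                     # assumed container name

def replicate_since(watermark: datetime, credential) -> datetime:
    """Copy blobs modified after `watermark` from account1 to account2.

    Returns the newest last-modified time seen, to persist as the next watermark.
    """
    source = ContainerClient(SOURCE_URL, CONTAINER, credential=credential)
    dest = ContainerClient(DEST_URL, CONTAINER, credential=credential)
    latest = watermark
    for blob in source.list_blobs():
        # Skip blobs unchanged since the most recent replication event.
        if blob.last_modified <= watermark:
            continue
        source_blob = source.get_blob_client(blob.name)
        # Server-side copy; the destination account must be able to read the
        # source URL (e.g. via a SAS token appended to it), which is omitted here.
        dest.get_blob_client(blob.name).start_copy_from_url(source_blob.url)
        latest = max(latest, blob.last_modified)
    return latest
```

In Data Factory terms, the watermark comparison above is what an incremental (metadata-driven or built-in) copy task handles for you on a scheduled trigger, which is why the discussion below focuses on minimizing authoring effort.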

Contribute your Thoughts:

Juliana
4 months ago
You know, if we had three Azure Blob Storage accounts, this would be a real party. But for now, I'm going with Option C - sounds like the path of least resistance.
upvoted 0 times
Dorthy
3 months ago
I agree, Option C seems like the most efficient choice for this scenario.
upvoted 0 times
Serina
3 months ago
Option C sounds good to me too. It seems like the easiest way to meet the requirements.
upvoted 0 times
Ocie
4 months ago
I'm picturing the pipeline engineer furiously slamming their keyboard, trying to get the flowlet to work. Option C for the win!
upvoted 0 times
Queenie
4 months ago
Definitely, let's go with option C to minimize the effort and ensure we only copy the necessary blobs.
upvoted 0 times
Karon
4 months ago
Yeah, using the Copy Data tool with the Metadata-driven copy task would definitely make the pipeline creation easier.
upvoted 0 times
Daron
4 months ago
I agree, option C seems like the best choice for this scenario.
upvoted 0 times
Lai
4 months ago
Ha! I bet the built-in copy task is the easiest way to set this up. No need to overthink it, just let Azure Data Factory do the heavy lifting.
upvoted 0 times
Temeka
3 months ago
Let's go with the built-in copy task then.
upvoted 0 times
Lawana
3 months ago
I've used it before and it's really easy to set up.
upvoted 0 times
Jenifer
3 months ago
Yeah, I think it's the way to go for this scenario.
upvoted 0 times
Chantay
4 months ago
I agree, the built-in copy task seems like the simplest option.
upvoted 0 times
Stephaine
5 months ago
Option D uses a built-in copy task which will minimize the effort to create the pipeline.
upvoted 0 times
Tennie
5 months ago
Why do you think option D is better?
upvoted 0 times
Germaine
5 months ago
Hmm, I'm not sure about that. Doesn't the Data Flow activity give you more flexibility to customize the pipeline? It might be overkill for this use case, though.
upvoted 0 times
Wilda
4 months ago
Yeah, I agree. Maybe the Metadata-driven copy task would be a better fit.
upvoted 0 times
Brett
4 months ago
I think the Data Flow activity might be too much for this.
upvoted 0 times
Tomas
5 months ago
Option C looks like the way to go. Metadata-driven copy task sounds like it'll handle the incremental updates nicely.
upvoted 0 times
Nadine
5 months ago
I think we should go with option C for sure. It meets all the requirements we need for the pipeline.
upvoted 0 times
Shad
5 months ago
It definitely seems like the best option to minimize the effort needed to create the pipeline.
upvoted 0 times
Lennie
5 months ago
I agree, using the Copy Data tool with the Metadata-driven copy task seems like the most efficient solution.
upvoted 0 times
Micah
5 months ago
Option C looks like the way to go. Metadata-driven copy task sounds like it'll handle the incremental updates nicely.
upvoted 0 times
Stephaine
5 months ago
I disagree, I believe option D is the better choice.
upvoted 0 times
Tennie
5 months ago
I think we should go with option C.
upvoted 0 times
