
Google Exam Professional-Cloud-Architect Topic 11 Question 84 Discussion

Actual exam question from Google's Professional Cloud Architect exam
Question #: 84
Topic #: 11
[All Google Cloud Architect Professional Questions]

Your company wants to migrate their 10-TB on-premises database export into Cloud Storage. You want to minimize the time it takes to complete this activity, the overall cost, and the database load. The bandwidth between the on-premises environment and Google Cloud is 1 Gbps. You want to follow Google-recommended practices. What should you do?

Suggested Answer: A

The Data Transfer appliance is a Google-provided hardware device for moving large amounts of data from on-premises environments into Cloud Storage. It suits scenarios where the data is large relative to the available bandwidth between the on-premises environment and Google Cloud. By shipping the data offline, it avoids network bottlenecks and reduces bandwidth consumption, minimizing the migration time, the overall cost, and the load on the database. The appliance also encrypts the data at rest and in transit, preserving security and privacy. The other options are not optimal for this scenario: they either require a high-bandwidth network connection (B, C, D) or incur additional cost and complexity (B, C). Reference:
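As a quick sanity check on the bandwidth constraint in the question, the theoretical best-case wire time for 10 TB over a 1 Gbps link works out as follows (a rough sketch of my own; real throughput will be lower due to protocol overhead and the read load on the database):

```python
# Back-of-envelope estimate: time to move 10 TB over a 1 Gbps link,
# assuming full, sustained link utilization (real transfers are slower).
data_tb = 10
data_bits = data_tb * 10**12 * 8   # 10 TB (decimal) expressed in bits
link_bps = 1 * 10**9               # 1 Gbps link

seconds = data_bits / link_bps
hours = seconds / 3600
print(f"{seconds:.0f} s ≈ {hours:.1f} hours")  # prints: 80000 s ≈ 22.2 hours
```

Even in this ideal case the transfer occupies the link for the better part of a day, which is the kind of sustained network and database load the offline appliance route avoids.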

https://cloud.google.com/data-transfer-appliance/docs/overview

https://cloud.google.com/blog/products/storage-data-transfer/introducing-storage-transfer-service-for-on-premises-data


Contribute your Thoughts:

Vicente
3 months ago
I'm with Cheryl on this one. Dataflow is the way to go. It's fast, efficient, and you can't beat Google's recommended practices. Although, I do wonder if they'll give us a free Data Transfer appliance as a reward for choosing the 'right' answer.
upvoted 0 times
Wilbert
3 months ago
Haha, Claribel, Dataflow may be a bit more complex, but it's probably worth it for a 10-TB migration. Plus, who doesn't love some good old-fashioned 'cloud magic'?
upvoted 0 times
Kerrie
2 months ago
C) Develop a Dataflow job to read data directly from the database and write it into Cloud Storage
upvoted 0 times
Bettye
3 months ago
A) Use the Data Transfer appliance to perform an offline migration
upvoted 0 times
Carey
3 months ago
C) Develop a Dataflow job to read data directly from the database and write it into Cloud Storage
upvoted 0 times
In
3 months ago
A) Use the Data Transfer appliance to perform an offline migration
upvoted 0 times
Claribel
4 months ago
Hmm, I'm not sure about Dataflow. Isn't that thing a bit complicated? I'm leaning more towards Option D - compressing the data and using gsutil seems straightforward and should get the job done.
upvoted 0 times
Glory
3 months ago
Yeah, compressing the data and using gsutil sounds like a good plan.
upvoted 0 times
Adelina
3 months ago
I agree, Option D seems more straightforward.
upvoted 0 times
Whitley
3 months ago
I think Dataflow might be a bit complicated.
upvoted 0 times
Cheryl
4 months ago
I agree, Dataflow seems like the way to go. It's a Google-recommended practice and will probably be the most cost-effective solution for a 10-TB migration.
upvoted 0 times
Lauryn
3 months ago
I agree, Dataflow seems like the way to go. It's a Google-recommended practice and will probably be the most cost-effective solution for a 10-TB migration.
upvoted 0 times
Gearldine
3 months ago
C) Develop a Dataflow job to read data directly from the database and write it into Cloud Storage
upvoted 0 times
Reita
4 months ago
A) Use the Data Transfer appliance to perform an offline migration
upvoted 0 times
Ernestine
4 months ago
I agree with Chantell, it will help minimize the time and cost.
upvoted 0 times
Minna
4 months ago
Option C looks like the best choice here. A Dataflow job can directly read from the database and write to Cloud Storage, which should be faster and more efficient than the other options.
upvoted 0 times
Aliza
3 months ago
Yes, using a Dataflow job to read data directly from the database and write it into Cloud Storage should help minimize the time and cost of the migration.
upvoted 0 times
Luz
3 months ago
I agree, Option C seems like the most efficient choice for this scenario.
upvoted 0 times
Lachelle
3 months ago
Yes, using a Dataflow job to directly read and write data to Cloud Storage would definitely help minimize the time and cost of the migration.
upvoted 0 times
Ressie
4 months ago
I agree, Option C seems like the most efficient choice for this scenario.
upvoted 0 times
Glynda
4 months ago
Yes, using a Dataflow job to read data directly from the database and write it into Cloud Storage would definitely help minimize the time and cost.
upvoted 0 times
Jamal
4 months ago
I agree, Option C seems like the most efficient choice for this migration.
upvoted 0 times
Chantell
4 months ago
I think we should use the Data Transfer appliance for offline migration.
upvoted 0 times
