Welcome to Pass4Success


Google Exam Professional Cloud Database Engineer Topic 3 Question 34 Discussion

Actual exam question for Google's Professional Cloud Database Engineer exam
Question #: 34
Topic #: 3
[All Professional Cloud Database Engineer Questions]

You are building a data warehouse on BigQuery. Sources of data include several MySQL databases located on-premises.

You need to transfer data from these databases into BigQuery for analytics. You want to use a managed solution that has low latency and is easy to set up. What should you do?

A. Create extracts from your on-premises databases periodically, and push these extracts to Cloud Storage. Upload the changes into BigQuery, and merge them with existing tables.

B. Use Cloud Data Fusion and scheduled workflows to extract data from the MySQL databases, transform it into the appropriate schema, and load it into BigQuery.

C. Use Datastream to connect to your on-premises MySQL databases and create a stream that writes to Cloud Storage. Use Dataflow to process the streamed data and load it into BigQuery.

D. Use Database Migration Service to replicate the data to a Cloud SQL for MySQL instance, and access the replicated tables from BigQuery.
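For context on option A's final step, "merge them with existing tables" in BigQuery typically means running a MERGE (upsert) statement against the staged data. Below is a minimal sketch that builds such a statement; the table and column names (`mydataset.orders`, `order_id`, etc.) are hypothetical placeholders, not part of the question.

```python
# Sketch of option A's "merge with existing tables" step as a BigQuery
# MERGE statement. All table/column names here are hypothetical examples.

def build_merge_sql(target: str, staging: str, key: str, cols: list[str]) -> str:
    """Build a MERGE statement that upserts staged rows into a target table."""
    set_clause = ", ".join(f"T.{c} = S.{c}" for c in cols)
    col_list = ", ".join([key] + cols)
    src_list = ", ".join(f"S.{c}" for c in [key] + cols)
    return (
        f"MERGE `{target}` T\n"
        f"USING `{staging}` S\n"
        f"ON T.{key} = S.{key}\n"
        f"WHEN MATCHED THEN UPDATE SET {set_clause}\n"
        f"WHEN NOT MATCHED THEN INSERT ({col_list}) VALUES ({src_list})"
    )

# Example: merge a staged extract into the main table.
sql = build_merge_sql(
    "mydataset.orders", "mydataset.orders_staging",
    key="order_id", cols=["status", "updated_at"],
)
print(sql)
```

This periodic extract-and-merge approach works, but note it is batch-oriented, which is why it conflicts with the question's low-latency requirement.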

Suggested Answer: C

Contribute your Thoughts:

Ressie
5 months ago
Hmm, I wonder if the Dataflow step in option C is really necessary. Can't Datastream just write directly to BigQuery? I'm feeling a bit lazy today.
upvoted 0 times
Georgiana
5 months ago
Option A is a bit manual, but it might be the most cost-effective choice if I don't mind a bit of latency. Can't beat the simplicity of just uploading files to Cloud Storage.
upvoted 0 times
Salena
5 months ago
Yeah, A seems like a straightforward solution for transferring data.
upvoted 0 times
Beckie
5 months ago
A sounds like a good option if you're looking to save some money.
upvoted 0 times
Jaime
6 months ago
Database Migration Service is an interesting option, but I'm not sure I want to add an extra Cloud SQL instance into the mix. Seems a bit overkill for my use case.
upvoted 0 times
Colette
6 months ago
I'm a fan of Datastream. The ability to create a real-time stream from my on-premises database and load it into BigQuery is very appealing.
upvoted 0 times
Gladis
5 months ago
Yeah, it sounds like a convenient way to transfer data into BigQuery for analytics.
upvoted 0 times
Ashlyn
5 months ago
I think using Datastream to create a real-time stream from an on-premises database is a great idea.
upvoted 0 times
Armando
6 months ago
Option B looks like the easiest and most straightforward solution. Cloud Data Fusion seems to handle the entire process for me.
upvoted 0 times
Willetta
5 months ago
I think I'll go with option B as well. It seems like the most efficient way to transfer data from MySQL to BigQuery.
upvoted 0 times
Donette
5 months ago
I agree, using Cloud Data Fusion and scheduled workflows would simplify the data transfer process.
upvoted 0 times
Sherita
6 months ago
I agree with Doretha. Using scheduled workflows will make the process easier and more automated.
upvoted 0 times
Doretha
6 months ago
I think option B is the best choice. Cloud Data Fusion can help us extract and transform data efficiently.
upvoted 0 times
