Welcome to Pass4Success


Google Exam Professional-Cloud-Database-Engineer Topic 4 Question 29 Discussion

Actual exam question for Google's Professional Cloud Database Engineer exam
Question #: 29
Topic #: 4
[All Professional Cloud Database Engineer Questions]

You are building a data warehouse on BigQuery. Sources of data include several MySQL databases located on-premises.

You need to transfer data from these databases into BigQuery for analytics. You want to use a managed solution that has low latency and is easy to set up. What should you do?

A. Create extracts from your on-premises databases periodically, and push these extracts to Cloud Storage. Upload the changes into BigQuery, and merge them with existing tables.

B. Use Cloud Data Fusion and scheduled workflows to extract data from MySQL. Transform this data into the appropriate schema, and load this data into your BigQuery database.

C. Use Datastream to connect to your on-premises database and create a stream. Have Datastream write to Cloud Storage. Then use Dataflow to process the data into BigQuery.

D. Use Database Migration Service to replicate the data to a Cloud SQL for MySQL instance. Create federated tables in BigQuery on top of the replicated instances.
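To see why option A trades latency for simplicity, here is a minimal Python sketch of what its "merge extracts with existing tables" step amounts to conceptually: an upsert keyed by primary key, run only as often as the extracts are produced. The `id` field and row shapes are illustrative assumptions, not part of the question; in BigQuery this step would be a `MERGE` statement.

```python
def merge_extract(table, extract):
    """Upsert a periodic extract into an existing table, keyed by a primary key.

    `table` and `extract` are lists of row dicts with an assumed "id" key.
    New rows are inserted; rows whose id already exists are overwritten.
    """
    merged = {row["id"]: row for row in table}
    for row in extract:
        merged[row["id"]] = row  # insert new, overwrite changed
    return sorted(merged.values(), key=lambda r: r["id"])


existing = [{"id": 1, "name": "alice"}, {"id": 2, "name": "bob"}]
extract = [{"id": 2, "name": "bobby"}, {"id": 3, "name": "carol"}]
merge_extract(existing, extract)
# → [{'id': 1, 'name': 'alice'}, {'id': 2, 'name': 'bobby'}, {'id': 3, 'name': 'carol'}]
```

Because the merge only runs on each extract cycle, data freshness is bounded by the extract interval, which is the main argument against option A when the question asks for low latency.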

Suggested Answer: C
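For intuition on the suggested answer, option C is a change-data-capture (CDC) pipeline: Datastream emits per-row change events to Cloud Storage, and a Dataflow job (Google ships a prebuilt Datastream-to-BigQuery template for this) applies them to BigQuery continuously. The sketch below simulates that apply step in plain Python; the `change_type` and `payload` field names are illustrative assumptions about the event shape, not the exact Datastream schema.

```python
def apply_change_events(table, events):
    """Apply CDC-style change events, in order, to an in-memory table.

    `table` maps primary key -> row dict. Each event is assumed to carry a
    "change_type" (INSERT/UPDATE/DELETE) and the row "payload" with an "id".
    """
    state = dict(table)
    for ev in events:
        key = ev["payload"]["id"]
        if ev["change_type"] in ("INSERT", "UPDATE"):
            state[key] = ev["payload"]  # upsert the new row image
        elif ev["change_type"] == "DELETE":
            state.pop(key, None)  # drop the row if present
    return state


table = {1: {"id": 1, "name": "alice"}}
events = [
    {"change_type": "INSERT", "payload": {"id": 2, "name": "bob"}},
    {"change_type": "UPDATE", "payload": {"id": 1, "name": "alicia"}},
    {"change_type": "DELETE", "payload": {"id": 2}},
]
apply_change_events(table, events)
# → {1: {'id': 1, 'name': 'alicia'}}
```

Because events flow as soon as the source commits them, freshness is seconds rather than an extract cycle, and both Datastream and Dataflow are managed services, which is why C fits "managed, low latency, easy to set up" better than A.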

Contribute your Thoughts:

Ricki
4 months ago
But wouldn't using Cloud Data Fusion in option B provide a more straightforward way to extract and load data into BigQuery?
upvoted 0 times
...
Annmarie
4 months ago
I'm leaning towards option D. Replicating data to Cloud SQL for MySQL and creating federated tables in BigQuery sounds like a robust solution.
upvoted 0 times
...
Mary
4 months ago
I prefer option C. Using Datastream to create a stream and then processing data with Dataflow seems like a solid approach.
upvoted 0 times
...
Ricki
4 months ago
I think option A might be a good choice. Creating extracts and pushing them to Cloud Storage sounds efficient.
upvoted 0 times
...
Herminia
4 months ago
I see the appeal of option D as well. Using Database Migration Service can simplify the process.
upvoted 0 times
...
Omega
5 months ago
Interesting point, I can see the benefits of real-time data transfer.
upvoted 0 times
...
Kaycee
5 months ago
I personally would go with option C. It provides real-time data transfer and processing.
upvoted 0 times
...
Herminia
5 months ago
I disagree, option B using Cloud Data Fusion seems more robust and scalable.
upvoted 0 times
...
Omega
5 months ago
I think option A is a good choice. It's simple and efficient.
upvoted 0 times
Marguerita
4 months ago
I prefer option D because it involves replication and federated tables for easier transformation.
upvoted 0 times
...
Regenia
4 months ago
I see the benefit of using Datastream and Dataflow in option C for real-time data processing.
upvoted 0 times
...
Arthur
4 months ago
I think option B would be a better choice as it involves transforming the data into the appropriate schema.
upvoted 0 times
...
Alease
5 months ago
I agree, using extracts and Cloud Storage seems like a straightforward approach.
upvoted 0 times
...
...
Darrel
6 months ago
Option B with Cloud Data Fusion also sounds promising. The ability to transform the data as part of the loading process could be really useful. Plus, it's a managed service, so that could make it easier to set up.
upvoted 0 times
...
Aleisha
6 months ago
Hmm, I don't know. Replicating the data to Cloud SQL and then using federated tables in BigQuery sounds a bit convoluted to me. Plus, I'm not a big fan of vendor-specific solutions.
upvoted 0 times
...
Veronika
6 months ago
Hmm, I'm leaning towards option C with Datastream and Dataflow. That seems like a good way to get the data from the on-premises MySQL databases into BigQuery with low latency and minimal setup.
upvoted 0 times
...
Hassie
6 months ago
No kidding! This is like a job interview disguised as an exam question. But I'm kind of enjoying the challenge, to be honest.
upvoted 0 times
...
Malcom
6 months ago
Hey, that's a good point. Option C with Datastream and Dataflow might be a better bet. It seems a bit more flexible, and you can probably tweak the transformation process.
upvoted 0 times
...
Javier
6 months ago
Option B does seem like the easiest to set up, but I'm a little worried about the data transformation part. What if the schema isn't a perfect match?
upvoted 0 times
...
Rikki
6 months ago
Haha, you guys are really digging into this. I feel like we should be getting paid for this level of analysis!
upvoted 0 times
...
Audra
6 months ago
Haha, good luck with that! These days, everything has to be a managed solution with low latency and easy setup. I'm leaning towards option B - Cloud Data Fusion sounds pretty slick.
upvoted 0 times
...
Kate
6 months ago
Yeah, I hear you. I was hoping for something a little more straightforward, you know? Maybe we could just export the data to a CSV and upload it to BigQuery.
upvoted 0 times
...
Sherell
6 months ago
C. Use Datastream to connect to your on-premises database and create a stream. Have Datastream write to Cloud Storage. Then use Dataflow to process the data into BigQuery.
upvoted 0 times
...
Eric
6 months ago
B. Use Cloud Data Fusion and scheduled workflows to extract data from MySQL. Transform this data into the appropriate schema, and load this data into your BigQuery database.
upvoted 0 times
...
Vicente
6 months ago
A. Create extracts from your on-premises databases periodically, and push these extracts to Cloud Storage. Upload the changes into BigQuery, and merge them with existing tables.
upvoted 0 times
...
...
Alana
6 months ago
Whoa, this is a tough one! I'm not sure I'm comfortable with any of these options. They all seem a bit complex for my liking.
upvoted 0 times
...

