
Google Exam Professional Cloud Architect Topic 1 Question 78 Discussion

Actual exam question for Google's Professional Cloud Architect exam
Question #: 78
Topic #: 1

Your company has an application that is running on multiple instances of Compute Engine. It generates 1 TB per day of logs. For compliance reasons, the logs need to be kept for at least two years. The logs need to be available for active query for 30 days. After that, they just need to be retained for audit purposes. You want to implement a storage solution that is compliant, minimizes costs, and follows Google-recommended practices. What should you do?

Suggested Answer: B

The recommended practice for managing logs generated on Compute Engine is to install the Cloud Logging agent on each instance and send the logs to Cloud Logging.

The collected logs are then routed through a Cloud Logging sink and exported to Cloud Storage.

Cloud Storage is the right export destination because the question requires retention handling based on the age of the data, which maps directly to Object Lifecycle Management rules.

In this scenario, the logs must be available for active queries for 30 days after they are written; after that, they only need to be retained for a longer period for audit purposes.

While the data needs to remain actively queryable, BigQuery can query it directly in Cloud Storage (via external tables); moving data older than 30 days to Coldline then yields a cost-optimal solution.
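To see why the Coldline transition matters for cost, here is a back-of-the-envelope comparison. The per-GB prices below are illustrative assumptions (roughly in line with typical published rates; check current GCP pricing before relying on them):

```python
# Assumed, illustrative monthly storage rates (USD per GB-month):
STANDARD_PER_GB_MONTH = 0.020
COLDLINE_PER_GB_MONTH = 0.004

GB_PER_DAY = 1024        # ~1 TB of logs generated per day
RETENTION_MONTHS = 24    # two-year compliance window

# Steady-state volume once the two-year window is full (~720 TB):
total_gb = GB_PER_DAY * 30 * RETENTION_MONTHS

# Monthly bill if everything stayed in Standard storage:
all_standard = total_gb * STANDARD_PER_GB_MONTH

# Monthly bill with the lifecycle rule: the newest 30 days stay in
# Standard for active queries, the remaining 23 months sit in Coldline.
hot_gb = GB_PER_DAY * 30
cold_gb = total_gb - hot_gb
tiered = hot_gb * STANDARD_PER_GB_MONTH + cold_gb * COLDLINE_PER_GB_MONTH

print(f"All Standard: ${all_standard:,.0f}/month")
print(f"Tiered:       ${tiered:,.0f}/month")
```

Under these assumed rates the tiered layout costs roughly a quarter of keeping everything in Standard, which is the cost argument behind option B.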

Therefore, the correct answer is as follows:

1. Install the Cloud Logging agent on all instances.

2. Create a sink that exports the logs to a regional Cloud Storage bucket.

3. Create an Object Lifecycle rule to move the files to a Coldline Cloud Storage bucket after one month.

4. Set up a bucket-level retention policy using bucket lock.
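Steps 3 and 4 above can be sketched concretely. The snippet below writes the lifecycle configuration JSON in the shape `gsutil lifecycle set` expects; the bucket name is a placeholder, and the `gsutil retention` commands in the comments are the documented way to apply and lock a retention policy (verify the exact flags against current gsutil docs):

```python
import json

# Lifecycle policy: transition objects to Coldline 30 days after creation.
lifecycle = {
    "rule": [
        {
            "action": {"type": "SetStorageClass", "storageClass": "COLDLINE"},
            "condition": {"age": 30},  # days since object creation
        }
    ]
}

# Write the policy file that gsutil consumes.
with open("lifecycle.json", "w") as f:
    json.dump(lifecycle, f, indent=2)

# Apply it, then lock a two-year retention policy (placeholder bucket name):
#   gsutil lifecycle set lifecycle.json gs://my-log-bucket
#   gsutil retention set 2y gs://my-log-bucket
#   gsutil retention lock gs://my-log-bucket
```

Note that once the retention policy is locked it cannot be shortened or removed, which is exactly the compliance guarantee the question asks for.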


Contribute your Thoughts:

Bette
6 months ago
Option D also seems like a good choice. Using a daily cron job to upload logs into a Cloud Storage bucket and then moving them to Coldline storage could work well.
upvoted 0 times
...
Beth
6 months ago
I think option A is better. Exporting logs into BigQuery for active queries and then setting a time_partitioning_expiration of 30 days seems efficient.
upvoted 0 times
...
Annalee
6 months ago
I think option B might be the best choice. Moving logs to Coldline storage after 30 days seems like a good way to minimize costs.
upvoted 0 times
...
Bette
6 months ago
I agree, it seems like they want us to choose the most cost-effective and compliant option.
upvoted 0 times
...
Annalee
6 months ago
I feel like this question is really testing our knowledge of Google Cloud storage solutions.
upvoted 0 times
...
Latrice
7 months ago
I see your point, Krissy. Having a retention policy at the bucket level for audit purposes is a smart move.
upvoted 0 times
...
Krissy
7 months ago
I prefer option B. Exporting logs into a regional Cloud Storage bucket and setting up an Object Lifecycle rule seems more cost-effective.
upvoted 0 times
...
Mel
7 months ago
I agree with you, Glory. It's important to set a time_partitioning_expiration of 30 days for compliance.
upvoted 0 times
...
Glory
7 months ago
I think we should install the Cloud Ops agent on all instances and export logs into a BigQuery table for active query.
upvoted 0 times
...
Stephanie
8 months ago
I'm not sure about the daily cron job options. Relying on scripts running on the instances themselves seems a bit risky to me. The Cloud Ops agent and Cloud Storage integration in option B feel a lot more robust.
upvoted 0 times
Francisca
8 months ago
B) 4. Configure a retention policy at the bucket level to create a lock.
upvoted 0 times
...
Dyan
8 months ago
B) 3. Create an Object Lifecycle rule to move files into a Coldline Cloud Storage bucket after one month.
upvoted 0 times
...
Kendra
8 months ago
B) 2. Create a sink to export logs into a regional Cloud Storage bucket.
upvoted 0 times
...
Fairy
8 months ago
B) 1. Install the Cloud Ops agent on all instances.
upvoted 0 times
...
...
Emily
8 months ago
I agree, option B is the way to go. The Cloud Ops agent will make it easy to get the logs exported, and the Object Lifecycle rules will handle the long-term storage and cost optimization. Definitely the Google-recommended approach.
upvoted 0 times
...
Malinda
8 months ago
Option B does sound like the best choice. Exporting logs to a regional Cloud Storage bucket and then moving them to Coldline for long-term retention is a solid plan. The retention policy will ensure the logs are kept for the required two years.
upvoted 0 times
...
Jolene
8 months ago
Hmm, this is a tricky one. We need a solution that is compliant, cost-effective, and follows Google's recommended practices. I'm leaning towards option B - it seems to cover all the requirements.
upvoted 0 times
...
