Welcome to Pass4Success


Google Exam Professional Cloud Architect Topic 2 Question 97 Discussion

Actual exam question for Google's Professional Cloud Architect exam
Question #: 97
Topic #: 2
[All Professional Cloud Architect Questions]

Your company has an application that is running on multiple instances of Compute Engine. It generates 1 TB per day of logs. For compliance reasons, the logs need to be kept for at least two years. The logs need to be available for active query for 30 days. After that, they just need to be retained for audit purposes. You want to implement a storage solution that is compliant, minimizes costs, and follows Google-recommended practices. What should you do?

Suggested Answer: B

The recommended practice for managing logs generated by Compute Engine instances on Google Cloud is to install the Cloud Logging agent and send the logs to Cloud Logging.

The collected logs are then routed through a Cloud Logging sink and exported to Cloud Storage.

Cloud Storage is the right destination because the requirements call for lifecycle management based on how long the logs have been stored.

In this case, the logs must be available for active queries for 30 days after they are written; after that, they only need to be retained for audit purposes.

During the active-query window, BigQuery can query the data in Cloud Storage directly (via external tables), and moving objects older than 30 days to Coldline yields a cost-optimal solution.
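As a rough sanity check on why the Coldline transition matters, here is a back-of-the-envelope sketch (not from the source; the per-GB rates are illustrative list prices that should be checked against current Cloud Storage pricing) of the steady-state volumes implied by 1 TB/day, a 30-day active window, and two-year retention:

```python
# Steady-state storage volumes for the lifecycle described above.
# Assumptions (illustrative, not from the source): 1 TB of logs per day,
# 30 days in Standard storage, remainder of a 2-year retention in Coldline.
DAILY_LOGS_TB = 1
ACTIVE_DAYS = 30
RETENTION_DAYS = 2 * 365

standard_tb = DAILY_LOGS_TB * ACTIVE_DAYS                      # kept "hot"
coldline_tb = DAILY_LOGS_TB * (RETENTION_DAYS - ACTIVE_DAYS)   # archived

# Illustrative per-GB monthly rates (verify against current GCS pricing):
STANDARD_RATE = 0.020   # $/GB/month
COLDLINE_RATE = 0.004   # $/GB/month

tiered_cost = (standard_tb * 1024 * STANDARD_RATE
               + coldline_tb * 1024 * COLDLINE_RATE)
all_standard_cost = (standard_tb + coldline_tb) * 1024 * STANDARD_RATE

print(f"Standard tier: {standard_tb} TB, Coldline tier: {coldline_tb} TB")
print(f"Monthly cost with lifecycle rule: ~${tiered_cost:,.0f}")
print(f"Monthly cost all-Standard:        ~${all_standard_cost:,.0f}")
```

Even with rough numbers, the bulk of the data (700 TB of 730 TB) sits in the audit-only tier, which is why the lifecycle rule drives most of the savings.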

Therefore, the correct answer consists of the following steps:

1. Install the Cloud Logging agent on all instances.

2. Create a sink that exports the logs to a regional Cloud Storage bucket.

3. Create an Object Lifecycle rule to move the files to Coldline storage after one month.

4. Set up a bucket-level retention policy and lock the bucket.
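The four steps above can be sketched with `gcloud`/`gsutil` commands. This is an outline under assumptions, not a verbatim recipe: the bucket and sink names (`my-log-bucket`, `compute-logs-sink`) and the log filter are placeholders, and the agent install script should be checked against current Google documentation.

```shell
# 1. Install the Ops Agent (successor to the legacy Logging agent) on each instance.
curl -sSO https://dl.google.com/cloudagents/add-google-cloud-ops-agent-repo.sh
sudo bash add-google-cloud-ops-agent-repo.sh --also-install

# 2. Create a sink that routes Compute Engine logs to the bucket
#    (sink name, bucket name, and filter are placeholders).
gcloud logging sinks create compute-logs-sink \
    storage.googleapis.com/my-log-bucket \
    --log-filter='resource.type="gce_instance"'

# 3. Lifecycle rule: transition objects to Coldline after 30 days.
cat > lifecycle.json <<'EOF'
{
  "rule": [
    {
      "action": {"type": "SetStorageClass", "storageClass": "COLDLINE"},
      "condition": {"age": 30}
    }
  ]
}
EOF
gsutil lifecycle set lifecycle.json gs://my-log-bucket

# 4. Retention policy for the two-year compliance window, then lock it
#    (locking is irreversible).
gsutil retention set 2y gs://my-log-bucket
gsutil retention lock gs://my-log-bucket
```

Note that creating the sink prints a writer service account identity; that identity must be granted object-creation access on the bucket before logs start flowing.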


Contribute your Thoughts:

Jonell
16 days ago
I'm just hoping this exam doesn't turn into a 'log'-jam of questions. Get it? Log-jam? Oh, never mind...
upvoted 0 times
...
Ashlyn
22 days ago
Option B all the way! It's like playing a game of 'Log Tetris' - gotta keep those blocks (er, files) organized and cost-efficient.
upvoted 0 times
...
Romana
23 days ago
Option C is a no-go. Running a daily cron job on all instances to upload logs? That's just asking for trouble. Let's stick to the Google-recommended practices.
upvoted 0 times
Lashaun
21 hours ago
4. Configure a retention policy at the bucket level to create a lock.
upvoted 0 times
...
Caren
9 days ago
3. Create an Object Lifecycle rule to move files into a Coldline Cloud Storage bucket after one month.
upvoted 0 times
...
Effie
16 days ago
2. Create a sink to export logs into a regional Cloud Storage bucket.
upvoted 0 times
...
Twana
17 days ago
B) 1. Install the Cloud Ops agent on all instances.
upvoted 0 times
...
...
Caprice
1 month ago
Option D seems a bit convoluted. Why not just use the Cloud Ops agent to export logs directly to a Cloud Storage bucket? That's the easiest way to go.
upvoted 0 times
Carli
4 days ago
That's a good point. Option B seems to cover all the requirements for compliance and cost optimization.
upvoted 0 times
...
Rosenda
10 days ago
But wouldn't it be better to have a retention policy in place to ensure compliance and cost-effectiveness?
upvoted 0 times
...
Ryan
17 days ago
I agree, using the Cloud Ops agent for direct export sounds simpler and more efficient.
upvoted 0 times
...
...
Gail
2 months ago
I agree with you, B seems to be the most cost-effective solution.
upvoted 0 times
...
Carman
2 months ago
I'm leaning towards Option A. Exporting logs to a partitioned BigQuery table seems like a simpler and more efficient solution than dealing with multiple Cloud Storage buckets.
upvoted 0 times
...
Ressie
2 months ago
I think the best option is B.
upvoted 0 times
...
Starr
2 months ago
Option B looks like the best choice. Exporting logs to a regional Cloud Storage bucket and then moving them to Coldline after a month is a good way to balance accessibility and cost-effectiveness.
upvoted 0 times
Rolf
4 days ago
Setting a retention policy at the bucket level adds an extra layer of security for audit purposes.
upvoted 0 times
...
Margarett
5 days ago
Creating an Object Lifecycle rule to move files to Coldline after a month is a smart way to save on storage costs.
upvoted 0 times
...
Kayleigh
6 days ago
Installing the Cloud Ops agent on all instances is a crucial step in ensuring all logs are captured.
upvoted 0 times
...
Mari
9 days ago
I agree, option B seems like the most cost-effective solution while still meeting compliance requirements.
upvoted 0 times
...
Marguerita
19 days ago
Yes, using Object Lifecycle rules to automatically move files to Coldline Cloud Storage after a month is a smart way to manage costs while still meeting compliance requirements.
upvoted 0 times
...
Kimbery
20 days ago
I agree, it's important to have the logs accessible for active query for 30 days but then move them to a more cost-effective storage solution for audit purposes.
upvoted 0 times
...
...
