Google Exam Professional-Cloud-Architect Topic 8 Question 72 Discussion

Actual exam question for Google's Professional Cloud Architect exam
Question #: 72
Topic #: 8
[All Google Cloud Architect Professional Questions]

You want to allow your operations team to store logs from all the production projects in your Organization, without duplicating logs from other projects. All of the production projects are contained in a folder. You want to ensure that all logs for existing and new production projects are captured automatically. What should you do?

Suggested Answer: B
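For context, an aggregated sink like the one option B describes can be sketched with the gcloud CLI. This is a minimal illustration, not the graded answer; the organization ID, sink name, and bucket name below are placeholders:

```shell
# Create an aggregated log sink at the Organization level (option B).
# --include-children routes logs from every project under the
# organization, including projects created after the sink exists.
gcloud logging sinks create prod-logs-sink \
  storage.googleapis.com/example-ops-logs-bucket \
  --organization=123456789012 \
  --include-children

# After creation, grant the sink's writer identity (a service account
# printed in the command output) permission to write objects to the
# bucket, e.g. roles/storage.objectCreator on the bucket.
```

To scope the export to only the production projects, as several commenters note, the same command can target the Production folder instead by replacing `--organization=...` with `--folder=FOLDER_ID`, optionally combined with a `--log-filter` expression.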

Contribute your Thoughts:

Kara
6 months ago
But don't we need to make sure the logs are only being captured from the production projects and not any other projects? Option C seems more targeted to that requirement.
upvoted 0 times
...
Lera
6 months ago
I think option B might be the way to go. Creating an aggregated export at the Organization level seems like the most efficient way to get all the logs in one place.
upvoted 0 times
...
Moira
6 months ago
Yeah, me neither. I'm trying to figure out the best way to capture all the logs from the production projects without duplicating the data.
upvoted 0 times
Ming
6 months ago
D) Create log exports in the production projects. Set the log sinks to be BigQuery datasets in the production projects and grant IAM access to the operations team to run queries on the datasets.
upvoted 0 times
...
Huey
6 months ago
B) Create an aggregated export on the Organization resource. Set the log sink to be a Cloud Storage bucket in an operations project.
upvoted 0 times
...
Shawna
6 months ago
A) Create an aggregated export on the Production folder. Set the log sink to be a Cloud Storage bucket in an operations project
upvoted 0 times
...
...
Bobbie
6 months ago
Haha, you guys are really overthinking this. Just throw everything into BigQuery and let the ops team figure it out, right?
upvoted 0 times
...
