Question 59: Your business runs an application hosted on multiple Google Compute Engine instances. The application generates one terabyte (TB) of logs every day. For regulatory compliance, the logs must be retained for a minimum of two years. For the first thirty days, the logs must be available for active querying; after that, they only need to be kept for auditing purposes. You want to implement a storage solution that is compliant, minimizes cost, and follows Google-recommended best practices. What should you do?
A. 1. Install a Cloud Logging agent on every instance. 2. Create a sink that exports the logs into a regional Cloud Storage bucket. 3. Create an Object Lifecycle rule to move files into a Coldline Cloud Storage bucket after one month. 4. Configure a retention policy at the bucket level using Bucket Lock.
B. 1. Write a daily cron job that runs on all instances and uploads the logs into a Cloud Storage bucket. 2. Create a sink that exports the logs into a regional Cloud Storage bucket. 3. Create an Object Lifecycle rule to move files into a Coldline Cloud Storage bucket after one month.
C. 1. Install a Cloud Logging agent on every instance.
D. 1. Write a daily cron job that runs on all instances and uploads the logs into a partitioned BigQuery table. 2. Set a time-partitioning expiration of thirty days.
Correct Answer: A

Explanation: The best practice for managing logs generated on Compute Engine is to install the Cloud Logging agent on each instance and send the logs to Cloud Logging. The collected logs are then exported through a Cloud Logging sink to Cloud Storage. Cloud Storage is the right destination because the requirements call for lifecycle management based on storage age: the logs must support active queries for 30 days, after which they only need to be retained for auditing. During the first 30 days the data in Cloud Storage can still be queried actively (for example, via BigQuery's ability to query data stored in Cloud Storage), and moving objects older than 30 days to Coldline yields a cost-optimized solution. A bucket-level retention policy with Bucket Lock then enforces the two-year compliance requirement. Therefore, the correct answer is: 1. Install the Cloud Logging agent on all instances. 2. Create a sink that exports the logs to a regional Cloud Storage bucket. 3. Create an Object Lifecycle rule to move the files to a Coldline Cloud Storage bucket after one month. 4. Set up a bucket-level retention policy using Bucket Lock.
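The steps in the correct answer can be sketched with the gcloud/gsutil CLI. This is a minimal sketch, not part of the exam question: the bucket name (`compliance-logs`), sink name (`logs-to-gcs`), region, and log filter are placeholder assumptions, and the exact flags should be checked against current Google Cloud documentation.

```shell
# Sketch only: names, region, and filter below are placeholder assumptions.

# 1. Install the Cloud Logging agent on each instance
#    (agent install script path per Google's docs; verify before use).
curl -sSO https://dl.google.com/cloudagents/add-logging-agent-repo.sh
sudo bash add-logging-agent-repo.sh --also-install

# 2. Create a regional Cloud Storage bucket and a Logging sink
#    that exports logs into it.
gsutil mb -l us-central1 gs://compliance-logs
gcloud logging sinks create logs-to-gcs \
    storage.googleapis.com/compliance-logs \
    --log-filter='resource.type="gce_instance"'
# Note: after creating the sink, grant its service account
# the Storage Object Creator role on the bucket.

# 3. Object Lifecycle rule: move objects to Coldline after 30 days.
cat > lifecycle.json <<'EOF'
{
  "rule": [
    {
      "action": {"type": "SetStorageClass", "storageClass": "COLDLINE"},
      "condition": {"age": 30}
    }
  ]
}
EOF
gsutil lifecycle set lifecycle.json gs://compliance-logs

# 4. Two-year retention policy, then lock it with Bucket Lock
#    (locking is permanent and cannot be reduced or removed).
gsutil retention set 2y gs://compliance-logs
gsutil retention lock gs://compliance-logs
```

Note that locking the retention policy is irreversible: once locked, the two-year retention period can be extended but never shortened or removed, which is exactly what the compliance requirement calls for.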