Question-50: While analyzing your client's website logs for vulnerabilities, you find that the processing generates a large volume of log data. You have been asked to choose the data storage system for your organization's large website portfolio. Click data is streamed in from a custom website analytics package at a typical rate of 6,000 clicks per minute, with short bursts of up to 8,500 clicks per second. The data must be stored so that your data science and user experience teams can analyze it later. Which storage infrastructure is best suited to this?
A. Google Cloud SQL
B. Google Cloud Bigtable
C. Google Cloud Storage
D. Google Cloud Datastore
Correct Answer: B

Explanation: Cloud Bigtable satisfies all of the criteria in the question:

a) It handles low-latency, high-throughput workloads, which this use case requires in order to absorb bursts of up to 8,500 click events per second.

b) It integrates with a variety of ML and data-science services, supporting the OLAP-style analysis the data science and user experience teams will perform later.

Note that Bigtable storage is comparatively expensive. Since the data must be retained for future study, a common cost-saving pattern is to land the raw data in Cloud Storage and feed it into BigQuery with a Cloud Dataflow ETL pipeline.
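To make the ingestion side concrete, here is a minimal sketch of how click events might be keyed for Bigtable. The site IDs, field names, and key layout are illustrative assumptions, not part of the question. Bigtable performs best when row keys spread writes across tablets, so a common pattern for click streams is to lead with a site identifier rather than the raw timestamp, which would hotspot a single node during the 8,500-clicks/sec bursts:

```python
from datetime import datetime, timezone

def click_row_key(site_id: str, ts: datetime, click_id: str) -> str:
    """Build an illustrative Bigtable row key for a click event.

    Leading with site_id (not the timestamp) spreads writes across
    tablets; the reversed timestamp orders rows newest-first within
    a site, which suits "latest activity" scans.
    """
    # Reverse the epoch milliseconds so newer events sort first
    # lexicographically (10**13 ms is safely beyond year 2286).
    reversed_ms = 10**13 - int(ts.timestamp() * 1000)
    return f"{site_id}#{reversed_ms:013d}#{click_id}"

# Two clicks on the same site, one second apart:
t1 = datetime(2023, 5, 1, 12, 0, 0, tzinfo=timezone.utc)
t2 = datetime(2023, 5, 1, 12, 0, 1, tzinfo=timezone.utc)
k1 = click_row_key("site-42", t1, "a1")
k2 = click_row_key("site-42", t2, "b2")
assert k2 < k1  # the newer event sorts first
```

The same key-design reasoning applies whichever client library writes the rows; it is the schema choice, not the write API, that lets Bigtable sustain the burst rate described in the question.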