Question-5: You work at an eCommerce company that wants to analyze customer behavior in real time. Your data scientists have specified the format in which they need the data so that they can run SQL queries on it. You have already written custom code to convert the clickstream data into that format. Which of the following is a good solution for implementing this requirement?
- You would use Kinesis Data Firehose with a Redshift cluster as the destination.
- You would use Kinesis Data Streams with DynamoDB as the destination, applying your custom logic in between to transform the data into the required format.
- You would use Kinesis Data Streams with Amazon RDS as the destination, applying your custom logic in between to transform the data into the required format.
- You would use Kinesis Data Streams and Kinesis Data Analytics, with AWS Lambda in between hosting your custom logic to transform the data into the required format.
Exp: Since we need to collect the data in real time, we can use either Kinesis Data Streams or Kinesis Data Firehose. However, we also need to apply custom logic, so we can use AWS Lambda to host that logic and transform the data before it reaches Kinesis Data Analytics. With Kinesis Data Analytics, the data scientists can run SQL queries on the incoming data in real time, and the query results can be saved directly to a designated store such as S3.
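To make the Lambda transformation step concrete, here is a minimal sketch of such a function. The pipe-delimited input layout, the field names, and the output schema are all hypothetical assumptions for illustration; the Kinesis event shape (base64-encoded payloads under `Records[].kinesis.data`) is the standard one Lambda receives from a Kinesis Data Streams trigger.

```python
import base64
import json

def transform_record(raw):
    """Reshape one raw clickstream line into the format the data
    scientists asked for. Assumed (hypothetical) input layout:
    pipe-delimited 'user_id|page|timestamp'."""
    user_id, page, ts = raw.strip().split("|")
    return {"user_id": user_id, "page": page, "event_time": ts}

def lambda_handler(event, context):
    """Entry point for a Lambda function triggered by a Kinesis
    Data Stream. Kinesis delivers each record payload base64-encoded."""
    transformed = []
    for record in event["Records"]:
        payload = base64.b64decode(record["kinesis"]["data"]).decode("utf-8")
        transformed.append(transform_record(payload))
    # The JSON documents below are what the downstream consumer
    # (e.g. the Kinesis Data Analytics input stream) would receive.
    return {"records": [json.dumps(r) for r in transformed]}
```

In a real deployment the transformed records would be written onward (for example to the Data Analytics input stream) rather than simply returned, but the decode-transform-emit structure stays the same.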