This certification preparation material will help you get jobs in the AWS field.

AWS Developer Certification : Associate Level
AWS SysOps Administrator Certification : Associate Level
AWS Solution Architect Certification : Associate Level
AWS Solution Architect : Professional Level
AWS Certified Security Specialty (SCS-C01)
AWS Professional Certification Exam
AWS Certified Big Data – Specialty (BDS-C00)
AWS Certified Machine Learning (MLS-C01) Certification Preparation Material
AWS Solution Architect : Training Associate
AWS Advanced Networking Certifications
AWS Exam Prepare : Kinesis Data Stream
Book : AWS Solution Architect Associate : Little Guide
AWS Security Specialization Certification : Little Guide (SCS-C01)
AWS Package Deal


While applying for a job, please mention that you were referred by admin@hadoopexam.com or by the website http://www.HadoopExam.com


 

Question-1: You work at an NBFC (non-banking financial company) that provides loans to customers with tenures ranging from 7 days to 20 years, including personal, commercial, and home loans. The company already has a website and a mobile application. To provide a better customer experience and some automated features, your web team has deployed a conversational interface powered by Amazon Lex. Using it, a customer can request a new loan, pay an EMI on an existing loan, ask for additional funds, and so on. You have been asked to develop a solution that provides near-real-time analytics and charts showing the performance of the deployed Lex bots. Which of the following combinations of components would solve this requirement?

  1. You would use Kinesis Data Streams to continuously stream conversation log data from CloudWatch Logs to an S3 bucket. AWS Kinesis Data Analytics would transform the raw data into JSON format, an AWS Glue crawler would automatically discover the catalog metadata, and QuickSight would be used to create the dashboard.
  2. You would use Kinesis Data Firehose to continuously stream conversation log data from CloudWatch Logs to an S3 bucket. The Firehose delivery stream would invoke an AWS Lambda function to transform the raw data into JSON data records, an AWS Glue crawler would automatically discover the catalog metadata, and QuickSight would be used to create the dashboard, with Athena as the data source.
  3. You would use Kinesis Data Streams to continuously stream chat data from the chat application to an S3 bucket. The Firehose delivery stream would use a custom script to transform the raw data into JSON data records, and QuickSight would be used to create the dashboard, with Kinesis Data Analytics as the data source.
  4. You would use Kinesis Data Firehose to continuously stream chat data from the chat application to an S3 bucket. The Firehose delivery stream would use a custom script to transform the raw data into JSON data records, and QuickSight would be used to create the dashboard, with Kinesis Data Analytics as the data source.
  5. You would use Kinesis Data Firehose to continuously stream chat data from the chat application to a DynamoDB table. The Firehose delivery stream would use a custom script to transform the raw data into JSON data records, and QuickSight would be used to create the dashboard, with Kinesis Data Analytics as the data source.

 

Answer: 2

Exp: First, we need to understand the relevant Amazon Lex feature: conversation logs. Conversation logs record the actual bot interactions and give you near-real-time visibility into your Lex bots. All of these bot interactions can be stored in an Amazon CloudWatch Logs log group. We can then use this conversation data to monitor the chat application and derive actionable insights for enhancing the bot and improving the user experience for your customers.
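
As a rough illustration of how that conversation-log data can be streamed onward from CloudWatch Logs (as described in the architecture below), here is a minimal boto3 sketch that subscribes the log group to a Kinesis Data Firehose delivery stream. The log group name, delivery stream ARN, and IAM role ARN are hypothetical placeholders, not values given in the question.

import boto3

logs = boto3.client("logs")

# Hypothetical names/ARNs for illustration only; replace with your own resources.
LOG_GROUP = "/aws/lex/loan-bot-conversation-logs"
FIREHOSE_ARN = "arn:aws:firehose:us-east-1:123456789012:deliverystream/lex-conversation-logs"
ROLE_ARN = "arn:aws:iam::123456789012:role/CWLToFirehoseRole"

# Subscribe the Lex conversation-log log group to the Firehose delivery stream,
# so every new log event is forwarded to Firehose (and on to S3) in near real time.
logs.put_subscription_filter(
    logGroupName=LOG_GROUP,
    filterName="lex-conversation-logs-to-firehose",
    filterPattern="",            # an empty pattern forwards all log events
    destinationArn=FIREHOSE_ARN,
    roleArn=ROLE_ARN,            # role that lets CloudWatch Logs write to Firehose
)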

You can design the architecture for the business intelligence dashboard as follows:

Use Kinesis Data Firehose to continuously stream conversation log data from CloudWatch Logs to an S3 bucket. In the Firehose delivery stream, employ a serverless AWS Lambda function to transform the raw data into JSON data records. Then use an AWS Glue crawler to automatically discover and catalog the metadata for this data so that you can query it with Amazon Athena. Finally, create a dashboard in Amazon QuickSight, connecting to Athena as the data source.
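
To make the transformation step concrete, below is a minimal sketch of the kind of AWS Lambda function a Firehose delivery stream could invoke for data transformation. It assumes the incoming records use the standard CloudWatch Logs subscription format (gzip-compressed, base64-encoded batches) and that each log event's message is itself a JSON conversation-log entry; it is an illustrative sketch, not the only possible implementation.

import base64
import gzip
import json

def lambda_handler(event, context):
    # Firehose data-transformation handler (sketch).
    # CloudWatch Logs delivers gzip-compressed, base64-encoded batches to Firehose;
    # unpack each batch, drop control messages, and re-emit one newline-delimited
    # JSON record per conversation-log event so Glue/Athena can query it in S3.
    output = []
    for record in event["records"]:
        payload = json.loads(gzip.decompress(base64.b64decode(record["data"])))

        if payload.get("messageType") != "DATA_MESSAGE":
            # CONTROL_MESSAGE (e.g. the initial subscription test) carries no log data
            output.append({"recordId": record["recordId"],
                           "result": "Dropped",
                           "data": record["data"]})
            continue

        # One JSON line per conversation-log event (messages are assumed to be JSON)
        lines = "".join(json.dumps(json.loads(e["message"])) + "\n"
                        for e in payload["logEvents"])
        output.append({"recordId": record["recordId"],
                       "result": "Ok",
                       "data": base64.b64encode(lines.encode("utf-8")).decode("utf-8")})

    return {"records": output}

With this transformation in place, the objects that land in S3 are newline-delimited JSON, which is a format the Glue crawler can catalog and Athena can query directly as QuickSight's data source.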