This certification preparation material will help you get jobs in the AWS field.

AWS Developer Certification : Associate Level
AWS SysOps Administrator Certification : Associate Level
AWS Solution Architect Certification : Associate Level
AWS Solution Architect : Professional Level
AWS Certified Security Specialty (SCS-C01)
AWS Professional Certification Exam
AWS Certified Big Data – Specialty (BDS-C00)
AWS Certified Machine Learning (MLS-C01) Certification Preparation Material
AWS Solution Architect : Training Associate
AWS Advanced Networking Certifications
AWS Exam Prepare : Kinesis Data Stream
Book : AWS Solution Architect Associate : Little Guide
AWS Security Specialization Certification : Little Guide SCS-C01
AWS Package Deal


While applying for a job, please mention that you were referred by admin@hadoopexam.com or by the website http://www.HadoopExam.com


 

Get all the questions covering the entire syllabus from here (2018-2019). This material is owned by HadoopExam.com. Please don't copy it; it's bad karma.


Question 19: You are working with a giant online retail com.................................. as well as application logs in real time so they can apply machine learning in real time, and they also need this data to be saved in an S3 bucket. Which of the following solutions is suitable for this requirement?

  1. Correct Answer
  2. You will create two scripts, one for the infrastructure logs and another for the application logs, and serve this data over TCP on separate ports. Your client application will then connect to these ports to read the data in real time.
  3. Correct Answer
  4. You will write a script that has MapReduce code and tags the application logs and infrastructure logs separately, then merges both logs and sends them over SQS. Your consumer application can then read those logs from SQS and segregate them based on the tags.

Correct Answer : A, C (i.e., options 1 and 3)

Detailed Explanation: The question is straightforward: they want both the IT infrastructure logs and the application logs in one place. Both logs are available; the only issue is how optimally you can send them in real time so that the consumer application can use them in real time for applying machine learning.

Whenever you see real-time data retrieval, you can start thinking of a streaming solution, which here means Kinesis Data Streams. If you find options related to that, they will be the answers; in this question those are options 1 and 3 (A and C). Note that we need to choose two options.

Option-1: .......

Option-2: This talks about writing a custom solution and then using TCP sockets to read the data in real time. Why would we do such development when AWS already provides an easy solution for this common requirement? Hence, you cannot consider it a correct answer.

Option-3: ....

Option-4: You can write a script that implements a MapReduce algorithm, but we don't think it is required to tag the logs. Yes, you could tag both logs separately, merge them at the originator side, create a message out of them, and send it over SQS; on the client side you would read those logs, separate them based on the tags, and then the consumer application would process them. This solution is possible, but from a design perspective it is not the right answer, as it adds a lot of complexity on both sides. A rough sketch of this approach follows below.
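Purely for illustration, the sketch below shows what an Option-4 style design might look like with boto3: tag each log line with a message attribute, push it to an SQS queue, and segregate by tag on the consumer side. The queue URL, tag values, and handler stubs are hypothetical placeholders, not part of the original question.

```python
# Minimal sketch (assumption-heavy) of the Option-4 approach: tag each log line,
# send it to an SQS queue, and segregate by tag on the consumer side.
# The queue URL below is a hypothetical placeholder.
import boto3

QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/merged-logs"  # placeholder
sqs = boto3.client("sqs")

def send_log(line, log_type):
    """Producer side: tag a single log line and push it to SQS."""
    sqs.send_message(
        QueueUrl=QUEUE_URL,
        MessageBody=line,
        MessageAttributes={
            "log_type": {"DataType": "String", "StringValue": log_type}
        },
    )

def consume_logs():
    """Consumer side: read messages and segregate them by the log_type tag."""
    resp = sqs.receive_message(
        QueueUrl=QUEUE_URL,
        MessageAttributeNames=["All"],
        MaxNumberOfMessages=10,
    )
    for msg in resp.get("Messages", []):
        log_type = msg["MessageAttributes"]["log_type"]["StringValue"]
        if log_type == "application":
            pass  # hand off to the application-log pipeline (placeholder)
        else:
            pass  # hand off to the infrastructure-log pipeline (placeholder)
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])
```

Even in this toy form, you can see the extra moving parts (tagging, merging, queue management, and client-side segregation) that make this design more complex than simply using Kinesis Data Streams.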

Kinesis Data Streams:

  1. Kinesis can help you collect data as a stream, and it can do so in real time.
  2. For that, we have to create a Kinesis data-processing application, which reads data from a data stream as data records.
  3. To create the data-processing application, you use the Kinesis Client Library (KCL), and you can run this application on EC2 instances.
  4. Kinesis Data Streams is part of the Kinesis streaming data platform, which also includes Kinesis Data Firehose, Kinesis Video Streams, and Kinesis Data Analytics.
  5. For example, you can collect data such as IT infrastructure logs, application logs, social media feeds, market data feeds, web clickstream data, etc. (a small producer/consumer sketch follows this list).
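To make the last point concrete, here is a minimal, hedged sketch of writing log records to a Kinesis data stream and reading them back with boto3. The stream name and log messages are hypothetical, and a production consumer would normally use the Kinesis Client Library (KCL) on EC2, as mentioned above, rather than polling shards directly.

```python
# Minimal sketch of sending log records to a Kinesis data stream and reading
# them back with boto3. The stream name below is a hypothetical placeholder;
# a real consumer would typically be built with the KCL instead.
import json
import boto3

STREAM_NAME = "retail-logs"  # placeholder stream name
kinesis = boto3.client("kinesis")

def put_log(source, message):
    """Producer: write one log record; the partition key groups records by source."""
    kinesis.put_record(
        StreamName=STREAM_NAME,
        Data=json.dumps({"source": source, "message": message}).encode("utf-8"),
        PartitionKey=source,
    )

def read_some_records():
    """Simple consumer: read records from the first shard (demo only, not KCL)."""
    shard_id = kinesis.describe_stream(StreamName=STREAM_NAME)[
        "StreamDescription"]["Shards"][0]["ShardId"]
    iterator = kinesis.get_shard_iterator(
        StreamName=STREAM_NAME,
        ShardId=shard_id,
        ShardIteratorType="TRIM_HORIZON",
    )["ShardIterator"]
    records = kinesis.get_records(ShardIterator=iterator, Limit=25)["Records"]
    return [json.loads(r["Data"]) for r in records]

# Example usage:
# put_log("infrastructure", "CPU utilization 92% on host i-0abc123")
# put_log("application", "Order 5512 checkout completed")
```

If the records also need to land in an S3 bucket, as the question requires, Kinesis Data Firehose can deliver the stream to S3 without custom consumer code.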