Question-10: You have data collected in an S3 bucket that you want to load into an AWS RDS instance. However, the data collected in the S3 bucket is not well formatted, so before inserting it into the RDS instance you want to convert it into a particular format, as part of an ETL job, so that it can be easily loaded. A custom function written using Lambda checks the size of the collected data; as soon as it reaches 10 GB, the ETL job should be initiated to load the data into the RDS instance. How can you achieve this requirement?
- You would be using EMR
- You would be using AWS Lambda
- You would be using AWS Glue
- You would be using Apache Spark and EMR
Answer: C
Exp: As given in the question, you want to load the data into the RDS instance, and before loading you want to clean up and pre-process the data to convert it into the required format so that it can be easily loaded into the RDS instance. This entire requirement is an ETL job, which can be easily implemented using AWS Glue. The Glue ETL job can be triggered from an AWS Lambda function, in which we implement the custom logic of checking the size of the data regularly; as soon as it reaches 10 GB, the Lambda function triggers the AWS Glue job.
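A minimal sketch of such a Lambda function is shown below. It sums the size of the objects under an S3 prefix and starts a Glue job once the total reaches 10 GB. The bucket name, prefix, and Glue job name are hypothetical placeholders, not values from the question.

```python
def prefix_size_bytes(objects):
    """Sum the Size field of S3 ListObjectsV2 object entries."""
    return sum(obj["Size"] for obj in objects)


THRESHOLD_BYTES = 10 * 1024 ** 3  # 10 GB trigger point from the question


def lambda_handler(event, context):
    # boto3 is imported inside the handler so the pure helper above
    # can be exercised without AWS credentials.
    import boto3

    s3 = boto3.client("s3")
    glue = boto3.client("glue")

    # Hypothetical names -- replace with your own bucket, prefix, and Glue job.
    bucket, prefix, job_name = "my-raw-data-bucket", "incoming/", "my-etl-job"

    # Walk every page of the listing and accumulate the object sizes.
    total = 0
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
        total += prefix_size_bytes(page.get("Contents", []))

    if total >= THRESHOLD_BYTES:
        # Kick off the Glue ETL job that formats the data and loads it into RDS.
        glue.start_job_run(JobName=job_name)
        return {"triggered": True, "bytes": total}
    return {"triggered": False, "bytes": total}
```

In practice this Lambda would run on a schedule (e.g. an EventBridge rule) or on S3 object-created events, so the size check happens regularly as the question requires.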