HadoopExam Learning Resources





    35000+ learners have upgraded or switched careers | Testimonials

All certification preparation material is for renowned vendors like Cloudera, MapR, EMC, Databricks, SAS, DataStax, Oracle, NetApp etc., whose certifications carry more value, reliability and recognition in the industry than any training institute's certifications.
Note: You can choose more than one product from below to have a custom package created, and send an email to hadoopexam@gmail.com to get a discount.

Do you know?

HDPCA : HDP Certification for Hadoop Admin (to deploy and manage a cluster) : The HDPCA certification exam tests HDP administration skills on the Hortonworks® Data Platform. This is a hands-on exam, and the problems need to be solved on a multi-node cluster. To give you a similar experience with similar problems, HadoopExam has launched a practice question bank containing in total 57 solved problem scenarios. A complimentary video to set up a 4-node cluster using VMware Workstation (30-day trial version) is also provided, and we will provide videos for solving problem scenarios wherever required (video tutorials are provided for selected problems only). Any future enhancements to this product will be provided without any additional fee, if used on the same machine. Refer to the FAQ for more detail.

Hadoop Annual Subscription

HDPCA : HDP Admin Certification 

Download Trial Version

Contact Us After Buying To Download or Get Full Version  

Phone : 022-42669636
Mobile : +91-8879712614

Regular Price: $240.00

Discounted Price: $59 (limited time only)

Note: If you are having trouble with credit card payment, please create a PayPal account and then pay.

GST : India Govt. Goods and Services Tax
India Bank Transfer
Regular Price: 9999 INR
Offer Price: 2999 INR
Click below for ICICI Bank account details
Indian Credit and Debit Card (PayuMoney)

All popular products for the Hadoop ecosystem are combined into one packaged solution used by learners. Limited time offer : PACK11HDPSPRKHRTNEXM7777

This entire package will prepare you for Hadoop as well as Spark and make you a BigData expert in a few days.
 (It will take approx. 100 hours to complete all this material.)
  1. HDPCD (No Java) Hadoop Developer Certification
  2. HDPCD Spark Developer Certification
  3. HDPCA Hortonworks Hadoop Admin Certification
  4. Hadoop Professional Training with Hands-On Sessions
  5. Spark Professional Training with Hands-On Sessions
  6. Spark SQL 2.x Professional Training
  7. HBase Professional Training
  8. Oozie Professional Training
  9. Spark SQL 2.x Fundamentals and Cookbook (Ebook Access)
  10. Scala Professional Training
  11. Python Professional Training
To check all available package offers check here, and to customize a package as per your need contact hadoopexam@gmail.com / admin@hadoopexam.com
Regular Price: $1066.00
Offer Price (Save Flat 50% + 20% off): $249
Note: If you are having trouble with credit card payment, please create a PayPal account and then pay.
India Bank Transfer
Regular Price: 39000 INR
Offer Price (Save Flat 50% + 20% off): 14999 INR

Click below for ICICI Bank account details
Indian Credit and Debit Card (PayuMoney)

Required Skills for HDPCA : HDP Admin Certification


  • Configure a local HDP repository : You are required to create a local repository of all the HDP software on one of the nodes in the cluster. Once the repository setup is done, you have to use it to install all the HDP software on your 4-node cluster (a sketch follows this list).
  • Install ambari-server and ambari-agent : This involves two steps (a sketch follows this list):
  •     Install Ambari Server : This needs to be done on only one node in the cluster (you can choose one of the master nodes).
  •     Install Ambari Agent : The Ambari agent should be installed on all the nodes in the cluster, so that each node can send data to the Ambari Server and be displayed on the Ambari Web UI.
  • Install HDP using the Ambari install wizard : Now create the 4-node cluster using the Ambari UI. In this you have to perform the steps below:
  •     Choose your cluster name
  •     Choose master and slave nodes
  •     Configure the local repository
  •     Register your nodes in the cluster
  •     Install Hadoop on all 4 nodes
  • Add a new node to an existing cluster : You will be given a node which is not yet attached to your cluster, along with its private SSH key, and you need to add it to your existing 4-node cluster.
  • Decommission a node : This does not mean deleting the node. It means you remove the node from the cluster and clean it up (remove all Hadoop DataNode configuration); the include/exclude sketch after the next list shows the underlying mechanism.
  • Add an HDP service to a cluster using Ambari : There are more than 15 services which can be added to an HDP cluster. You need to learn how to add these services individually or all together.
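
A minimal sketch of the local repository setup, assuming the HDP tarball has already been downloaded to the repo node (CentOS/RHEL); file names, versions and host names below are illustrative, not the exam's exact values:

    # Serve the repository over HTTP from one node
    yum install -y httpd createrepo
    systemctl start httpd

    # Unpack the downloaded HDP tarball under the web root
    mkdir -p /var/www/html/hdp
    tar -xzf HDP-2.6.5.0-centos7-rpm.tar.gz -C /var/www/html/hdp

    # On every cluster node, create /etc/yum.repos.d/hdp.repo pointing at this server:
    #   [HDP]
    #   name=HDP local
    #   baseurl=http://repo-node.example.com/hdp/HDP/centos7/
    #   enabled=1
    #   gpgcheck=0
    yum clean all && yum repolist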
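
A minimal sketch of the Ambari installation itself, assuming the Ambari repo file is already in place on every node; the master host name is illustrative:

    # On the chosen master node only: install, set up and start Ambari Server
    yum install -y ambari-server
    ambari-server setup -s          # -s accepts the defaults (embedded database, JDK)
    ambari-server start

    # On every node in the cluster: install the agent and point it at the server
    yum install -y ambari-agent
    sed -i 's/^hostname=.*/hostname=master1.example.com/' /etc/ambari-agent/conf/ambari-agent.ini
    ambari-agent start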

  • Define and deploy a rack topology script : You should be able to create a rack topology script and deploy it, so that DataNodes are arranged accordingly and whatever data you store in the cluster is placed as per the defined rack topology (a sketch follows this list).
  • Change the configuration of a service using Ambari : As you have more than 15 services in the cluster, you need to be able to do basic configuration for those services.
  • Configure the Capacity Scheduler : This is one of the schedulers for scheduling submitted jobs in the cluster. Being an admin, you should be able to allocate proper resources to each cluster user, so proper configuration is expected (a sketch follows this list).
  • Create a home directory for a user and configure permissions : Create a directory in HDFS for each Unix user. Also give the respective permissions, i.e. who can delete and update data in this HDFS directory (a sketch follows this list).
  • Configure the include and exclude DataNode files : These files control which hosts may register with the NameNode as DataNodes and which hosts are being decommissioned (a sketch follows this list).
  • Troubleshooting : If there are any issues in the cluster, you should be able to find and fix them. More detail is given in the practice questions.
  • Restart an HDP service : Once you have added services to the cluster, you will change their configuration and need to restart them.
  • View an application's log file : For each service, you should be able to find the log path and check whether there is any issue (a sketch follows this list).
  • Configure and manage alerts : You should be able to create a new alert configuration.
  • Troubleshoot a failed job : If a user has submitted a job and it failed, you should be able to find out why the job failed (see the log sketch after this list).
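
A minimal sketch of a rack topology script; the host-to-rack mapping is illustrative. In Ambari the script is wired in through the net.topology.script.file.name property in core-site:

    #!/bin/bash
    # rack-topology.sh - prints one rack path per host/IP argument
    for host in "$@"; do
      case "$host" in
        node1.example.com|node2.example.com) echo "/rack1" ;;
        node3.example.com|node4.example.com) echo "/rack2" ;;
        *) echo "/default-rack" ;;
      esac
    done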
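
A minimal sketch of a two-queue Capacity Scheduler layout (edited in Ambari under YARN > Configs > capacity-scheduler); queue names and percentages are illustrative:

    yarn.scheduler.capacity.root.queues=default,analytics
    yarn.scheduler.capacity.root.default.capacity=60
    yarn.scheduler.capacity.root.default.maximum-capacity=80
    yarn.scheduler.capacity.root.analytics.capacity=40
    yarn.scheduler.capacity.root.analytics.maximum-capacity=60

    # Apply the new queue definitions without restarting the ResourceManager
    yarn rmadmin -refreshQueues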
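
A minimal sketch of creating an HDFS home directory for a new user; the user name is illustrative and the commands run as the hdfs superuser:

    hdfs dfs -mkdir -p /user/alice
    hdfs dfs -chown alice:alice /user/alice
    hdfs dfs -chmod 750 /user/alice     # owner: full, group: read/execute, others: none
    hdfs dfs -ls /user                  # verify owner, group and permissions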
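
A minimal sketch of the include/exclude mechanism, which is also what drives decommissioning; the file path must match the dfs.hosts.exclude property in hdfs-site and is illustrative here:

    # dfs.hosts         -> file listing hosts allowed to register as DataNodes
    # dfs.hosts.exclude -> file listing hosts being decommissioned
    echo "node4.example.com" >> /etc/hadoop/conf/dfs.exclude

    # Ask the NameNode to re-read both files; node4 then starts decommissioning
    hdfs dfsadmin -refreshNodes
    hdfs dfsadmin -report       # check that node4's Decommission Status changes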
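
A minimal sketch of log inspection for services and for a failed job; the log directory and application ID are illustrative:

    # Service daemon logs normally live under /var/log/<service> on each node
    ls /var/log/hadoop/hdfs/

    # List failed YARN applications, then pull their aggregated logs
    yarn application -list -appStates FAILED
    yarn logs -applicationId application_1526000000000_0007 | less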

High Availability
  • Configure NameNode HA : Configure NameNode High Availability (a verification sketch follows this list).
  • Configure ResourceManager HA : Configure ResourceManager High Availability.
  • Copy data between two clusters using distcp : Copy data from one cluster to another cluster (a sketch follows this list).
  • Create a snapshot of an HDFS directory : Enable and create a snapshot of a given HDFS directory (a sketch follows this list).
  • Recover a snapshot : Restore data from a snapshot.
  • Configure HiveServer2 HA : Configure HiveServer2 High Availability.
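
NameNode and ResourceManager HA are enabled through Ambari wizards; a minimal sketch for verifying the resulting active/standby state (nn1/nn2 and rm1/rm2 are typical service IDs and are an assumption here):

    # NameNode HA state
    hdfs haadmin -getServiceState nn1
    hdfs haadmin -getServiceState nn2

    # ResourceManager HA state
    yarn rmadmin -getServiceState rm1
    yarn rmadmin -getServiceState rm2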
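
A minimal distcp sketch; NameNode host names and paths are illustrative:

    # Copy a directory from the source cluster to the destination cluster
    hadoop distcp hdfs://nn-source.example.com:8020/data/sales \
                  hdfs://nn-dest.example.com:8020/data/sales

    # -update re-copies only files that are missing or changed on the destination
    hadoop distcp -update hdfs://nn-source.example.com:8020/data/sales \
                          hdfs://nn-dest.example.com:8020/data/sales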
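
A minimal sketch of taking and recovering an HDFS snapshot; directory, snapshot and file names are illustrative:

    # Snapshots must first be allowed on the directory (hdfs superuser)
    hdfs dfsadmin -allowSnapshot /data/important
    hdfs dfs -createSnapshot /data/important snap1

    # Recover: copy a lost file back out of the read-only .snapshot area
    hdfs dfs -cp /data/important/.snapshot/snap1/report.csv /data/important/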

  • Install and configure Knox : Install the Knox Gateway and configure it for authentication to cluster services (a smoke-test sketch follows this list).
  • Install and configure Ranger : Install Ranger; you should be able to create policies and check the audit configuration.
  • Configure HDFS ACLs : On HDFS, you should be able to control access to each user's directory (a sketch follows this list).
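
Knox itself is installed and configured through Ambari; a minimal smoke test of WebHDFS through the Knox Gateway, assuming the default topology and the demo LDAP guest user (host, port and credentials are illustrative):

    curl -iku guest:guest-password \
      'https://knox-host.example.com:8443/gateway/default/webhdfs/v1/?op=LISTSTATUS'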
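
A minimal sketch of HDFS ACLs, assuming dfs.namenode.acls.enabled=true in hdfs-site; user and path names are illustrative:

    # Give user 'bob' read/execute on alice's directory without changing ownership
    hdfs dfs -setfacl -m user:bob:r-x /user/alice/shared
    hdfs dfs -getfacl /user/alice/shared    # verify the ACL entries

    # Remove the extra ACL entry again
    hdfs dfs -setfacl -x user:bob /user/alice/shared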

* Please read the FAQ section carefully.


Click to View What Learners Say about us : Testimonials

We have training subscribers from TCS, IBM, INFOSYS, ACCENTURE, APPLE, HEWITT, Oracle, NetApp, Capgemini etc.

Books on Spark (PDF) to read : Machine Learning with Spark, Fast Data Processing with Spark (Second Edition), Mastering Apache Spark, Learning Hadoop 2, Learning Real-time Processing with Spark Streaming, Apache Spark in Action, Apache Spark Cookbook, Learning Spark, Advanced Analytics with Spark. Download.


Disclaimer :
1. Hortonworks® is a registered trademark of Hortonworks.
2. Cloudera® is a registered trademark of Cloudera Inc.
3. Azure® is a registered trademark of Microsoft Inc.
4. Oracle® and Java® are registered trademarks of Oracle Inc.
5. SAS® is a registered trademark of SAS Inc.
6. IBM® is a registered trademark of IBM Inc.
7. DataStax® is a registered trademark of DataStax.
8. MapR® is a registered trademark of MapR Inc.