35000+ learners have upgraded or switched careers
Certification preparation material is available for renowned vendors such as
Cloudera, MapR, EMC, Databricks, SAS, DataStax, Oracle, NetApp etc., whose
certifications carry more value, reliability and recognition in the industry
than any training institute's own certification.
Note: You can choose more than one product to have a custom package created
from the list below; send an email to hadoopexam@gmail.com to get a discount.
Do you know?
- Training access: There is no time constraint, and any future enhancement
to a subscribed training is free.
- Question Bank (Simulator): You get free updates for additional or revised
questions (any future update on the same activation and the same product
code is free).
- Average score: Learners average 9/10 in the real exam, and many have even
scored 10/10 after using this HadoopExam study material.
- Training institutes: Many training institutes subscribe to HadoopExam
products to train their students.
- Get updates: Before appearing in the real exam, please drop us an email.
If any update is available with us (new questions, new tricks, syllabus
changes, new tips etc.), we will share it with you.
HDPCA : HDP Certification for Hadoop Admin (To Deploy and Manage a Cluster) :
The HDPCA certification exam tests HDP administration skills on the
Hortonworks® Data Platform. It is a hands-on exam, and problems need to be
solved on a multi-node cluster. To give a similar experience with similar
problems, HadoopExam has launched a practice question bank containing
57 solved problem scenarios in total. A complimentary video is also provided
for setting up a 4-node cluster using VMware Workstation (30-day trial
version). We will also provide videos for solving problem scenarios wherever
required (video tutorials are provided for selected problems only). Any
future enhancement to this product will be provided without any additional
fee, if used on the same machine. Refer to the FAQ for more detail.

HDPCA : HDP Admin
Certification
Download
Trial Version
Contact Us After Buying To
Download or Get Full Version
admin@hadoopexam.com
hadoopexam@gmail.com
Phone : 022-42669636
Mobile : +91-8879712614
Regular Price: $240.00
Discounted: $59 (Limited time only)
Note: If you have trouble with credit card payment, please create a
PayPal account and pay through it.
GST : India Govt. Goods and Services Tax
India Bank Transfer: Click below for ICICI Bank account details
Indian Credit and Debit Card (PayuMoney)
All popular products for the Hadoop ecosystem have been combined into a
single packaged solution for learners.
Limited time offer : PACK11HDPSPRKHRTNEXM7777
Required Skills for HDPCA
: HDP Admin Certification
Installation
- Configure a local HDP repository : You are required to create a local
repository of all the HDP software on one of the nodes in the cluster. Once
repository setup is done, you have to use it to install all the HDP
software on your 4-node cluster.
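The local-repository step above can be sketched as below; the HDP version, repository host name, and paths are illustrative assumptions, not values from the exam.

```shell
# Sketch: local HDP repository (version, host and paths are assumptions).
# On the repo node, install tooling, extract the HDP tarball under the web
# root, and index it (commands shown for a CentOS/RHEL node):
#   yum install -y httpd yum-utils createrepo
#   tar -xzf HDP-2.6.5.0-centos7-rpm.tar.gz -C /var/www/html/
#   createrepo /var/www/html/HDP/centos7/
# Then point every node at it with a .repo file:
cat > hdp.repo <<'EOF'
[HDP-2.6.5.0]
name=HDP local repository
baseurl=http://repo-node.example.com/HDP/centos7/
gpgcheck=0
enabled=1
EOF
# Copy hdp.repo into /etc/yum.repos.d/ on all 4 nodes before installing.
```

Once the file is in place on every node, `yum` resolves all HDP packages from your local web server instead of the internet.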
- Install ambari-server and ambari-agent : This involves two steps
- Install Ambari Server : It needs to be done on only one of the nodes
in the cluster (you can choose one of the master nodes)
- Install Ambari Agent : The Ambari agent should be installed on all the
nodes in the cluster, so each node can send data to the Ambari Server to
be displayed on the Ambari Web UI.
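The two install steps above can be sketched as follows; the master host name is an assumption, and the local file stands in for the real agent config path.

```shell
# Sketch: the two Ambari install steps (host names are assumptions).
# On ONE master node only:
#   yum install -y ambari-server
#   ambari-server setup -s      # -s = silent, accept defaults (embedded DB, JDK)
#   ambari-server start
# On ALL 4 nodes:
#   yum install -y ambari-agent
# Each agent must point at the Ambari Server host; the file below stands in
# for /etc/ambari-agent/conf/ambari-agent.ini:
printf '[server]\nhostname=localhost\n' > ambari-agent.ini
sed -i 's/^hostname=.*/hostname=master1.example.com/' ambari-agent.ini
#   ambari-agent start          # then the node appears in the Ambari Web UI
```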
- Install HDP using the Ambari install wizard : Now create the 4-node
cluster using the Ambari UI. This involves the steps below
- Choose your cluster name
- Choose master and slave nodes
- Configure the local repository
- Register your nodes in the cluster
- Install Hadoop on all 4 nodes
- Add a new node to an existing cluster : You will be given a node which is
not yet attached to your cluster, along with its private SSH key, and you
need to add it to your existing 4-node cluster.
- Decommission a node : This does not mean deleting the node. It means you
remove the node from the cluster and clean it up (remove all Hadoop
DataNode configuration).
- Add an HDP service to a cluster using Ambari : There are more than 15
services which can be added to an HDP cluster. You need to learn how to
add these services individually or all together.
Configuration
- Define and deploy a rack topology script : You should be able to create a
rack topology script and deploy it. Accordingly, DataNodes can be arranged,
and whatever data you store in the cluster will be placed as per the
defined rack topology.
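A minimal sketch of such a script is below; the host names, IP ranges, and rack names are illustrative assumptions. HDFS invokes the script with one or more hosts and expects one rack path per host on stdout.

```shell
# Sketch of a rack topology script (hosts/IPs/racks are assumptions).
rack_of() {
  case "$1" in
    node1*|node2*|10.0.1.*) echo "/rack1" ;;
    node3*|node4*|10.0.2.*) echo "/rack2" ;;
    *)                      echo "/default-rack" ;;
  esac
}
# HDFS passes one or more host names or IPs as arguments:
for host in "$@"; do rack_of "$host"; done
# Deploy: save the script on the NameNode host, mark it executable, set
# net.topology.script.file.name to its path in core-site.xml, restart HDFS,
# then verify with:  hdfs dfsadmin -printTopology
```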
- Change the configuration of a service using Ambari : As you have more
than 15 services in the cluster, you need to be able to do basic
configuration for those services.
- Configure the Capacity Scheduler : This is one of the schedulers for
scheduling submitted jobs in the cluster. As an admin, you should be able
to allocate proper resources to each cluster user, so proper configuration
is expected.
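As a sketch of what such a configuration looks like, the snippet below splits cluster capacity 70/30 between two queues; the queue names and percentages are illustrative assumptions. These properties belong in capacity-scheduler.xml, normally edited via Ambari under YARN configs.

```shell
# Sketch: two-queue Capacity Scheduler config (names/percentages assumed).
cat > capacity-scheduler-snippet.xml <<'EOF'
<property>
  <name>yarn.scheduler.capacity.root.queues</name>
  <value>default,analytics</value>
</property>
<property>
  <name>yarn.scheduler.capacity.root.default.capacity</name>
  <value>70</value>
</property>
<property>
  <name>yarn.scheduler.capacity.root.analytics.capacity</name>
  <value>30</value>
</property>
EOF
# Apply without restarting the ResourceManager:  yarn rmadmin -refreshQueues
```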
- Create a home directory for a user and configure permissions : Create a
directory for each Unix user in HDFS, and give the respective permissions,
such as who can delete and update data in this HDFS directory.
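This task can be sketched with the commands below, run on a cluster node; the user name "alice" and the chosen mode are illustrative assumptions.

```shell
# Sketch: HDFS home directory for user "alice" (name/mode are assumptions).
# Run as the hdfs superuser on a cluster node:
sudo -u hdfs hdfs dfs -mkdir -p /user/alice
sudo -u hdfs hdfs dfs -chown alice:alice /user/alice
sudo -u hdfs hdfs dfs -chmod 750 /user/alice   # owner full, group read/list, others none
sudo -u hdfs hdfs dfs -ls /user                # verify owner and mode
```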
- Configure the include and exclude DataNode files : Maintain the files
referenced by dfs.hosts (DataNodes allowed to connect) and
dfs.hosts.exclude (DataNodes to be decommissioned), and refresh the
NameNode so the changes take effect.
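A sketch of decommissioning through the exclude file follows; the node name is an assumption, and the local file stands in for whatever path dfs.hosts.exclude points at in hdfs-site.xml.

```shell
# Sketch: decommission node4 via the exclude file (node name assumed).
echo 'node4.example.com' > dfs.exclude   # stand-in for the real exclude file
# On the cluster, after editing the real exclude file:
#   hdfs dfsadmin -refreshNodes   # NameNode re-reads include/exclude files
#   hdfs dfsadmin -report         # node4 shows Decommissioning, then Decommissioned
```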
- Troubleshooting : If there are any issues in the cluster, you should be
able to find and fix them. More detail is given in the practice questions.
- Restart an HDP service : Once you have added services to the cluster and
changed their configuration, you need to restart them.
- View an application’s log file : For each service, you should be able to
find the log path and check whether there is any issue.
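As a starting point for finding logs, the locations below are the usual HDP defaults and may differ on your cluster; the application id is illustrative.

```shell
# Sketch: common HDP log locations (paths are the usual defaults):
#   /var/log/hadoop/hdfs/           NameNode / DataNode logs
#   /var/log/hadoop-yarn/yarn/      ResourceManager / NodeManager logs
#   /var/log/ambari-server/         Ambari Server log
tail -n 100 /var/log/ambari-server/ambari-server.log
# Aggregated logs for a finished YARN application (id is illustrative):
yarn logs -applicationId application_1500000000000_0001
```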
- Configure and manage alerts : You should be able to create new alert
configurations.
- Troubleshoot a failed job : If a user submitted a job and it failed, you
should be able to find out why the job failed.
High Availability
- Configure NameNode HA : Configure NameNode High Availability
- Configure ResourceManager HA : Configure ResourceManager High Availability
- Copy data between two clusters
using distcp : Copy data from one cluster to another
cluster.
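The inter-cluster copy above can be sketched as below; the NameNode host names are illustrative assumptions. DistCp runs as a MapReduce job, so submit it from a node with access to both clusters.

```shell
# Sketch: copy /data from cluster A to cluster B (NameNode hosts assumed).
hadoop distcp hdfs://nn-a.example.com:8020/data hdfs://nn-b.example.com:8020/data
# Useful flags: -update (copy only missing/changed files),
#               -p (preserve permissions/ownership).
```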
- Create a snapshot of an HDFS directory : Take a snapshot of a given HDFS
directory
- Recover a snapshot : Recreate data from a snapshot.
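Both snapshot tasks can be sketched with the commands below; the directory path, snapshot name, and file name are illustrative assumptions.

```shell
# Sketch: snapshot and restore for /data (path/names are assumptions).
hdfs dfsadmin -allowSnapshot /data      # enable snapshots (hdfs superuser)
hdfs dfs -createSnapshot /data snap1    # snapshot appears at /data/.snapshot/snap1
# Recover: copy a lost file back out of the read-only snapshot:
hdfs dfs -cp /data/.snapshot/snap1/part-00000 /data/part-00000
```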
- Configure HiveServer2 HA : Configure HiveServer2 High Availability
Security
- Install and configure Knox : Install the Knox Gateway and configure it
for authentication to services
- Install and configure Ranger : Install Ranger; you should be able to
create a policy and check the audit configuration.
- Configure HDFS ACLs : On HDFS, you should be able to control access to
each user directory.
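An ACL sketch is below; the user name and directory are illustrative assumptions, and ACLs require dfs.namenode.acls.enabled=true in hdfs-site.xml.

```shell
# Sketch: grant user "bob" read/browse access to /data/sales (names assumed).
hdfs dfs -setfacl -m user:bob:r-x /data/sales
hdfs dfs -getfacl /data/sales               # verify the new ACL entry
hdfs dfs -setfacl -x user:bob /data/sales   # remove the entry again
```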
* Please read the FAQ section carefully.
_______________________________________________________________________________________________________
Click to view what learners say about us : Testimonials
We have training subscribers from TCS, IBM, Infosys, Accenture, Apple,
Hewitt, Oracle, NetApp, Capgemini etc.
Books on Spark (PDF) to read : Machine Learning with Spark; Fast Data
Processing with Spark (Second Edition); Mastering Apache Spark; Learning
Hadoop 2; Learning Real-time Processing with Spark Streaming; Apache Spark
in Action; Apache Spark CookBook; Learning Spark; Advanced Analytics with
Spark.