HDPCA Practice Exam Questions and AWS Instance Setup Details

by admin

Before taking the HDPCA exam, you can get a feel for it by working through the HDPCA practice exam on the AWS cloud. The practice exam is very similar to the actual exam and gives you 6 tasks to perform on the instance. The recommended AWS instance type is m3.2xlarge, which has 30 GB of memory and 8 vCPUs; you can opt for spot instances if you are on a budget.

Environment Details

The setup on the HDPCA practice exam AWS machine is as follows:

  • A five-node HDP cluster named horton is installed with various HDP components.
  • You are currently logged in to an Ubuntu instance as a user named horton. As the horton user, you can SSH (passwordless) onto any of the nodes in the cluster as the root user. The root password is hadoop.
  • The five CentOS servers in the environment are named namenode, resourcemanager, hiveserver, node1 and node2.
  • node1 can be seen by the other nodes in the cluster, but it is not actually a part of the cluster. The node2 instance is a part of the cluster, and it only has the Metrics Monitor component installed.
  • Ambari is available at http://namenode:8080. The username and password for Ambari are both admin.
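
A quick way to confirm the environment is wired up as described is to query the Ambari REST API with the credentials above. Below is a minimal sketch in Python, assuming the requests library is available on the Ubuntu instance; it simply lists the clusters managed by the Ambari server and should print horton.

import requests

AMBARI = "http://namenode:8080/api/v1"
AUTH = ("admin", "admin")                 # Ambari login from the details above
HEADERS = {"X-Requested-By": "ambari"}    # header Ambari requires on write calls

# List the clusters managed by this Ambari server; expect to see "horton"
resp = requests.get(f"{AMBARI}/clusters", auth=AUTH, headers=HEADERS)
resp.raise_for_status()
for item in resp.json()["items"]:
    print(item["Clusters"]["cluster_name"])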

TASK 01: Start Services

Start all of the installed services on the cluster if they are not currently running.
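
In the practice environment this is normally done from the Ambari web UI (Actions > Start All on the dashboard). The same result can also be requested through the Ambari REST API; the following is a minimal sketch, not the prescribed exam method, reusing the cluster name horton and the admin credentials from the environment details.

import requests

AMBARI = "http://namenode:8080/api/v1"
AUTH = ("admin", "admin")
HEADERS = {"X-Requested-By": "ambari"}

# Ask Ambari to bring every service in the horton cluster to the STARTED state;
# services that are already running are left alone.
payload = {
    "RequestInfo": {"context": "Start All Services"},
    "Body": {"ServiceInfo": {"state": "STARTED"}},
}
resp = requests.put(f"{AMBARI}/clusters/horton/services",
                    auth=AUTH, headers=HEADERS, json=payload)
print(resp.status_code)   # 202 means Ambari accepted the request and queued it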

TASK 02: Commission a New Node

Add node1 to the horton cluster, given the following details:

  • The required SSH private key is located in the file /home/horton/Desktop/id_rsa
  • Install the DataNode, NodeManager, and Client services on node1
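
This task is meant to be done with Ambari's Add Host wizard (Hosts > Actions > Add New Hosts), pasting the contents of /home/horton/Desktop/id_rsa as the root SSH private key and selecting the DataNode, NodeManager and Client components for node1. For reference only, a rough sketch of an equivalent REST flow is shown below; it assumes the Ambari agent on node1 has already registered with the server, and it uses HDFS_CLIENT and YARN_CLIENT as stand-ins for the full set of clients the wizard would install.

import requests

AMBARI = "http://namenode:8080/api/v1"
AUTH, HEADERS = ("admin", "admin"), {"X-Requested-By": "ambari"}
CLUSTER = f"{AMBARI}/clusters/horton"
COMPONENTS = ("DATANODE", "NODEMANAGER", "HDFS_CLIENT", "YARN_CLIENT")

# 1. Attach the registered host to the cluster
requests.post(f"{CLUSTER}/hosts/node1", auth=AUTH, headers=HEADERS)

# 2. Create the requested components on node1
for comp in COMPONENTS:
    requests.post(f"{CLUSTER}/hosts/node1/host_components/{comp}",
                  auth=AUTH, headers=HEADERS)

# 3. Install everything, then start the slave daemons
#    (client components only ever reach the INSTALLED state)
for comp in COMPONENTS:
    requests.put(f"{CLUSTER}/hosts/node1/host_components/{comp}",
                 auth=AUTH, headers=HEADERS,
                 json={"HostRoles": {"state": "INSTALLED"}})
for comp in ("DATANODE", "NODEMANAGER"):
    requests.put(f"{CLUSTER}/hosts/node1/host_components/{comp}",
                 auth=AUTH, headers=HEADERS,
                 json={"HostRoles": {"state": "STARTED"}})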

TASK 03: Install Services

Add Storm and Kafka to the cluster. Install all of the services on the hiveserver node, and install the Supervisor process on node2.
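
The usual route is Ambari's Add Service wizard, placing all of the Storm and Kafka components on hiveserver and an extra Supervisor on node2. As an illustration only, the first steps of an equivalent REST flow might look like the sketch below; the wizard also generates the service configurations and performs the install and start, which are omitted here, and only a subset of the standard Ambari component names (NIMBUS, SUPERVISOR, KAFKA_BROKER) is shown.

import requests

AMBARI = "http://namenode:8080/api/v1"
AUTH, HEADERS = ("admin", "admin"), {"X-Requested-By": "ambari"}
CLUSTER = f"{AMBARI}/clusters/horton"

# 1. Register the new services with the cluster
for service in ("STORM", "KAFKA"):
    requests.post(f"{CLUSTER}/services/{service}", auth=AUTH, headers=HEADERS)

# 2. Declare the components each service provides (subset shown)
for service, comp in (("STORM", "NIMBUS"), ("STORM", "SUPERVISOR"),
                      ("KAFKA", "KAFKA_BROKER")):
    requests.post(f"{CLUSTER}/services/{service}/components/{comp}",
                  auth=AUTH, headers=HEADERS)

# 3. Map components to hosts: everything on hiveserver, plus a Supervisor on node2
for host, comp in (("hiveserver", "NIMBUS"), ("hiveserver", "SUPERVISOR"),
                   ("hiveserver", "KAFKA_BROKER"), ("node2", "SUPERVISOR")):
    requests.post(f"{CLUSTER}/hosts/{host}/host_components/{comp}",
                  auth=AUTH, headers=HEADERS)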

TASK 04: Rack Awareness

Configure rack awareness, with namenode and hiveserver on rack 01 and the other nodes on rack 02.
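
Rack assignments can be made per host from the Ambari Hosts page, or by writing each host's rack_info property through the REST API. A sketch of the API route follows, assuming /rack01 and /rack02 as the rack labels; restart HDFS afterwards so the NameNode picks up the new topology.

import requests

AMBARI = "http://namenode:8080/api/v1"
AUTH, HEADERS = ("admin", "admin"), {"X-Requested-By": "ambari"}

# Rack layout required by the task
racks = {
    "namenode": "/rack01",
    "hiveserver": "/rack01",
    "resourcemanager": "/rack02",
    "node1": "/rack02",
    "node2": "/rack02",
}

# Write the rack_info property of each host in the horton cluster
for host, rack in racks.items():
    resp = requests.put(f"{AMBARI}/clusters/horton/hosts/{host}",
                        auth=AUTH, headers=HEADERS,
                        json={"Hosts": {"rack_info": rack}})
    print(host, rack, resp.status_code)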

TASK 05: ResourceManager High Availability

Set up the ResourceManager to be highly available by configuring node2 with an additional ResourceManager.
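
ResourceManager HA is enabled through Ambari (Services > YARN > Service Actions > Enable ResourceManager HA), choosing node2 as the host for the additional ResourceManager. A small sketch to verify afterwards that two ResourceManagers are registered:

import requests

AMBARI = "http://namenode:8080/api/v1"
AUTH, HEADERS = ("admin", "admin"), {"X-Requested-By": "ambari"}

# List the hosts that carry a RESOURCEMANAGER component; after enabling HA
# there should be two entries (resourcemanager and node2).
resp = requests.get(
    f"{AMBARI}/clusters/horton/services/YARN/components/RESOURCEMANAGER",
    auth=AUTH, headers=HEADERS,
    params={"fields": "host_components/HostRoles/host_name"},
)
resp.raise_for_status()
for hc in resp.json()["host_components"]:
    print(hc["HostRoles"]["host_name"])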

TASK 06: Decommission a Node

Decommission node1 as both a DataNode and a NodeManager.
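
Decommissioning is normally driven from the Ambari Hosts page for node1 (decommission the DataNode and the NodeManager, then stop the components). The same action can be posted to the Ambari API; a hedged sketch of the DataNode half is below, and the NodeManager side mirrors it with service YARN, component RESOURCEMANAGER and slave_type NODEMANAGER.

import requests

AMBARI = "http://namenode:8080/api/v1"
AUTH, HEADERS = ("admin", "admin"), {"X-Requested-By": "ambari"}

# Ask the NameNode to decommission the DataNode running on node1
payload = {
    "RequestInfo": {
        "context": "Decommission DataNode on node1",
        "command": "DECOMMISSION",
        "parameters": {"slave_type": "DATANODE", "excluded_hosts": "node1"},
    },
    "Requests/resource_filters": [
        {"service_name": "HDFS", "component_name": "NAMENODE"}
    ],
}
resp = requests.post(f"{AMBARI}/clusters/horton/requests",
                     auth=AUTH, headers=HEADERS, json=payload)
print(resp.status_code)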

Preparing for the HDPCA (HDP Certified Administrator) Exam
