
The Geek Diary


HDPCA Practice Exam Questions and AWS Instance Setup Details

by admin

Before taking the HDPCA exam, you can get a feel for it by using the HDPCA practice exam on the AWS cloud. The practice exam closely resembles the actual exam and includes 6 tasks you can work through on the practice machine. The recommended AWS instance type is m3.2xlarge, which has 30 GB of memory and 8 vCPUs. If you are on a budget, you can opt for spot instances.
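As a rough illustration of launching the practice instance as a spot instance, the AWS CLI can be used along the following lines. Note that the AMI ID and key pair name below are placeholders, not values from this article; substitute your own.

```shell
# Placeholder values -- substitute your own AMI ID, key pair, and region.
INSTANCE_TYPE="m3.2xlarge"   # 8 vCPUs, 30 GB RAM, as recommended above
AMI_ID="ami-xxxxxxxx"        # placeholder: the practice exam AMI you are using
KEY_NAME="my-keypair"        # placeholder: your EC2 key pair

# Only attempt the launch if the AWS CLI is actually installed.
if command -v aws >/dev/null 2>&1; then
  aws ec2 run-instances \
    --image-id "$AMI_ID" \
    --instance-type "$INSTANCE_TYPE" \
    --key-name "$KEY_NAME" \
    --instance-market-options 'MarketType=spot'
fi
```

Spot instances can be reclaimed by AWS at short notice, which is acceptable for a throwaway practice environment but not for anything you want to keep.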

Environment Details

The details of the setup in the HDPCA practice exam AWS machine are as follows:

  • A five-node HDP cluster named horton is installed with various HDP components.
  • You are currently logged in to an Ubuntu instance as a user named horton. As the horton user, you can SSH (passwordless) into any node in the cluster as the root user. The root password is hadoop.
  • The five nodes in the cluster are CentOS servers named namenode, resourcemanager, hiveserver, node1 and node2.
  • node1 is visible to the other nodes in the cluster but is not actually part of the cluster. The node2 instance is part of the cluster, but it only has the Metrics Monitor component installed.
  • Ambari is available at http://namenode:8080. The username and password for Ambari are both admin.

TASK 01: Start Services

Start all of the installed services on the cluster if they are not currently running.
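In the exam you would normally use the Start All action in the Ambari web UI, but the same result can be achieved through Ambari's REST API. A sketch, assuming the cluster name horton and the admin/admin credentials from the environment details:

```shell
# Cluster endpoint from the environment details above.
CLUSTER_API="http://namenode:8080/api/v1/clusters/horton"

# Request body asking Ambari to move every service to the STARTED state.
START_BODY='{"RequestInfo":{"context":"Start All Services"},"Body":{"ServiceInfo":{"state":"STARTED"}}}'

# Guarded so it only runs inside the practice exam environment.
if getent hosts namenode >/dev/null 2>&1; then
  curl -u admin:admin -H 'X-Requested-By: ambari' \
       -X PUT -d "$START_BODY" "$CLUSTER_API/services"
fi
```

Ambari returns a request ID for the start operation; you can watch its progress in the UI or poll the requests endpoint until it completes.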

TASK 02: Commission a New Node

Add node1 to the horton cluster, given the following details:

  • The required SSH private key is located in the file id_rsa found in the /home/horton/Desktop folder
  • Install the DataNode, NodeManager, and Client services on node1
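Before pasting the key into Ambari's Add Host wizard, it is worth confirming that the key actually grants root access to node1. A quick check along these lines, using the key path given in the task:

```shell
# Private key location from the task description above.
KEY=/home/horton/Desktop/id_rsa

# Guarded so it only runs where the key actually exists.
if [ -f "$KEY" ]; then
  chmod 600 "$KEY"                   # SSH refuses keys with loose permissions
  ssh -i "$KEY" root@node1 hostname  # should print node1's hostname
fi
```

If the SSH check succeeds, the same key contents can be supplied to the Add Host wizard (Hosts > Actions > Add New Hosts in Ambari), selecting the DataNode, NodeManager, and Client components for node1.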

TASK 03: Install Services

Add Storm and Kafka to the cluster. Install all of the services on the hiveserver node, and install the Supervisor process on node2.
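Adding Storm and Kafka is done through Ambari's Add Service wizard (Services > Actions > Add Service), where you assign the masters to hiveserver and place an additional Supervisor on node2. Once the wizard finishes, the installed services can be confirmed through the REST API, as this sketch shows:

```shell
# Services endpoint for the horton cluster.
SERVICES_API="http://namenode:8080/api/v1/clusters/horton/services"

# Guarded so it only runs inside the practice exam environment.
if getent hosts namenode >/dev/null 2>&1; then
  # STORM and KAFKA should appear in this list after the wizard completes.
  curl -s -u admin:admin "$SERVICES_API"
fi
```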

TASK 04: Rack Awareness

Configure rack awareness, with namenode and hiveserver on rack 01 and the other nodes on rack 02.
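Rack assignments can be set per host in the Ambari UI, or in bulk through the hosts endpoint of the REST API. A sketch (the rack paths /rack01 and /rack02 are one reasonable naming choice, and the rack 02 host list assumes node1 was added to the cluster in Task 02):

```shell
# Hosts endpoint for the horton cluster.
HOSTS_API="http://namenode:8080/api/v1/clusters/horton/hosts"

# set_rack RACK_PATH COMMA_SEPARATED_HOSTS
set_rack() {
  curl -u admin:admin -H 'X-Requested-By: ambari' -X PUT "$HOSTS_API" \
    -d "{\"RequestInfo\":{\"context\":\"Set rack\",\"query\":\"Hosts/host_name.in($2)\"},\"Body\":{\"Hosts\":{\"rack_info\":\"$1\"}}}"
}

# Guarded so it only runs inside the practice exam environment.
if getent hosts namenode >/dev/null 2>&1; then
  set_rack "/rack01" "namenode,hiveserver"
  set_rack "/rack02" "resourcemanager,node1,node2"
fi
```

After restarting the affected HDFS services, `hdfs dfsadmin -printTopology` should show the DataNodes grouped under the new racks.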

TASK 05: ResourceManager High Availability

Set up the ResourceManager to be highly available by configuring an additional ResourceManager on node2.
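The HA setup itself is driven by Ambari's Enable ResourceManager HA wizard (Services > YARN > Actions). Once it completes, one ResourceManager should be active and the other standby, which can be checked from any cluster node. In this sketch, rm1 and rm2 are the ResourceManager IDs Ambari typically assigns when HA is enabled:

```shell
# IDs assumed below; confirm them in yarn-site.xml (yarn.resourcemanager.ha.rm-ids).
RM_IDS="rm1 rm2"

# Guarded so it only runs on a node with the Hadoop CLI installed.
if command -v hdfs >/dev/null 2>&1; then
  for rm in $RM_IDS; do
    # One should report "active", the other "standby".
    yarn rmadmin -getServiceState "$rm"
  done
fi
```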

TASK 06: Decommission a Node

Decommission node1 as both a DataNode and a NodeManager.
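Decommissioning is done from node1's host page in Ambari (the Decommission action next to the DataNode and NodeManager components). The result can then be verified from the command line, roughly as follows:

```shell
# Host being decommissioned in this task.
TARGET="node1"

# Guarded so it only runs on a node with the Hadoop CLI installed.
if command -v hdfs >/dev/null 2>&1; then
  # node1 should appear as Decommissioned in the DataNode report ...
  hdfs dfsadmin -report | grep -B1 -A5 "$TARGET"

  # ... and as DECOMMISSIONED in the NodeManager list.
  yarn node -list -all
fi
```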

Preparing for the HDPCA (HDP Certified Administrator) Exam

Filed Under: Hadoop, HDPCA, Hortonworks HDP
