Preparing Each Node

This section defines the minimum requirements for each node in your cluster.

Every node contributes to the cluster, so each node must be able to run HPE Ezmeral Data Fabric and Hadoop software. Nodes must meet minimum requirements for operating system, memory and disk resources, and installed software, such as Java. Including unsuitable nodes in a cluster is a major source of installation difficulty.

Table 1. Node Requirements

Component   Requirements
CPU         64-bit x86
CPU Cores   Minimum of 16 per CPU (see also Cluster Hardware)
OS          RHEL, Oracle Linux, Rocky Linux, SLES, or Ubuntu
Memory      32 GB minimum for production nodes
Disk        Raw, unformatted drives with no partitions
DNS         Unique hostname; must be resolvable by and able to reach all other nodes
Users       Common users across all nodes; passwordless SSH (optional)
Java        Must run Java 11 or 17 (see the Java Support Matrix)
Other       NTP, Syslog, PAM
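Several of the requirements in Table 1 can be checked from a shell before installation. The following is a minimal sketch, not an official validation tool: it checks CPU architecture, core count, memory, and the Java runtime on a Linux node. The script name and output wording are illustrative only; the thresholds come from Table 1.

```shell
#!/bin/sh
# Sketch: pre-install sanity checks against the minimums in Table 1.

# CPU must be 64-bit x86.
arch=$(uname -m)
[ "$arch" = "x86_64" ] && echo "CPU: 64-bit x86 OK" || echo "CPU: $arch unsupported"

# Minimum of 16 cores.
cores=$(nproc)
[ "$cores" -ge 16 ] && echo "Cores: $cores OK" || echo "Cores: $cores (minimum is 16)"

# Minimum of 32 GB memory for production nodes.
mem_kb=$(awk '/MemTotal/ {print $2}' /proc/meminfo)
mem_gb=$((mem_kb / 1024 / 1024))
[ "$mem_gb" -ge 32 ] && echo "Memory: ${mem_gb} GB OK" || echo "Memory: ${mem_gb} GB (minimum is 32 GB)"

# Java 11 or 17 must be available (see the Java Support Matrix).
if command -v java >/dev/null 2>&1; then
  java -version 2>&1 | head -n 1
else
  echo "Java: not installed"
fi
```

Run the script on each candidate node and resolve any reported shortfall before installing Data Fabric software.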
TIP For enhanced node performance and reliability, always set the MAPR_SUBNETS environment variable.
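As an illustration, MAPR_SUBNETS takes a comma-separated list of subnets in CIDR notation that Data Fabric traffic is restricted to. The subnet values and the override file path below are hypothetical examples, not values from this document; confirm the correct file for your release before editing it.

```shell
# Example only: restrict Data Fabric traffic to two hypothetical subnets.
# Typically placed in a node's environment override file
# (for example, /opt/mapr/conf/env_override.sh) or exported before
# starting services.
export MAPR_SUBNETS=10.10.0.0/24,10.10.1.0/24
```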

Use the subsequent sections as a checklist to make each candidate node suitable for its assigned roles. Install Data Fabric software on each node that you identify as meeting the minimum requirements.
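Table 1 lists passwordless SSH between the common users as optional but useful for cluster administration. A minimal sketch of setting it up from an administration node follows; the hostnames and the user name are placeholders for your own environment.

```shell
#!/bin/sh
# Sketch: enable passwordless SSH from this node to each cluster node.
# "node1 node2 node3" and the current user are placeholder values.

# Generate a key pair once, with no passphrase, if one does not exist.
[ -f "$HOME/.ssh/id_ed25519" ] || ssh-keygen -t ed25519 -N "" -f "$HOME/.ssh/id_ed25519" -q

# Copy the public key to each node (prompts for the password once per node).
for host in node1 node2 node3; do
  ssh-copy-id -i "$HOME/.ssh/id_ed25519.pub" "$host"
done
```

After this, the same user can reach every node without a password, which simplifies running the per-node checks across the cluster.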