Configuring Memory for Spark Applications
Describes how to set memory options for Spark applications.
You can configure the driver and executor memory options for Spark applications by using HPE Ezmeral Unified Analytics Software. See Creating Spark Applications.
You can also configure these options by manually setting the following properties in the Spark application YAML file. See Spark application YAML.
spark.driver.memory
: Amount of memory allocated for the driver.

spark.executor.memory
: Amount of memory allocated for each executor that runs tasks.
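For example, these properties can be set in the Spark application YAML file. The fragment below is a sketch only; it assumes the properties go under a `sparkConf` section, and the field names and memory values are illustrative, so check Spark application YAML for the exact schema used in your release:

```yaml
# Illustrative fragment of a Spark application YAML file.
# Field placement and values are assumptions; verify against your schema.
spec:
  sparkConf:
    "spark.driver.memory": "2g"      # memory for the driver
    "spark.executor.memory": "4g"    # memory for each executor
```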
Memory Overhead = max(0.1 * Driver or Executor Memory, 384 MB)
Total Driver or Executor Memory = Driver or Executor Memory + Memory Overhead
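The calculation above can be sketched as follows; this is a worked example of the formula, not an HPE or Spark API (the function name is hypothetical):

```python
# Sketch of the overhead calculation described above (illustrative only):
# overhead is 10% of the requested memory, with a 384 MB floor.
def total_memory_mb(memory_mb: float) -> float:
    overhead_mb = max(0.1 * memory_mb, 384)
    return memory_mb + overhead_mb

# A 2 GB (2048 MB) driver: 10% is 204.8 MB, below the 384 MB floor,
# so the floor applies and the total is 2048 + 384 = 2432 MB.
print(total_memory_mb(2048))  # 2432.0

# An 8 GB (8192 MB) executor: 10% is 819.2 MB, above the floor,
# so the total is roughly 9011 MB.
print(total_memory_mb(8192))
```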
Configuring Memory Overhead
You can configure the memory overhead for the driver and executors in HPE Ezmeral Unified Analytics Software by using the following properties:

spark.driver.memoryOverhead
: Amount of additional non-heap memory allocated for the driver.

spark.executor.memoryOverhead
: Amount of additional non-heap memory allocated for each executor.
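If the default overhead calculation does not fit your workload, these properties can be set explicitly in the application YAML. As above, this fragment is a sketch; the `sparkConf` placement and the values shown are assumptions to be checked against Spark application YAML:

```yaml
# Illustrative fragment; key placement and values are assumptions.
spec:
  sparkConf:
    "spark.driver.memoryOverhead": "512m"    # overrides the computed overhead
    "spark.executor.memoryOverhead": "1g"
```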
To learn more about driver and executor memory, memory overhead, and other properties, see Apache Spark 3.x.x application properties.