Configuring Memory for Spark Applications

Describes how to set memory options for Spark applications.

You can configure the driver and executor memory options for Spark applications by using HPE Ezmeral Unified Analytics Software. See Creating Spark Applications.

You can also configure the driver and executor memory options by manually setting the following properties in the Spark application YAML file. See Spark application YAML.

  • spark.driver.memory: Amount of memory allocated for the driver.
  • spark.executor.memory: Amount of memory allocated for each executor that runs tasks.
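For example, in a SparkApplication manifest managed by a Spark operator, these properties can be set under sparkConf. The fragment below is an illustrative sketch only; the application name and memory sizes are placeholders, not defaults from your environment:

```yaml
apiVersion: sparkoperator.k8s.io/v1beta2
kind: SparkApplication
metadata:
  name: example-app                  # placeholder application name
spec:
  sparkConf:
    # Heap memory for the driver JVM (placeholder value)
    "spark.driver.memory": "2g"
    # Heap memory for each executor JVM (placeholder value)
    "spark.executor.memory": "4g"
```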
In addition to the configured memory, Spark allocates a memory overhead equal to 10% of the configured driver or executor memory, with a minimum of 384 MB. The overhead applies to each driver and each executor. Thus, the total driver or executor memory is the sum of the configured memory and the overhead.

Memory Overhead = max(0.1 * Driver or Executor Memory, 384 MB)

Total Driver or Executor Memory = Driver or Executor Memory + Memory Overhead
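As a worked example (sizes chosen for illustration only), a 2 GB executor falls below the point where the 10% factor exceeds the 384 MB minimum, while an 8 GB executor does not:

```
spark.executor.memory = 2g (2048 MB)
Memory Overhead = max(0.1 * 2048, 384) = 384 MB
Total Executor Memory = 2048 + 384 = 2432 MB

spark.executor.memory = 8g (8192 MB)
Memory Overhead = max(0.1 * 8192, 384) ≈ 819 MB
Total Executor Memory ≈ 8192 + 819 = 9011 MB
```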

Configuring Memory Overhead

You can configure the memory overhead for driver and executor in HPE Ezmeral Unified Analytics Software.

Set the following configuration options in the Spark application YAML file by clicking Edit YAML in the Review step, or by clicking Edit YAML from the Actions menu on the Spark Applications screen. See Managing Spark Applications.
  • spark.driver.memoryOverhead: Amount of additional memory allocated for the driver.
  • spark.executor.memoryOverhead: Amount of additional memory allocated for each executor.
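A sketch of the corresponding sparkConf entries, assuming you want to set explicit overhead values instead of relying on the 10% default (the sizes shown are placeholders):

```yaml
spec:
  sparkConf:
    # Override the default 10% overhead for the driver (placeholder value)
    "spark.driver.memoryOverhead": "512m"
    # Override the default 10% overhead for each executor (placeholder value)
    "spark.executor.memoryOverhead": "1g"
```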

To learn more about driver or executor memory, memory overhead, and other properties, see Apache Spark 3.x.x application properties.