Run Spark from the Spark Shell

About this task

Complete the following steps to run Spark from the Spark shell in yarn-client mode:

Procedure

  1. Navigate to the Spark-on-YARN installation directory, replacing <version> in the path with your installed Spark version:
    cd /opt/mapr/spark/spark-<version>/
  2. Issue the following command to run Spark from the Spark shell:
    • On Spark 2.0.1 and later:
      ./bin/spark-shell --master yarn --deploy-mode client
    • On Spark 1.6.1:
      MASTER=yarn-client ./bin/spark-shell
    NOTE
    You must use yarn-client mode to run Spark from the Spark shell; yarn-cluster mode is not supported.
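Once the shell starts, you can run a quick check to confirm that Spark is executing on YARN. The following Scala commands are a minimal sketch; the sample dataset and expected result are illustrative only.
    scala> sc.master                             // reports the YARN master, for example "yarn" or "yarn-client"
    scala> val nums = sc.parallelize(1 to 100)   // distributes a small test dataset across the executors
    scala> nums.sum()                            // returns 5050.0 when the executors are running
To exit the shell, type :quit or press Ctrl+D.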