Spark configure.sh
Starting in the EEP 4.0 release, run configure.sh -R to complete your Spark configuration when manually installing Spark or upgrading to a new version. The command is:
/opt/mapr/server/configure.sh -R
For Spark Standalone and Spark on YARN, this is the last step in the configuration process.
The configure.sh script inserts security properties into the SPARK_HOME/conf/spark-defaults.conf file between the following comments:
#SECURITY BLOCK
...
#END OF THE SECURITY CONFIGURATION BLOCK
Do not remove these comments, or any other comments inserted into the block by configure.sh, from the file. The script uses these comments to locate security properties.
To set ports to special values, use the spark.driver.port and spark.blockManager.port properties.
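For example, a spark-defaults.conf fragment setting both properties; the port numbers shown are illustrative values, not defaults:

```properties
# Illustrative port assignments; pick ports that are open in your environment
spark.driver.port        40001
spark.blockManager.port  40002
```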
Starting in EEP 6.0.0, configure.sh restarts Spark services such as the History Server, Thrift Server, or Primary only when one of the following Spark configuration files has changed: spark-defaults.conf, spark-env.sh, hive-site.xml, or log4j.properties. If these files are unchanged, configure.sh does not restart any of the Spark services.
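The restart decision can be sketched as a checksum comparison over the tracked files: a restart is needed only when a stored checksum differs from the current one. This is a simplified illustration of the idea, not the actual configure.sh implementation; the directory layout and state files are assumptions.

```shell
#!/bin/sh
# Sketch: restart Spark services only if a tracked config file changed since
# the last run, detected by comparing stored md5 checksums.
# Illustrative only; not the real configure.sh internals.
CONF_DIR=$(mktemp -d)          # stands in for $SPARK_HOME/conf
STATE_DIR="$CONF_DIR/.state"   # hypothetical checksum cache
mkdir -p "$STATE_DIR"

check_restart() {
    needed=no
    for f in spark-defaults.conf spark-env.sh hive-site.xml log4j.properties; do
        [ -f "$CONF_DIR/$f" ] || continue
        new_sum=$(md5sum "$CONF_DIR/$f" | awk '{print $1}')
        old_sum=$(cat "$STATE_DIR/$f.md5" 2>/dev/null)
        if [ "$new_sum" != "$old_sum" ]; then
            needed=yes
            echo "$new_sum" > "$STATE_DIR/$f.md5"
        fi
    done
    echo "$needed"
}

echo "spark.driver.port 40001" > "$CONF_DIR/spark-defaults.conf"
first=$(check_restart)    # file is new, so a restart is needed
second=$(check_restart)   # nothing changed since the checksum was stored
echo "first=$first second=$second"
```

Running the check twice without touching the files reports that no restart is needed the second time, mirroring the documented behavior.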
An update to Spark causes the conf directory from the previous Spark version to be saved to the spark-<old-version>.<old-timestamp> directory. If your Spark version did not change during the update, configurations from the spark-<old-version>.<old-timestamp> directory are automatically copied to the spark-<version> directory by the configure.sh script.
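The copy-forward step can be sketched as follows. The directory names and version strings below are placeholders for illustration; this is not the configure.sh implementation:

```shell
#!/bin/sh
# Sketch: when the Spark version is unchanged across an update, copy the
# saved configuration into the current conf directory.
# Directory names and versions are illustrative placeholders.
SPARK_BASE=$(mktemp -d)                                  # stands in for /opt/mapr/spark
OLD_CONF="$SPARK_BASE/spark-3.1.2.202201010000/conf"     # spark-<old-version>.<old-timestamp>
NEW_CONF="$SPARK_BASE/spark-3.1.2/conf"                  # spark-<version>
mkdir -p "$OLD_CONF" "$NEW_CONF"

echo "spark.driver.port 40001" > "$OLD_CONF/spark-defaults.conf"

OLD_VER=3.1.2   # version parsed from the saved directory name
NEW_VER=3.1.2   # version of the current install
if [ "$OLD_VER" = "$NEW_VER" ]; then
    cp "$OLD_CONF"/* "$NEW_CONF"/
fi
restored=$(cat "$NEW_CONF/spark-defaults.conf")
echo "$restored"
```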
If you use .customSecure, the configure.sh script copies the hive-site.xml file from Hive on its first run. On subsequent runs, the hive-site.xml file is not copied from Hive, and you must manually modify the $SPARK_HOME/conf/hive-site.xml file.
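A manual change means editing $SPARK_HOME/conf/hive-site.xml directly. For example, a property element in that file looks like the following; the hostname is an illustrative value, not a default:

```xml
<!-- Fragment of $SPARK_HOME/conf/hive-site.xml; hostname is illustrative -->
<property>
  <name>hive.metastore.uris</name>
  <value>thrift://metastore-host:9083</value>
</property>
```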