Spark Submit — spark-submit shell script

spark-submit shell script allows you to manage your Spark applications.

You can submit your Spark application to a Spark deployment environment for execution, kill a running application, or request the status of Spark applications.

You can find the spark-submit script in the bin directory of the Spark distribution.

When executed, the spark-submit script first checks whether the SPARK_HOME environment variable is set and, if not, sets it to the directory that contains the bin/spark-submit shell script. It then executes the spark-class shell script to run the SparkSubmit standalone application.
Caution: FIXME Add Cluster Manager and Deploy Mode to the table below (see options value)
Command-Line Option | Spark Property | Environment Variable | Description | Internal Property |
---|---|---|---|---|
--deploy-mode | spark.submit.deployMode | DEPLOY_MODE | Deploy mode | deployMode |
--driver-class-path | spark.driver.extraClassPath | | The driver’s class path | driverExtraClassPath |
--driver-java-options | spark.driver.extraJavaOptions | | The driver’s JVM options | driverExtraJavaOptions |
--driver-library-path | spark.driver.extraLibraryPath | | The driver’s native library path | driverExtraLibraryPath |
--driver-memory | spark.driver.memory | SPARK_DRIVER_MEMORY | The driver’s memory | driverMemory |
--executor-cores | spark.executor.cores | SPARK_EXECUTOR_CORES | The number of executor CPU cores | executorCores |
--executor-memory | spark.executor.memory | SPARK_EXECUTOR_MEMORY | An executor’s memory | executorMemory |
--master | spark.master | MASTER | Master URL. Defaults to local[*] | master |
Tip: Set the SPARK_PRINT_LAUNCH_COMMAND environment variable to have the complete Spark command printed out to the console. Refer to Print Launch Command of Spark Scripts (or org.apache.spark.launcher.Main Standalone Application where this environment variable is used).
Tip: Avoid using the scala.App trait for a Spark application’s main class. Refer to Executing Main — runMain internal method in this document.
Preparing Submit Environment — prepareSubmitEnvironment Internal Method

prepareSubmitEnvironment(args: SparkSubmitArguments): (Seq[String], Seq[String], Map[String, String], String)

prepareSubmitEnvironment creates a 4-element tuple, i.e. (childArgs, childClasspath, sysProps, childMainClass).

Element | Description |
---|---|
childArgs | Arguments |
childClasspath | Classpath elements |
sysProps | System properties |
childMainClass | Main class |

prepareSubmitEnvironment uses options to…
Caution: FIXME

Note: prepareSubmitEnvironment is used in the SparkSubmit object.

Tip: See the elements of the return tuple using --verbose command-line option.
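For instance, a minimal sketch of inspecting what spark-submit computed for your application (the application jar and main class are hypothetical):

./bin/spark-submit \
  --verbose \
  --master local[*] \
  --class com.example.MyApp \
  target/my-app.jar

In verbose mode, spark-submit prints the main class, arguments, system properties, and classpath elements to the standard error output (see Executing Main — runMain internal method).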
Custom Spark Properties File — --properties-file command-line option

--properties-file [FILE]

--properties-file command-line option sets the path to a file FILE from which Spark loads extra Spark properties.
Tip: Spark uses conf/spark-defaults.conf by default.
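As an illustration, a sketch of a custom properties file and how to pass it to spark-submit (the file name, property values, and application details are hypothetical):

# Create a custom Spark properties file
cat > my-spark.conf <<'EOF'
spark.master          spark://localhost:7077
spark.executor.memory 2g
EOF

# Load it instead of conf/spark-defaults.conf
./bin/spark-submit \
  --properties-file my-spark.conf \
  --class com.example.MyApp \
  target/my-app.jar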
Driver Cores in Cluster Deploy Mode — --driver-cores command-line option

--driver-cores NUM

--driver-cores command-line option sets the number of cores to NUM for the driver in the cluster deploy mode.
Note: --driver-cores switch is only available for cluster deploy mode (for Standalone, Mesos, and YARN).

Note: It corresponds to spark.driver.cores setting.

Note: It is printed out to the standard error output in verbose mode.
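A sketch of requesting driver cores in cluster deploy mode (the master URL and application details are hypothetical):

# Request 2 CPU cores for the driver (cluster deploy mode only)
./bin/spark-submit \
  --master spark://localhost:7077 \
  --deploy-mode cluster \
  --driver-cores 2 \
  --class com.example.MyApp \
  target/my-app.jar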
Additional JAR Files to Distribute — --jars command-line option

--jars JARS

--jars is a comma-separated list of local jars to include on the driver’s and executors' classpaths.
Caution: FIXME
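For example, a hypothetical invocation that adds two local jars to the driver’s and executors' classpaths (all paths are examples only):

./bin/spark-submit \
  --jars lib/extra-one.jar,lib/extra-two.jar \
  --class com.example.MyApp \
  target/my-app.jar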
Additional Archives to Distribute — --archives command-line option

--archives ARCHIVES

Caution: FIXME
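A sketch of distributing archives on YARN, where --archives is supported (the file names are hypothetical); each archive is extracted into the working directory of each executor:

./bin/spark-submit \
  --master yarn \
  --archives data.zip,models.tar.gz \
  --class com.example.MyApp \
  target/my-app.jar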
Specifying YARN Resource Queue — --queue command-line option

--queue QUEUE_NAME

With --queue you can choose the YARN resource queue to submit a Spark application to. The default queue name is default.
Caution: FIXME What is a queue?

Note: It corresponds to spark.yarn.queue Spark setting.

Tip: It is printed out to the standard error output in verbose mode.
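A sketch of submitting to a specific YARN resource queue (the queue name is an example):

./bin/spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --queue analytics \
  --class com.example.MyApp \
  target/my-app.jar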
Actions

Submitting Applications for Execution — submit method

The default action of spark-submit script is to submit a Spark application to a deployment environment for execution.

Tip: Use --verbose command-line switch to know the main class to be executed, arguments, system properties, and classpath (to ensure that the command-line arguments and switches were processed properly).
When executed, spark-submit executes the submit method.

submit(args: SparkSubmitArguments): Unit

If proxyUser is set it will…FIXME

Caution: FIXME Review why and when to use proxyUser.
It passes the execution on to runMain.
Executing Main — runMain internal method

runMain(
  childArgs: Seq[String],
  childClasspath: Seq[String],
  sysProps: Map[String, String],
  childMainClass: String,
  verbose: Boolean): Unit

runMain is an internal method that builds the execution environment and invokes the main method of the Spark application that has been submitted for execution.
Note: It is used exclusively when submitting applications for execution.
When the verbose input flag is enabled (i.e. true), runMain prints out all the input parameters, i.e. childMainClass, childArgs, sysProps, and childClasspath (in that order).
Main class:
[childMainClass]
Arguments:
[childArgs one per line]
System properties:
[sysProps one per line]
Classpath elements:
[childClasspath one per line]
Note: Use spark-submit's --verbose command-line option to enable the verbose flag.
runMain builds the context classloader (as loader) depending on the spark.driver.userClassPathFirst flag.
Caution: FIXME Describe spark.driver.userClassPathFirst
It adds the jars specified in the childClasspath input parameter to the context classloader (that is later responsible for loading the childMainClass main class).
Note: childClasspath input parameter corresponds to --jars command-line option with the primary resource if specified in client deploy mode.
It sets all the system properties specified in the sysProps input parameter (using Java’s System.setProperty method).

It creates an instance of the childMainClass main class (as mainClass).
Note: childMainClass is the main class spark-submit has been invoked with.

Tip: Avoid using the scala.App trait for a Spark application’s main class in Scala, as reported in SPARK-4170 Closure problems when running Scala app that "extends App".
If you use scala.App for the main class, you should see the following warning message in the logs:
Warning: Subclasses of scala.App may not work correctly. Use a main() method instead.
Finally, runMain executes the main method of the Spark application passing in the childArgs arguments.
Any SparkUserAppException exceptions lead to System.exit while the others are simply re-thrown.
Adding Local Jars to ClassLoader — addJarToClasspath internal method

addJarToClasspath(localJar: String, loader: MutableURLClassLoader)

addJarToClasspath is an internal method to add file or local jars (as localJar) to the loader classloader.
Internally, addJarToClasspath resolves the URI of localJar. If the URI is file or local and the file denoted by localJar exists, localJar is added to loader. Otherwise, the following warning is printed out to the logs:
Warning: Local jar /path/to/fake.jar does not exist, skipping.
For all other URIs, the following warning is printed out to the logs:
Warning: Skip remote jar hdfs://fake.jar.
Note: addJarToClasspath assumes file URI when localJar has no URI specified, e.g. /path/to/local.jar.

Caution: FIXME What is a URI fragment? How does this change re YARN distributed cache? See Utils#resolveURI.
Command-line Options

Execute spark-submit --help to know about the command-line options supported.
➜ spark git:(master) ✗ ./bin/spark-submit --help
Usage: spark-submit [options] <app jar | python file> [app arguments]
Usage: spark-submit --kill [submission ID] --master [spark://...]
Usage: spark-submit --status [submission ID] --master [spark://...]
Usage: spark-submit run-example [options] example-class [example args]
Options:
--master MASTER_URL spark://host:port, mesos://host:port, yarn, or local.
--deploy-mode DEPLOY_MODE Whether to launch the driver program locally ("client") or
on one of the worker machines inside the cluster ("cluster")
(Default: client).
--class CLASS_NAME Your application's main class (for Java / Scala apps).
--name NAME A name of your application.
--jars JARS Comma-separated list of local jars to include on the driver
and executor classpaths.
--packages Comma-separated list of maven coordinates of jars to include
on the driver and executor classpaths. Will search the local
maven repo, then maven central and any additional remote
repositories given by --repositories. The format for the
coordinates should be groupId:artifactId:version.
--exclude-packages Comma-separated list of groupId:artifactId, to exclude while
resolving the dependencies provided in --packages to avoid
dependency conflicts.
--repositories Comma-separated list of additional remote repositories to
search for the maven coordinates given with --packages.
--py-files PY_FILES Comma-separated list of .zip, .egg, or .py files to place
on the PYTHONPATH for Python apps.
--files FILES Comma-separated list of files to be placed in the working
directory of each executor.
--conf PROP=VALUE Arbitrary Spark configuration property.
--properties-file FILE Path to a file from which to load extra properties. If not
specified, this will look for conf/spark-defaults.conf.
--driver-memory MEM Memory for driver (e.g. 1000M, 2G) (Default: 1024M).
--driver-java-options Extra Java options to pass to the driver.
--driver-library-path Extra library path entries to pass to the driver.
--driver-class-path Extra class path entries to pass to the driver. Note that
jars added with --jars are automatically included in the
classpath.
--executor-memory MEM Memory per executor (e.g. 1000M, 2G) (Default: 1G).
--proxy-user NAME User to impersonate when submitting the application.
This argument does not work with --principal / --keytab.
--help, -h Show this help message and exit.
--verbose, -v Print additional debug output.
--version, Print the version of current Spark.
Spark standalone with cluster deploy mode only:
--driver-cores NUM Cores for driver (Default: 1).
Spark standalone or Mesos with cluster deploy mode only:
--supervise If given, restarts the driver on failure.
--kill SUBMISSION_ID If given, kills the driver specified.
--status SUBMISSION_ID If given, requests the status of the driver specified.
Spark standalone and Mesos only:
--total-executor-cores NUM Total cores for all executors.
Spark standalone and YARN only:
--executor-cores NUM Number of cores per executor. (Default: 1 in YARN mode,
or all available cores on the worker in standalone mode)
YARN-only:
--driver-cores NUM Number of cores used by the driver, only in cluster mode
(Default: 1).
--queue QUEUE_NAME The YARN queue to submit to (Default: "default").
--num-executors NUM Number of executors to launch (Default: 2).
--archives ARCHIVES Comma separated list of archives to be extracted into the
working directory of each executor.
--principal PRINCIPAL Principal to be used to login to KDC, while running on
secure HDFS.
--keytab KEYTAB The full path to the file that contains the keytab for the
principal specified above. This keytab will be copied to
the node running the Application Master via the Secure
Distributed Cache, for renewing the login tickets and the
delegation tokens periodically.
List of command-line options that take parameters:

- --class
- --conf or -c
- --deploy-mode (see Deploy Mode)
- --driver-class-path (see --driver-class-path command-line option)
- --driver-cores (see Driver Cores in Cluster Deploy Mode)
- --driver-java-options
- --driver-library-path
- --driver-memory
- --executor-memory
- --files
- --jars
- --kill for Standalone cluster mode only
- --master
- --name
- --packages
- --exclude-packages
- --properties-file (see Custom Spark Properties File)
- --proxy-user
- --py-files
- --repositories
- --status for Standalone cluster mode only
- --total-executor-cores
List of switches, i.e. command-line options that do not take parameters:

- --help or -h
- --supervise for Standalone cluster mode only
- --usage-error
- --verbose or -v (see Verbose Mode)
- --version (see Version)
YARN-only options:

- --archives
- --executor-cores
- --keytab
- --num-executors
- --principal
- --queue (see Specifying YARN Resource Queue (--queue switch))
--driver-class-path command-line option

--driver-class-path command-line option sets the extra class path entries (e.g. jars and directories) that should be added to a driver’s JVM.
Tip: Use --driver-class-path in client deploy mode (not SparkConf) to ensure that the CLASSPATH is set up with the entries before the driver JVM starts. client deploy mode uses the same JVM for the driver as spark-submit's.
--driver-class-path sets the internal driverExtraClassPath property (when SparkSubmitArguments.handle is called).

It works for all cluster managers and deploy modes.

If driverExtraClassPath is not set on command line, the spark.driver.extraClassPath setting is used.
Note: Command-line options (e.g. --driver-class-path) have higher precedence than their corresponding Spark settings in a Spark properties file (e.g. spark.driver.extraClassPath). You can therefore control the final settings by overriding Spark settings on command line using the command-line options.
Setting / System Property | Command-Line Option | Description |
---|---|---|
spark.driver.extraClassPath | --driver-class-path | Extra class path entries (e.g. jars and directories) to pass to a driver’s JVM. |
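A sketch of the precedence rule above: the command-line option wins over the corresponding Spark setting from a properties file (the paths and application details are hypothetical):

# Overrides spark.driver.extraClassPath from conf/spark-defaults.conf, if set
./bin/spark-submit \
  --driver-class-path /opt/libs/extra.jar \
  --class com.example.MyApp \
  target/my-app.jar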
Version — --version command-line option
$ ./bin/spark-submit --version
Welcome to
____ __
/ __/__ ___ _____/ /__
_\ \/ _ \/ _ `/ __/ '_/
/___/ .__/\_,_/_/ /_/\_\ version 2.1.0-SNAPSHOT
/_/
Branch master
Compiled by user jacek on 2016-09-30T07:08:39Z
Revision 1fad5596885aab8b32d2307c0edecbae50d5bd7a
Url https://github.com/apache/spark.git
Type --help for more information.
Verbose Mode — --verbose command-line option

When spark-submit is executed with --verbose command-line option, it enters verbose mode.

In verbose mode, the parsed arguments are printed out to the System error output.
FIXME
It also prints out propertiesFile and the properties from the file.
FIXME
Deploy Mode — --deploy-mode command-line option

You use spark-submit’s --deploy-mode command-line option to specify the deploy mode for a Spark application.
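For instance, a minimal sketch of the two deploy modes (the master URL and application details are hypothetical):

# client (default): the driver runs locally, in the same JVM as spark-submit
./bin/spark-submit \
  --master spark://localhost:7077 \
  --deploy-mode client \
  --class com.example.MyApp \
  target/my-app.jar

# cluster: the driver runs on one of the worker machines inside the cluster
./bin/spark-submit \
  --master spark://localhost:7077 \
  --deploy-mode cluster \
  --class com.example.MyApp \
  target/my-app.jar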
Environment Variables

The following is the list of environment variables that are considered when command-line options are not specified:

- MASTER for --master
- SPARK_DRIVER_MEMORY for --driver-memory
- SPARK_EXECUTOR_MEMORY (see Environment Variables in the SparkContext document)
- SPARK_EXECUTOR_CORES
- DEPLOY_MODE
- SPARK_YARN_APP_NAME
- _SPARK_CMD_USAGE
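As an illustration, a sketch of relying on the MASTER environment variable instead of --master (the master URL and application details are hypothetical):

# MASTER is picked up when --master is not given on command line
MASTER=spark://localhost:7077 ./bin/spark-submit \
  --class com.example.MyApp \
  target/my-app.jar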
External packages and custom repositories

The spark-submit utility supports specifying external packages using Maven coordinates with --packages and custom repositories with --repositories.
./bin/spark-submit \
--packages my:awesome:package \
--repositories s3n://$aws_ak:$aws_sak@bucket/path/to/repo
FIXME Why should I care?
SparkSubmit Standalone Application — main method

Tip: The source code of the script lives in https://github.com/apache/spark/blob/master/bin/spark-submit.
When executed, spark-submit script simply passes the call to spark-class with org.apache.spark.deploy.SparkSubmit class followed by command-line arguments.
It creates an instance of SparkSubmitArguments.
If in verbose mode, it prints out the application arguments.
It then relays the execution to action-specific internal methods (with the application arguments):
- submit (the default action, assumed when no action was explicitly given)
- kill (when --kill switch is used)
- requestStatus (when --status switch is used)
Note: The action can only have one of the three available values: SUBMIT, KILL, or REQUEST_STATUS.
Calculating Current Spark Properties — loadEnvironmentArguments internal method

loadEnvironmentArguments(): Unit

loadEnvironmentArguments internal method calculates the settings for the current execution of spark-submit.

loadEnvironmentArguments reads command-line options first, followed by Spark properties and System’s environment variables.
Note: Spark config properties start with spark. prefix and can be set using --conf [key=value] command-line option.
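For example, a sketch of setting arbitrary Spark properties with --conf (the chosen properties and application details are illustrative):

./bin/spark-submit \
  --conf spark.eventLog.enabled=true \
  --conf spark.logConf=true \
  --class com.example.MyApp \
  target/my-app.jar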
spark-env.sh - load additional environment settings

- spark-env.sh consists of environment settings to configure Spark for your site.

  export JAVA_HOME=/your/directory/java
  export HADOOP_HOME=/usr/lib/hadoop
  export SPARK_WORKER_CORES=2
  export SPARK_WORKER_MEMORY=1G

- spark-env.sh is loaded at the startup of Spark’s command-line scripts.
- SPARK_ENV_LOADED env var is to ensure the spark-env.sh script is loaded once.
- SPARK_CONF_DIR points at the directory with spark-env.sh; otherwise $SPARK_HOME/conf is used.
- spark-env.sh is executed if it exists.
- $SPARK_HOME/conf directory has a spark-env.sh.template file that serves as a template for your own custom configuration.
Consult Environment Variables in the official documentation.