Deployment Environments — Run Modes
A Spark application can run locally (on a single JVM) or on a cluster (using a cluster manager), and its driver can be deployed in one of two deploy modes (controlled by the `--deploy-mode` option of the spark-submit script).
Master URLs
Spark supports the following master URLs (see the private object SparkMasterRegex):

- `local`, `local[N]` and `local[*]` for Spark local
- `local[N, maxRetries]` for Spark local-with-retries
- `local-cluster[N, cores, memory]` for simulating a Spark cluster of `N` executors (threads), `cores` CPUs and `memory` locally
- `spark://host:port,host1:port1,…` for connecting to Spark Standalone cluster(s)
- `mesos://` for Spark on a Mesos cluster
- `yarn` for Spark on YARN
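Since `SparkMasterRegex` is a private object, its exact patterns are internal to Spark; the sketch below is a hypothetical approximation of how such master URLs can be pattern-matched with regular expressions (the regexes, the `MasterUrlDemo` object and the `describe` method are illustrative names, not Spark's own):

```scala
// Hypothetical sketch modeled on the idea behind Spark's private SparkMasterRegex:
// pattern-match a master URL string to a human-readable run-mode description.
object MasterUrlDemo {
  val LOCAL_N         = """local\[([0-9]+|\*)\]""".r
  val LOCAL_N_RETRIES = """local\[([0-9]+|\*)\s*,\s*([0-9]+)\]""".r
  val LOCAL_CLUSTER   = """local-cluster\[\s*([0-9]+)\s*,\s*([0-9]+)\s*,\s*([0-9]+)\s*\]""".r
  val SPARK_URL       = """spark://(.+)""".r

  def describe(master: String): String = master match {
    case "local"                       => "local with 1 thread"
    case LOCAL_N(n)                    => s"local with $n threads"
    case LOCAL_N_RETRIES(n, retries)   => s"local with $n threads, $retries retries"
    case LOCAL_CLUSTER(n, cores, mem)  => s"local cluster: $n executors, $cores cores, $mem MB each"
    case SPARK_URL(hosts)              => s"standalone cluster at $hosts"
    case m if m.startsWith("mesos://") => "Mesos cluster"
    case "yarn"                        => "YARN cluster"
    case _                             => "unknown master URL"
  }
}
```

Note that Scala's `Regex` extractors match the whole string here, so `local[2]` and `local[2, 3]` are routed to different cases without ambiguity.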
You use a master URL with spark-submit as the value of the `--master` command-line option, or when creating a `SparkContext` using the `setMaster` method.
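For example, both routes can be seen side by side in this minimal sketch (it assumes the Spark libraries are on the classpath; the application name is illustrative):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Two equivalent ways to set the master URL:
// 1) on the command line:   spark-submit --master local[*] ...
// 2) programmatically, via SparkConf.setMaster:
val conf = new SparkConf()
  .setAppName("master-url-demo") // an app name is required to create a SparkContext
  .setMaster("local[*]")         // run locally, using all available cores
val sc = new SparkContext(conf)
println(sc.master)               // prints the effective master URL, e.g. local[*]
sc.stop()
```

A master URL passed to `--master` on the command line takes effect without recompiling the application, which is why hard-coding `setMaster` is usually reserved for local development.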