Deployment Environments — Run Modes
Spark Deployment Environments (aka Run Modes):
A Spark application can run locally (on a single JVM) or on a cluster (using a cluster manager), with the way the driver is deployed selected by the deploy mode (`--deploy-mode`). See the spark-submit script.
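As a sketch, a typical spark-submit invocation selects both the master URL and the deploy mode; the application class, jar name and host below are placeholders, not values from this document:

```
# Submit a (hypothetical) application to a Spark Standalone cluster,
# running the driver inside the cluster (cluster deploy mode).
./bin/spark-submit \
  --master spark://master-host:7077 \
  --deploy-mode cluster \
  --class com.example.MyApp \
  my-app.jar
```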
Master URLs
Spark supports the following master URLs (see the private object `SparkMasterRegex`):

- `local`, `local[N]` and `local[*]` for Spark local
- `local[N, maxRetries]` for Spark local-with-retries
- `local-cluster[N, cores, memory]` for simulating a Spark cluster of `N` executors (threads), `cores` CPUs and `memory` locally
- `spark://host:port,host1:port1,…` for connecting to Spark Standalone cluster(s)
- `mesos://` for Spark on Mesos cluster
- `yarn` for Spark on YARN
You use a master URL with spark-submit as the value of the `--master` command-line option, or when creating a SparkContext using the `setMaster` method.
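For example, a minimal sketch of setting the master URL programmatically (the application name is just a placeholder):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Build a configuration with a local master URL that uses all available cores.
val conf = new SparkConf()
  .setAppName("my-spark-app")
  .setMaster("local[*]")

// Create the SparkContext against that master.
val sc = new SparkContext(conf)

// ... run jobs ...

sc.stop()
```

Note that a master URL passed to `--master` on the command line takes precedence over one hard-coded with `setMaster`, which is why production applications usually leave the master unset in code and supply it at submit time.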