Example 8-2. Creating an application using a SparkConf in Scala
// Construct a conf
val conf = new SparkConf()
conf.set("spark.app.name", "My Spark App")
conf.set("spark.master", "local[4]")
conf.set("spark.ui.port", "36000") // Override the default port
// Create a SparkContext with this configuration
val sc = new SparkContext(conf)
Example 8-3. Creating an application using a SparkConf in Java
// Construct a conf
SparkConf conf = new SparkConf();
conf.set("spark.app.name", "My Spark App");
conf.set("spark.master", "local[4]");
conf.set("spark.ui.port", "36000"); // Override the default port
// Create a SparkContext with this configuration
JavaSparkContext sc = new JavaSparkContext(conf);
The SparkConf class is quite simple: a SparkConf instance contains key/value pairs of configuration options the user would like to override. Every configuration option in Spark is based on a string key and value. To use a SparkConf object, you create one, call set() to add configuration values, and then supply it to the SparkContext constructor. In addition to set(), the SparkConf class includes a small number of utility methods for setting common parameters. In the preceding examples, you could also call setAppName() and setMaster() to set the spark.app.name and spark.master configurations, respectively.
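Because set(), setAppName(), and setMaster() each return the SparkConf itself, the same configuration can also be written in a fluent, chained style. The following is a minimal sketch of that pattern:

import org.apache.spark.{SparkConf, SparkContext}

// Equivalent configuration using the chained utility methods
val conf = new SparkConf()
  .setAppName("My Spark App")    // sets spark.app.name
  .setMaster("local[4]")         // sets spark.master
  .set("spark.ui.port", "36000") // no utility method; use the generic set()
val sc = new SparkContext(conf)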
In these examples, the SparkConf values are set programmatically in the application code. In many cases, it is more convenient to populate configurations dynamically for a given application. Spark allows setting configurations dynamically through the spark-submit tool. When an application is launched with spark-submit, it injects configuration values into the environment; these are detected and automatically filled in when a new SparkConf is constructed. Therefore, an application launched with spark-submit can simply construct an "empty" SparkConf and pass it directly to the SparkContext constructor.
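As a minimal sketch, an application configured entirely through spark-submit could reduce to:

import org.apache.spark.{SparkConf, SparkContext}

// An "empty" conf: any values injected by spark-submit are detected
// and filled in automatically when the SparkConf is constructed.
val sc = new SparkContext(new SparkConf())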
The spark-submit tool provides built-in flags for the most common Spark configuration parameters and a generic --conf flag that accepts any Spark configuration value. These are demonstrated in Example 8-4.
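As a rough sketch of that pattern (Example 8-4 demonstrates it in full), an invocation might look like the following; the main class com.example.MyApp and my-app.jar are hypothetical placeholders:

# --name and --master are built-in flags; --conf accepts any key/value pair.
# com.example.MyApp and my-app.jar are hypothetical placeholders.
$ spark-submit \
  --class com.example.MyApp \
  --master local[4] \
  --name "My Spark App" \
  --conf spark.ui.port=36000 \
  my-app.jar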