Class SparkConf

Object
org.apache.spark.SparkConf
All Implemented Interfaces:
Serializable, Cloneable, org.apache.spark.internal.Logging, ReadOnlySparkConf

public class SparkConf extends Object implements ReadOnlySparkConf, Cloneable, org.apache.spark.internal.Logging, Serializable
Configuration for a Spark application. Used to set various Spark parameters as key-value pairs.

Most of the time, you would create a SparkConf object with new SparkConf(), which will load values from any spark.* Java system properties set in your application as well. In this case, parameters you set directly on the SparkConf object take priority over system properties.

For unit tests, you can also call new SparkConf(false) to skip loading external settings and get the same configuration no matter what the system properties are.

All setter methods in this class support chaining. For example, you can write new SparkConf().setMaster("local").setAppName("My app").
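Chaining works because each setter stores the key-value pair and returns the receiver. A minimal, self-contained sketch of that pattern (a hypothetical MiniConf class for illustration, not Spark's actual implementation):

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical stand-in for SparkConf, illustrating the chainable-setter pattern.
public class MiniConf {
    private final Map<String, String> settings = new HashMap<>();

    // Each setter stores the pair and returns `this`, which enables chaining.
    public MiniConf set(String key, String value) {
        settings.put(key, value);
        return this;
    }

    public MiniConf setMaster(String master) {
        return set("spark.master", master);
    }

    public MiniConf setAppName(String name) {
        return set("spark.app.name", name);
    }

    public String get(String key) {
        return settings.get(key);
    }

    public static void main(String[] args) {
        // Mirrors the example from the text: new SparkConf().setMaster("local").setAppName("My app")
        MiniConf conf = new MiniConf().setMaster("local").setAppName("My app");
        System.out.println(conf.get("spark.master"));   // local
        System.out.println(conf.get("spark.app.name")); // My app
    }
}
```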

Parameters:
loadDefaults - whether to also load values from Java system properties

Note:
Once a SparkConf object is passed to Spark, it is cloned and can no longer be modified by the user. Spark does not support modifying the configuration at runtime.
  • Constructor Details

    • SparkConf

      public SparkConf(boolean loadDefaults)
    • SparkConf

      public SparkConf()
      Create a SparkConf that loads defaults from system properties and the classpath.
  • Method Details

    • isExecutorStartupConf

      public static boolean isExecutorStartupConf(String name)
      Return whether the given config should be passed to an executor on start-up.

      Certain authentication configs are required from the executor when it connects to the scheduler, while the rest of the spark configs can be inherited from the driver later.

      Parameters:
      name - the configuration key to check
      Returns:
      true if the given config should be passed to an executor on start-up
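Such a predicate can be implemented as a prefix check against the keys an executor needs before it can connect to the scheduler. A self-contained sketch under that assumption (the prefix list here is hypothetical; Spark's actual set of startup-required configs is defined internally):

```java
import java.util.List;

public class ExecutorConfFilter {
    // Hypothetical prefix list for illustration only; Spark's real set of
    // startup-required configuration keys is maintained internally.
    private static final List<String> STARTUP_PREFIXES =
        List.of("spark.auth", "spark.rpc", "spark.network");

    // Sketch of an isExecutorStartupConf-style predicate: keep only the keys
    // that must be present when the executor process starts.
    public static boolean isExecutorStartupConf(String name) {
        return STARTUP_PREFIXES.stream().anyMatch(name::startsWith);
    }

    public static void main(String[] args) {
        System.out.println(isExecutorStartupConf("spark.auth.secret"));     // true
        System.out.println(isExecutorStartupConf("spark.executor.memory")); // false
    }
}
```

Everything the predicate rejects can be shipped from the driver later, which keeps the executor launch command small.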
    • isSparkPortConf

      public static boolean isSparkPortConf(String name)
      Return true if the given config matches either spark.*.port or spark.port.*.
      Parameters:
      name - the configuration key to check
      Returns:
      true if the key matches either spark.*.port or spark.port.*
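The documented rule reduces to two string checks: the key either starts with spark. and ends with .port, or starts with spark.port.. A self-contained sketch that mirrors the contract as stated (not Spark's source):

```java
public class PortConfCheck {
    // Sketch of the documented rule: a key is a port config if it matches
    // spark.*.port (e.g. spark.driver.port) or spark.port.* (e.g. spark.port.maxRetries).
    public static boolean isSparkPortConf(String name) {
        return (name.startsWith("spark.") && name.endsWith(".port"))
            || name.startsWith("spark.port.");
    }

    public static void main(String[] args) {
        System.out.println(isSparkPortConf("spark.driver.port"));     // true
        System.out.println(isSparkPortConf("spark.port.maxRetries")); // true
        System.out.println(isSparkPortConf("spark.executor.memory")); // false
    }
}
```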
    • getDeprecatedConfig

      public static scala.Option<String> getDeprecatedConfig(String key, Map<String,String> conf)
      Looks for available deprecated keys for the given config option, and returns the first value available.
      Parameters:
      key - the configuration key to look up
      conf - the configuration map to search for deprecated variants of the key
      Returns:
      the value of the first deprecated key found in conf, or None if none is set
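The lookup amounts to scanning a table of deprecated alternates for the key, in order, and returning the first one that is present in the given map. A self-contained sketch of that logic, using java.util.Optional in place of scala.Option (the alternates table below is a hypothetical example; Spark keeps its real deprecated-key mapping internally):

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.Optional;

public class DeprecatedLookup {
    // Hypothetical alternates table for illustration; Spark's actual
    // deprecated-key mapping is maintained inside SparkConf.
    private static final Map<String, List<String>> ALTERNATES = Map.of(
        "spark.executor.newKey", List.of("spark.executor.oldKey", "spark.executor.olderKey")
    );

    // Sketch of a getDeprecatedConfig-style lookup: walk the deprecated
    // alternates for `key` in order and return the first value present in `conf`.
    public static Optional<String> getDeprecatedConfig(String key, Map<String, String> conf) {
        return ALTERNATES.getOrDefault(key, List.of()).stream()
            .map(conf::get)
            .filter(v -> v != null)
            .findFirst();
    }

    public static void main(String[] args) {
        Map<String, String> conf = new LinkedHashMap<>();
        conf.put("spark.executor.oldKey", "true");
        System.out.println(getDeprecatedConfig("spark.executor.newKey", conf)); // Optional[true]
        System.out.println(getDeprecatedConfig("spark.app.name", conf));        // Optional.empty
    }
}
```

Pairing this with logDeprecationWarning lets callers honor old keys while nudging users toward the current names.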
    • logDeprecationWarning

      public static void logDeprecationWarning(String key)
      Logs a warning message if the given config key is deprecated.
      Parameters:
      key - the configuration key to check for deprecation
    • org$apache$spark$internal$Logging$$log_

      public static org.slf4j.Logger org$apache$spark$internal$Logging$$log_()
    • org$apache$spark$internal$Logging$$log__$eq

      public static void org$apache$spark$internal$Logging$$log__$eq(org.slf4j.Logger x$1)
    • LogStringContext

      public static org.apache.spark.internal.Logging.LogStringContext LogStringContext(scala.StringContext sc)
    • set

      public SparkConf set(