org.apache.spark.sql.internal

SQLConf

class SQLConf extends Serializable with Logging

A class that enables the setting and getting of mutable config parameters/hints.

In the presence of a SQLContext, these can be set and queried by passing SET commands into Spark SQL's query functions (i.e. sql()). Otherwise, users can read and modify the hints programmatically through this class's getters and setters.

SQLConf is thread-safe (internally synchronized, so safe to be used in multiple threads).
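
For illustration, a minimal sketch of programmatic use, assuming a standalone SQLConf instance (in practice the conf usually comes from a session's state rather than being constructed directly):

  import org.apache.spark.sql.internal.SQLConf

  val conf = new SQLConf()
  conf.setConfString("spark.sql.shuffle.partitions", "10")
  assert(conf.numShufflePartitions == 10)  // the typed getter reads the same entry
  conf.unsetConf("spark.sql.shuffle.partitions")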

Linear Supertypes
Logging, Serializable, Serializable, AnyRef, Any

Instance Constructors

  1. new SQLConf()

Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. def adaptiveExecutionEnabled: Boolean

  7. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  8. def autoBroadcastJoinThreshold: Long

  9. def broadcastTimeout: Int

  10. def bucketingEnabled: Boolean

  11. def caseSensitiveAnalysis: Boolean

  12. def caseSensitiveInferenceMode: SQLConf.HiveCaseSensitiveInferenceMode.Value

  13. def checkpointLocation: Option[String]

  14. def clear(): Unit

  15. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  16. def columnBatchSize: Int

  17. def columnNameOfCorruptRecord: String

  18. def contains(key: String): Boolean

    Return whether a given key is set in this SQLConf.
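
    For example (illustrative key):

      val conf = new SQLConf()
      conf.setConfString("spark.sql.codegen.wholeStage", "false")
      assert(conf.contains("spark.sql.codegen.wholeStage"))
      conf.clear()  // removes every explicitly set entry
      assert(!conf.contains("spark.sql.codegen.wholeStage"))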

  19. def convertCTAS: Boolean

  20. def crossJoinEnabled: Boolean

  21. def dataFramePivotMaxValues: Int

  22. def dataFrameRetainGroupColumns: Boolean

  23. def dataFrameSelfJoinAutoResolveAmbiguity: Boolean

  24. def defaultDataSourceName: String

  25. def defaultSizeInBytes: Long

  26. def enableRadixSort: Boolean

  27. def enableTwoLevelAggMap: Boolean

  28. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  29. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  30. def exchangeReuseEnabled: Boolean

  31. def fallBackToHdfsForStatsEnabled: Boolean

  32. def fileCommitProtocolClass: String

  33. def fileSinkLogCleanupDelay: Long

  34. def fileSinkLogCompactInterval: Int

  35. def fileSinkLogDeletion: Boolean

  36. def fileSourceLogCleanupDelay: Long

  37. def fileSourceLogCompactInterval: Int

  38. def fileSourceLogDeletion: Boolean

  39. def filesMaxPartitionBytes: Long

  40. def filesOpenCostInBytes: Long

  41. def filesourcePartitionFileCacheSize: Long

  42. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  43. def gatherFastStats: Boolean

  44. def getAllConfs: Map[String, String]

    Return all the configuration properties that have been set (i.e. not the defaults). This creates a new copy of the config properties in the form of a Map.
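
    For instance (illustrative key):

      val conf = new SQLConf()
      conf.setConfString("spark.sql.shuffle.partitions", "64")
      conf.getAllConfs.foreach { case (k, v) => println(s"$k = $v") }
      // prints only the explicitly set entry, not every default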

  45. def getAllDefinedConfs: Seq[(String, String, String)]

    Return all the configuration definitions that have been defined in SQLConf. Each definition contains key, defaultValue and doc.
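
    For example, listing every registered entry with its default value and documentation string:

      val conf = new SQLConf()
      conf.getAllDefinedConfs.foreach { case (key, default, doc) =>
        println(s"$key (default: $default): $doc")
      }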

  46. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  47. def getConf[T](entry: OptionalConfigEntry[T]): Option[T]

    Return the value of an optional Spark SQL configuration property for the given key. If the key is not set yet, returns None.

  48. def getConf[T](entry: ConfigEntry[T]): T

    Return the value of Spark SQL configuration property for the given key. If the key is not set yet, return the defaultValue in ConfigEntry.

  49. def getConf[T](entry: ConfigEntry[T], defaultValue: T): T

    Return the value of Spark SQL configuration property for the given key. If the key is not set yet, return defaultValue. This is useful when the defaultValue in ConfigEntry is not the desired one.
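
    A sketch of the three overloads. SQLConf.SHUFFLE_PARTITIONS (a ConfigEntry[Int]) and SQLConf.CHECKPOINT_LOCATION (an OptionalConfigEntry[String]) live in the SQLConf companion object and are assumed to be visible from the calling scope:

      val conf = new SQLConf()

      // Typed entry: falls back to the entry's own default when unset.
      val n: Int = conf.getConf(SQLConf.SHUFFLE_PARTITIONS)

      // Caller-supplied fallback, overriding the entry's default.
      val n2: Int = conf.getConf(SQLConf.SHUFFLE_PARTITIONS, 400)

      // Optional entry: None when unset.
      val loc: Option[String] = conf.getConf(SQLConf.CHECKPOINT_LOCATION)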

  50. def getConfString(key: String, defaultValue: String): String

    Return the string value of Spark SQL configuration property for the given key. If the key is not set yet, return defaultValue.

  51. def getConfString(key: String): String

    Return the value of Spark SQL configuration property for the given key.

    Annotations
    @throws( "if key is not set" )
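
    For example (illustrative keys; in this sketch the no-default overload throws NoSuchElementException for an unregistered, unset key):

      val conf = new SQLConf()
      val codec = conf.getConfString("spark.sql.parquet.compression.codec", "snappy")
      // conf.getConfString("some.unset.key") would throw NoSuchElementException
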
  52. def groupByOrdinal: Boolean

  53. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  54. def ignoreCorruptFiles: Boolean

  55. def inMemoryPartitionPruning: Boolean

  56. def initializeLogIfNecessary(isInterpreter: Boolean): Unit

    Attributes
    protected
    Definition Classes
    Logging
  57. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  58. def isParquetBinaryAsString: Boolean

  59. def isParquetINT96AsTimestamp: Boolean

  60. def isParquetSchemaMergingEnabled: Boolean

  61. def isParquetSchemaRespectSummaries: Boolean

  62. def isTraceEnabled(): Boolean

    Attributes
    protected
    Definition Classes
    Logging
  63. def isUnsupportedOperationCheckEnabled: Boolean

  64. def limitScaleUpFactor: Int

  65. def log: Logger

    Attributes
    protected
    Definition Classes
    Logging
  66. def logDebug(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  67. def logDebug(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  68. def logError(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  69. def logError(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  70. def logInfo(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  71. def logInfo(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  72. def logName: String

    Attributes
    protected
    Definition Classes
    Logging
  73. def logTrace(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  74. def logTrace(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  75. def logWarning(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  76. def logWarning(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  77. def manageFilesourcePartitions: Boolean

  78. def maxCaseBranchesForCodegen: Int

  79. def metastorePartitionPruning: Boolean

  80. def minBatchesToRetain: Int

  81. def minNumPostShufflePartitions: Int

  82. def ndvMaxError: Double

  83. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  84. final def notify(): Unit

    Definition Classes
    AnyRef
  85. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  86. def numShufflePartitions: Int

  87. def optimizerInSetConversionThreshold: Int

  88. def optimizerMaxIterations: Int

  89. def optimizerMetadataOnly: Boolean

  90. def orcFilterPushDown: Boolean

  91. def orderByOrdinal: Boolean

  92. def parallelPartitionDiscoveryParallelism: Int

  93. def parallelPartitionDiscoveryThreshold: Int

  94. def parquetCacheMetadata: Boolean

  95. def parquetCompressionCodec: String

  96. def parquetFilterPushDown: Boolean

  97. def parquetOutputCommitterClass: String

  98. def parquetVectorizedReaderEnabled: Boolean

  99. def partitionColumnTypeInferenceEnabled: Boolean

  100. def preferSortMergeJoin: Boolean

  101. def resolver: (String, String) ⇒ Boolean

    Returns the Resolver for the current configuration, which can be used to determine if two identifiers are equal.
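
    A sketch of how the resolver tracks caseSensitiveAnalysis (spark.sql.caseSensitive, false by default):

      val conf = new SQLConf()
      assert(conf.resolver("userId", "USERID"))   // case-insensitive by default
      conf.setConfString("spark.sql.caseSensitive", "true")
      assert(!conf.resolver("userId", "USERID"))  // now an exact match is required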

  102. def runSQLonFile: Boolean

  103. def setConf[T](entry: ConfigEntry[T], value: T): Unit

    Set the given Spark SQL configuration property.

  104. def setConf(props: Properties): Unit

    Set Spark SQL configuration properties.

  105. def setConfString(key: String, value: String): Unit

    Set the given Spark SQL configuration property using a string value.
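
    A sketch of the three setters. SQLConf.CASE_SENSITIVE (a ConfigEntry[Boolean] in the companion object) is assumed to be visible from the calling scope:

      import java.util.Properties

      val conf = new SQLConf()

      // Typed entry (assumed visible):
      conf.setConf(SQLConf.CASE_SENSITIVE, true)

      // Raw string key/value:
      conf.setConfString("spark.sql.shuffle.partitions", "32")

      // Bulk-load from java.util.Properties:
      val props = new Properties()
      props.setProperty("spark.sql.orc.filterPushdown", "true")
      conf.setConf(props)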

  106. val settings: Map[String, String]

    Only a low degree of contention is expected for conf, thus NOT using ConcurrentHashMap.

    Attributes
    protected[org.apache.spark]
  107. def stateStoreMinDeltasForSnapshot: Int

  108. def streamingFileCommitProtocolClass: String

  109. def streamingMetricsEnabled: Boolean

  110. def streamingNoDataProgressEventInterval: Long

  111. def streamingPollingDelay: Long

  112. def streamingProgressRetention: Int

  113. def streamingSchemaInference: Boolean

  114. def subexpressionEliminationEnabled: Boolean

  115. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  116. def targetPostShuffleInputSize: Long

  117. def toString(): String

    Definition Classes
    AnyRef → Any
  118. def unsetConf(entry: ConfigEntry[_]): Unit

  119. def unsetConf(key: String): Unit

  120. def useCompression: Boolean

  121. def variableSubstituteDepth: Int

  122. def variableSubstituteEnabled: Boolean

  123. def verifyPartitionPath: Boolean

  124. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  125. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  126. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  127. def warehousePath: String

  128. def wholeStageEnabled: Boolean

  129. def wholeStageFallback: Boolean

  130. def wholeStageMaxNumFields: Int

  131. def writeLegacyParquetFormat: Boolean
