org.apache.spark.sql.internal

SQLConf

class SQLConf extends Serializable with Logging

A class that enables the setting and getting of mutable config parameters/hints.

In the presence of a SQLContext, these can be set and queried by passing SET commands into Spark SQL's query functions (e.g. sql()). Otherwise, users of this class can modify the hints programmatically through its setters and getters.

SQLConf is thread-safe (internally synchronized, so safe to be used in multiple threads).
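
A minimal sketch of both access paths, assuming a standalone SQLConf instance for the programmatic case and an existing SparkSession named spark for the SQL path (the key shown is one of the built-in ones):

  import org.apache.spark.sql.internal.SQLConf

  // Programmatic access on a standalone instance.
  val conf = new SQLConf()
  conf.setConfString("spark.sql.shuffle.partitions", "64")
  assert(conf.numShufflePartitions == 64)

  // Through a SQLContext/SparkSession, SET commands reach the session's conf:
  // spark.sql("SET spark.sql.shuffle.partitions=64")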

Linear Supertypes
Logging, Serializable, Serializable, AnyRef, Any

Instance Constructors

  1. new SQLConf()

Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. def adaptiveExecutionEnabled: Boolean

  7. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  8. def autoBroadcastJoinThreshold: Long

  9. def broadcastTimeout: Int

  10. def bucketingEnabled: Boolean

  11. def cartesianProductExecBufferInMemoryThreshold: Int

  12. def cartesianProductExecBufferSpillThreshold: Int

  13. def caseSensitiveAnalysis: Boolean

  14. def caseSensitiveInferenceMode: SQLConf.HiveCaseSensitiveInferenceMode.Value

  15. def cboEnabled: Boolean

  16. def checkpointLocation: Option[String]

  17. def clear(): Unit

  18. def clone(): SQLConf

    Definition Classes
    SQLConf → AnyRef
  19. def columnBatchSize: Int

  20. def columnNameOfCorruptRecord: String

  21. def constraintPropagationEnabled: Boolean

  22. def contains(key: String): Boolean

    Return whether a given key is set in this SQLConf.

  23. def convertCTAS: Boolean

  24. def copy(entries: (ConfigEntry[_], Any)*): SQLConf

  25. def crossJoinEnabled: Boolean

  26. def dataFramePivotMaxValues: Int

  27. def dataFrameRetainGroupColumns: Boolean

  28. def dataFrameSelfJoinAutoResolveAmbiguity: Boolean

  29. def defaultDataSourceName: String

  30. def defaultSizeInBytes: Long

  31. def enableRadixSort: Boolean

  32. def enableTwoLevelAggMap: Boolean

  33. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  34. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  35. def escapedStringLiterals: Boolean

  36. def exchangeReuseEnabled: Boolean

  37. def fallBackToHdfsForStatsEnabled: Boolean

  38. def fileCommitProtocolClass: String

  39. def fileSinkLogCleanupDelay: Long

  40. def fileSinkLogCompactInterval: Int

  41. def fileSinkLogDeletion: Boolean

  42. def fileSourceLogCleanupDelay: Long

  43. def fileSourceLogCompactInterval: Int

  44. def fileSourceLogDeletion: Boolean

  45. def filesMaxPartitionBytes: Long

  46. def filesOpenCostInBytes: Long

  47. def filesourcePartitionFileCacheSize: Long

  48. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  49. def gatherFastStats: Boolean

  50. def getAllConfs: Map[String, String]

Return all the configuration properties that have been set (i.e. not the default). This creates a new copy of the config properties in the form of a Map.

  51. def getAllDefinedConfs: Seq[(String, String, String)]

Return all the configuration definitions that have been defined in SQLConf. Each definition contains key, defaultValue and doc.

  52. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  53. def getConf[T](entry: OptionalConfigEntry[T]): Option[T]

Return the value of an optional Spark SQL configuration property for the given key. If the key is not set yet, returns None.

  54. def getConf[T](entry: ConfigEntry[T]): T

Return the value of a Spark SQL configuration property for the given key. If the key is not set yet, return the defaultValue in ConfigEntry. See the usage sketches after the member list.

  55. def getConf[T](entry: ConfigEntry[T], defaultValue: T): T

Return the value of a Spark SQL configuration property for the given key. If the key is not set yet, return defaultValue. This is useful when the defaultValue in ConfigEntry is not the desired one.

  56. def getConfString(key: String, defaultValue: String): String

Return the string value of a Spark SQL configuration property for the given key. If the key is not set yet, return defaultValue. See the usage sketches after the member list.

  57. def getConfString(key: String): String

Return the value of a Spark SQL configuration property for the given key, throwing an exception if the key is not set.

    Annotations
    @throws( "if key is not set" )
  58. def groupByAliases: Boolean

  59. def groupByOrdinal: Boolean

  60. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  61. def hiveThriftServerSingleSession: Boolean

  62. def ignoreCorruptFiles: Boolean

  63. def inMemoryPartitionPruning: Boolean

  64. def initializeLogIfNecessary(isInterpreter: Boolean): Unit

    Attributes
    protected
    Definition Classes
    Logging
  65. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  66. def isParquetBinaryAsString: Boolean

  67. def isParquetINT64AsTimestampMillis: Boolean

  68. def isParquetINT96AsTimestamp: Boolean

  69. def isParquetSchemaMergingEnabled: Boolean

  70. def isParquetSchemaRespectSummaries: Boolean

  71. def isTraceEnabled(): Boolean

    Attributes
    protected
    Definition Classes
    Logging
  72. def isUnsupportedOperationCheckEnabled: Boolean

  73. def joinReorderCardWeight: Double

  74. def joinReorderDPStarFilter: Boolean

  75. def joinReorderDPThreshold: Int

  76. def joinReorderEnabled: Boolean

  77. def limitScaleUpFactor: Int

  78. def log: Logger

    Attributes
    protected
    Definition Classes
    Logging
  79. def logDebug(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  80. def logDebug(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  81. def logError(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  82. def logError(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  83. def logInfo(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  84. def logInfo(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  85. def logName: String

    Attributes
    protected
    Definition Classes
    Logging
  86. def logTrace(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  87. def logTrace(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  88. def logWarning(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  89. def logWarning(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  90. def manageFilesourcePartitions: Boolean

  91. def maxCaseBranchesForCodegen: Int

  92. def maxNestedViewDepth: Int

  93. def maxRecordsPerFile: Long

  94. def metastorePartitionPruning: Boolean

  95. def minBatchesToRetain: Int

  96. def minNumPostShufflePartitions: Int

  97. def ndvMaxError: Double

  98. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  99. final def notify(): Unit

    Definition Classes
    AnyRef
  100. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  101. def numShufflePartitions: Int

  102. def objectAggSortBasedFallbackThreshold: Int

  103. def optimizerInSetConversionThreshold: Int

  104. def optimizerMaxIterations: Int

  105. def optimizerMetadataOnly: Boolean

  106. def orcFilterPushDown: Boolean

  107. def orderByOrdinal: Boolean

  108. def parallelPartitionDiscoveryParallelism: Int

  109. def parallelPartitionDiscoveryThreshold: Int

  110. def parquetCacheMetadata: Boolean

  111. def parquetCompressionCodec: String

  112. def parquetFilterPushDown: Boolean

  113. def parquetOutputCommitterClass: String

  114. def parquetVectorizedReaderEnabled: Boolean

  115. def partitionColumnTypeInferenceEnabled: Boolean

  116. def preferSortMergeJoin: Boolean

  117. def redactOptions(options: Map[String, String]): Map[String, String]

Redacts the given option map according to the description of SQL_OPTIONS_REDACTION_PATTERN. See the redaction sketch after the member list.

  118. def resolver: (String, String) ⇒ Boolean

Returns the Resolver for the current configuration, which can be used to determine whether two identifiers are considered equal. See the resolver sketch after the member list.

  119. def runSQLonFile: Boolean

  120. def sessionLocalTimeZone: String

  121. def setConf[T](entry: ConfigEntry[T], value: T): Unit

    Set the given Spark SQL configuration property.

  122. def setConf(props: Properties): Unit

    Set Spark SQL configuration properties.

  123. def setConfString(key: String, value: String): Unit

    Set the given Spark SQL configuration property using a string value.

  124. val settings: Map[String, String]

Only a low degree of contention is expected on the conf, thus NOT using ConcurrentHashMap.

    Attributes
    protected[org.apache.spark]
  125. def sortMergeJoinExecBufferInMemoryThreshold: Int

  126. def sortMergeJoinExecBufferSpillThreshold: Int

  127. def starSchemaDetection: Boolean

  128. def starSchemaFTRatio: Double

  129. def stateStoreMinDeltasForSnapshot: Int

  130. def streamingFileCommitProtocolClass: String

  131. def streamingMetricsEnabled: Boolean

  132. def streamingNoDataProgressEventInterval: Long

  133. def streamingPollingDelay: Long

  134. def streamingProgressRetention: Int

  135. def streamingSchemaInference: Boolean

  136. def subexpressionEliminationEnabled: Boolean

  137. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  138. def tableRelationCacheSize: Int

  139. def targetPostShuffleInputSize: Long

  140. def toString(): String

    Definition Classes
    AnyRef → Any
  141. def unsetConf(entry: ConfigEntry[_]): Unit

  142. def unsetConf(key: String): Unit

  143. def useCompression: Boolean

  144. def useObjectHashAggregation: Boolean

  145. def variableSubstituteDepth: Int

  146. def variableSubstituteEnabled: Boolean

  147. def verifyPartitionPath: Boolean

  148. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  149. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  150. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  151. def warehousePath: String

  152. def wholeStageEnabled: Boolean

  153. def wholeStageFallback: Boolean

  154. def wholeStageMaxNumFields: Int

  155. def windowExecBufferInMemoryThreshold: Int

  156. def windowExecBufferSpillThreshold: Int

  157. def writeLegacyParquetFormat: Boolean
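
Usage Examples

A sketch of the typed accessors. SQLConf.SHUFFLE_PARTITIONS and SQLConf.AUTO_BROADCASTJOIN_THRESHOLD are assumed to be ConfigEntry constants defined in the SQLConf companion object; treat the exact constant names as assumptions.

  import org.apache.spark.sql.internal.SQLConf

  val conf = new SQLConf()

  // Typed set/get: values are checked against the entry's type, no string parsing.
  conf.setConf(SQLConf.SHUFFLE_PARTITIONS, 32)
  val parts: Int = conf.getConf(SQLConf.SHUFFLE_PARTITIONS)               // 32

  // The two-argument overload supplies a caller-side fallback that wins over
  // the entry's own default when the key is unset.
  val threshold: Long = conf.getConf(SQLConf.AUTO_BROADCASTJOIN_THRESHOLD, -1L)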
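
String-keyed access and inspection of what has been set; the keys are built-in ones chosen only for illustration.

  import java.util.Properties
  import org.apache.spark.sql.internal.SQLConf

  val conf = new SQLConf()
  conf.setConfString("spark.sql.cbo.enabled", "true")

  val props = new Properties()
  props.setProperty("spark.sql.crossJoin.enabled", "true")
  conf.setConf(props)                                   // bulk-load from Properties

  assert(conf.contains("spark.sql.cbo.enabled"))
  conf.getConfString("some.unset.key", "fallback")      // => "fallback"
  // getConfString("some.unset.key") with no default would throw instead.

  // getAllConfs returns only explicitly set keys, not entry defaults.
  conf.getAllConfs.foreach { case (k, v) => println(s"$k=$v") }

  conf.unsetConf("spark.sql.cbo.enabled")
  conf.clear()                                          // drop all set values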
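
How the resolver tracks the case-sensitivity setting; spark.sql.caseSensitive defaults to false, so analysis is case-insensitive out of the box.

  import org.apache.spark.sql.internal.SQLConf

  val conf = new SQLConf()
  assert(conf.resolver("myColumn", "MYCOLUMN"))         // case-insensitive default

  conf.setConfString("spark.sql.caseSensitive", "true")
  assert(!conf.resolver("myColumn", "MYCOLUMN"))        // now exact matching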
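
A sketch of option redaction. The pattern behind SQL_OPTIONS_REDACTION_PATTERN and the exact placeholder text are assumptions about the defaults; verify them against your Spark version.

  import org.apache.spark.sql.internal.SQLConf

  val conf = new SQLConf()
  val options = Map(
    "url"      -> "jdbc:postgresql://db:5432/sales?user=app",
    "password" -> "secret"
  )

  // Values whose keys match the configured redaction pattern are replaced with
  // a placeholder, e.g. before being written to logs or the web UI.
  val safe = conf.redactOptions(options)
  println(safe("password"))                             // redacted placeholder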
