abstract class BaseSessionStateBuilder extends AnyRef
Builder class that coordinates construction of a new SessionState.
The builder explicitly defines all components needed by the session state, and creates a session state when build is called. Components should only be initialized once. This is not a problem for most components, as they are only used in the build function. However, some components (conf, catalog, functionRegistry, experimentalMethods and sqlParser) are used as dependencies by other components and are shared as a result. These components are defined as lazy vals to make sure each component is created only once.
A developer can modify the builder by providing custom versions of components, or by using the hooks provided for the analyzer, optimizer and planner. There are some dependencies between the components (they are documented per dependency); a developer should respect these when making modifications in order to prevent initialization problems.
A parent SessionState can be used to initialize the new SessionState. The new session state will clone the parent session state's conf, functionRegistry, experimentalMethods and catalog fields. Note that the state is cloned when build is called, and not before.
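As an illustration of the hooks described above, a subclass might add a rule to the analyzer and provide the required newBuilder. This is a minimal sketch, not Spark source code: MyBuilder and MyResolutionRule are hypothetical names, and the rule body is a no-op placeholder.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
import org.apache.spark.sql.catalyst.rules.Rule
import org.apache.spark.sql.internal.{BaseSessionStateBuilder, SessionState}

// Hypothetical rule, shown only to illustrate the hook's signature.
object MyResolutionRule extends Rule[LogicalPlan] {
  override def apply(plan: LogicalPlan): LogicalPlan = plan // no-op placeholder
}

class MyBuilder(session: SparkSession, parentState: Option[SessionState] = None)
    extends BaseSessionStateBuilder(session, parentState) {

  // Hook into the analyzer instead of replacing it.
  override protected def customResolutionRules: Seq[Rule[LogicalPlan]] =
    super.customResolutionRules :+ MyResolutionRule

  // Needed so that SessionState's clone functionality can rebuild this builder.
  override protected def newBuilder: NewBuilder = new MyBuilder(_, _)
}
```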
- Annotations
- @Unstable()
- Inheritance
- BaseSessionStateBuilder
- AnyRef
- Any
Instance Constructors
- new BaseSessionStateBuilder(session: SparkSession, parentState: Option[SessionState] = None)
Type Members
- type NewBuilder = (SparkSession, Option[SessionState]) ⇒ BaseSessionStateBuilder
Abstract Value Members
- abstract def newBuilder: NewBuilder
Function that produces a new instance of the BaseSessionStateBuilder. This is used by the SessionState's clone functionality. Make sure to override this when implementing your own SessionStateBuilder.
- Attributes
- protected
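In a subclass (here the hypothetical MyBuilder from above), the override is typically a one-liner that partially applies the subclass constructor; this is a sketch, not a prescribed implementation:

```scala
override protected def newBuilder: NewBuilder = new MyBuilder(_, _)
```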
Concrete Value Members
- final def !=(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def ##(): Int
- Definition Classes
- AnyRef → Any
- final def ==(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- def analyzer: Analyzer
Logical query plan analyzer for resolving unresolved attributes and relations.
Note: this depends on the conf and catalog fields.
- Attributes
- protected
- final def asInstanceOf[T0]: T0
- Definition Classes
- Any
- def build(): SessionState
Build the SessionState.
- lazy val catalog: SessionCatalog
Catalog for managing table and database states. If there is a pre-existing catalog, the state of that catalog (temp tables & current database) will be copied into the new catalog.
Note: this depends on the conf, functionRegistry and sqlParser fields.
- Attributes
- protected
- lazy val catalogManager: CatalogManager
- Attributes
- protected
- def clone(): AnyRef
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws( ... ) @native()
- def columnarRules: Seq[ColumnarRule]
- Attributes
- protected
- lazy val conf: SQLConf
SQL-specific key-value configurations. These either get cloned from a pre-existing instance or are newly created. The conf is merged with its SparkConf only when there is no parent session.
- Attributes
- protected
- def createClone: (SparkSession, SessionState) ⇒ SessionState
Function used to make clones of the session state.
- Attributes
- protected
- def createQueryExecution: (LogicalPlan) ⇒ QueryExecution
Create a query execution object.
- Attributes
- protected
- def customCheckRules: Seq[(LogicalPlan) ⇒ Unit]
Custom check rules to add to the Analyzer. Prefer overriding this instead of creating your own Analyzer.
Note that this may NOT depend on the analyzer function.
- Attributes
- protected
- def customEarlyScanPushDownRules: Seq[Rule[LogicalPlan]]
Custom early scan push down rules to add to the Optimizer. Prefer overriding this instead of creating your own Optimizer.
Note that this may NOT depend on the optimizer function.
- Attributes
- protected
- def customOperatorOptimizationRules: Seq[Rule[LogicalPlan]]
Custom operator optimization rules to add to the Optimizer. Prefer overriding this instead of creating your own Optimizer.
Note that this may NOT depend on the optimizer function.
- Attributes
- protected
- def customPlanningStrategies: Seq[Strategy]
Custom strategies to add to the planner. Prefer overriding this instead of creating your own Planner.
Note that this may NOT depend on the planner function.
- Attributes
- protected
- def customPostHocResolutionRules: Seq[Rule[LogicalPlan]]
Custom post-resolution rules to add to the Analyzer. Prefer overriding this instead of creating your own Analyzer.
Note that this may NOT depend on the analyzer function.
- Attributes
- protected
- def customResolutionRules: Seq[Rule[LogicalPlan]]
Custom resolution rules to add to the Analyzer. Prefer overriding this instead of creating your own Analyzer.
Note that this may NOT depend on the analyzer function.
- Attributes
- protected
-
final
def
eq(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- def equals(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- lazy val experimentalMethods: ExperimentalMethods
Experimental methods that can be used to define custom optimization rules and custom planning strategies. This either gets cloned from a pre-existing version or is newly created.
- Attributes
- protected
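Because this field is shared with the session, custom optimization rules can also be injected at runtime through SparkSession.experimental, without subclassing the builder. A sketch, assuming an existing SparkSession named spark; the rule name is hypothetical and its body is a no-op placeholder:

```scala
import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
import org.apache.spark.sql.catalyst.rules.Rule

// Hypothetical rule used only for illustration.
object MyOptimization extends Rule[LogicalPlan] {
  override def apply(plan: LogicalPlan): LogicalPlan = plan // no-op placeholder
}

// Registers the rule with the shared ExperimentalMethods instance.
spark.experimental.extraOptimizations ++= Seq(MyOptimization)
```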
- def extensions: SparkSessionExtensions
Session extensions defined in the SparkSession.
- Attributes
- protected
- def finalize(): Unit
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws( classOf[java.lang.Throwable] )
- lazy val functionRegistry: FunctionRegistry
Internal catalog managing functions registered by the user. This either gets cloned from a pre-existing version or cloned from the built-in registry.
- Attributes
- protected
- final def getClass(): Class[_]
- Definition Classes
- AnyRef → Any
- Annotations
- @native()
- def hashCode(): Int
- Definition Classes
- AnyRef → Any
- Annotations
- @native()
- final def isInstanceOf[T0]: Boolean
- Definition Classes
- Any
- def listenerManager: ExecutionListenerManager
An interface to register custom org.apache.spark.sql.util.QueryExecutionListeners that listen for execution metrics. This gets cloned from the parent if available; otherwise a new instance is created.
- Attributes
- protected
- def mergeSparkConf(sqlConf: SQLConf, sparkConf: SparkConf): Unit
Extract entries from SparkConf and put them in the SQLConf.
- Attributes
- protected
- final def ne(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- final def notify(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
- final def notifyAll(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
- def optimizer: Optimizer
Logical query plan optimizer.
Note: this depends on the catalog and experimentalMethods fields.
- Attributes
- protected
- val parentState: Option[SessionState]
- def planner: SparkPlanner
Planner that converts optimized logical plans to physical plans.
Note: this depends on the conf and experimentalMethods fields.
- Attributes
- protected
- def queryStagePrepRules: Seq[Rule[SparkPlan]]
- Attributes
- protected
- lazy val resourceLoader: SessionResourceLoader
ResourceLoader that is used to load function resources and jars.
- Attributes
- protected
- val session: SparkSession
- lazy val sqlParser: ParserInterface
Parser that extracts expressions, plans, table identifiers etc. from SQL texts.
Note: this depends on the conf field.
- Attributes
- protected
- def streamingQueryManager: StreamingQueryManager
Interface to start and stop streaming queries.
- Attributes
- protected
- final def synchronized[T0](arg0: ⇒ T0): T0
- Definition Classes
- AnyRef
- def toString(): String
- Definition Classes
- AnyRef → Any
- def udfRegistration: UDFRegistration
Interface exposed to the user for registering user-defined functions.
Note 1: The user-defined functions must be deterministic.
Note 2: This depends on the functionRegistry field.
- Attributes
- protected
- lazy val v2SessionCatalog: V2SessionCatalog
- Attributes
- protected
- final def wait(): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws( ... )
- final def wait(arg0: Long, arg1: Int): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws( ... )
- final def wait(arg0: Long): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws( ... ) @native()