class SparkOptimizer extends Optimizer
Inheritance Hierarchy
- SparkOptimizer
- Optimizer
- RuleExecutor
- Logging
- AnyRef
- Any
Instance Constructors
- new SparkOptimizer(catalogManager: CatalogManager, catalog: SessionCatalog, experimentalMethods: ExperimentalMethods)
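This constructor is normally invoked by the session's internal state builder rather than by user code; the easiest way to observe the optimizer at work is through a query's QueryExecution. A minimal sketch, assuming a local SparkSession (the app name and the query are illustrative):

```scala
import org.apache.spark.sql.SparkSession

// Sketch: SparkOptimizer is built internally by the session; its output is
// visible as the optimized logical plan of any query.
val spark = SparkSession.builder()
  .master("local[*]")
  .appName("spark-optimizer-demo")   // hypothetical app name
  .getOrCreate()

val df = spark.range(0, 100).filter("id > 10").select("id")

// Plan after the analyzer, before optimization:
println(df.queryExecution.analyzed)

// Plan after SparkOptimizer's batches have been applied:
println(df.queryExecution.optimizedPlan)
```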
Type Members
Value Members
- final def !=(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def ##: Int
- Definition Classes
- AnyRef → Any
- final def ==(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def asInstanceOf[T0]: T0
- Definition Classes
- Any
- final def batches: Seq[Batch]
- Definition Classes
- Optimizer → RuleExecutor
- def clone(): AnyRef
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.CloneNotSupportedException]) @native()
- def defaultBatches: Seq[Batch]
- Definition Classes
- SparkOptimizer → Optimizer
- def earlyScanPushDownRules: Seq[Rule[LogicalPlan]]
- Definition Classes
- SparkOptimizer → Optimizer
- final def eq(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- def equals(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef → Any
- val excludedOnceBatches: Set[String]
- Attributes
- protected
- Definition Classes
- Optimizer → RuleExecutor
- def execute(plan: LogicalPlan): LogicalPlan
- Definition Classes
- RuleExecutor
- def executeAndTrack(plan: LogicalPlan, tracker: QueryPlanningTracker): LogicalPlan
- Definition Classes
- RuleExecutor
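A hedged sketch of driving the executor directly: it assumes a SparkOptimizer instance and an analyzed LogicalPlan are already in scope (obtaining them usually goes through internal session state, which is private[sql]). executeAndTrack behaves like execute but additionally records per-rule statistics in the supplied tracker.

```scala
import org.apache.spark.sql.catalyst.QueryPlanningTracker
import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
import org.apache.spark.sql.execution.SparkOptimizer

// Assumptions: `optimizer` and `analyzedPlan` come from internal session state.
def runOptimizer(optimizer: SparkOptimizer, analyzedPlan: LogicalPlan): LogicalPlan = {
  // execute(plan) would apply every batch according to its strategy;
  // executeAndTrack does the same while recording rule timing/effectiveness.
  val tracker = new QueryPlanningTracker()
  val optimized = optimizer.executeAndTrack(analyzedPlan, tracker)
  println(s"rules recorded: ${tracker.rules.size}")
  optimized
}
```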
- def extendedOperatorOptimizationRules: Seq[Rule[LogicalPlan]]
- Definition Classes
- Optimizer
- def finalize(): Unit
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.Throwable])
- def fixedPoint: FixedPoint
- Attributes
- protected
- Definition Classes
- Optimizer
- final def getClass(): Class[_ <: AnyRef]
- Definition Classes
- AnyRef → Any
- Annotations
- @native()
- def hashCode(): Int
- Definition Classes
- AnyRef → Any
- Annotations
- @native()
- def initializeLogIfNecessary(isInterpreter: Boolean, silent: Boolean): Boolean
- Attributes
- protected
- Definition Classes
- Logging
- def initializeLogIfNecessary(isInterpreter: Boolean): Unit
- Attributes
- protected
- Definition Classes
- Logging
- final def isInstanceOf[T0]: Boolean
- Definition Classes
- Any
- def isPlanIntegral(previousPlan: LogicalPlan, currentPlan: LogicalPlan): Boolean
- Attributes
- protected
- Definition Classes
- Optimizer → RuleExecutor
- def isTraceEnabled(): Boolean
- Attributes
- protected
- Definition Classes
- Logging
- def log: Logger
- Attributes
- protected
- Definition Classes
- Logging
- def logDebug(msg: => String, throwable: Throwable): Unit
- Attributes
- protected
- Definition Classes
- Logging
- def logDebug(msg: => String): Unit
- Attributes
- protected
- Definition Classes
- Logging
- def logError(msg: => String, throwable: Throwable): Unit
- Attributes
- protected
- Definition Classes
- Logging
- def logError(msg: => String): Unit
- Attributes
- protected
- Definition Classes
- Logging
- def logInfo(msg: => String, throwable: Throwable): Unit
- Attributes
- protected
- Definition Classes
- Logging
- def logInfo(msg: => String): Unit
- Attributes
- protected
- Definition Classes
- Logging
- def logName: String
- Attributes
- protected
- Definition Classes
- Logging
- def logTrace(msg: => String, throwable: Throwable): Unit
- Attributes
- protected
- Definition Classes
- Logging
- def logTrace(msg: => String): Unit
- Attributes
- protected
- Definition Classes
- Logging
- def logWarning(msg: => String, throwable: Throwable): Unit
- Attributes
- protected
- Definition Classes
- Logging
- def logWarning(msg: => String): Unit
- Attributes
- protected
- Definition Classes
- Logging
- final def ne(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- def nonExcludableRules: Seq[String]
- Definition Classes
- SparkOptimizer → Optimizer
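Rules listed by nonExcludableRules are always kept. Other optimizer rules can be switched off per session through the spark.sql.optimizer.excludedRules configuration. A small sketch, assuming a SparkSession named `spark` is in scope; the excluded rule is only an example:

```scala
// Sketch: excluding an optional optimizer rule by its fully qualified name.
// Names returned by nonExcludableRules are ignored by this setting, i.e. the
// corresponding rules keep running.
spark.conf.set(
  "spark.sql.optimizer.excludedRules",
  "org.apache.spark.sql.catalyst.optimizer.ConstantFolding")
```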
- final def notify(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
- final def notifyAll(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
- def postHocOptimizationBatches: Seq[Batch]
Optimization batches that are executed after the regular optimization batches, but before the batch executing the ExperimentalMethods optimizer rules. This hook can be used to add custom optimizer batches to the Spark optimizer.
Note that the 'Extract Python UDFs' batch is an exception and is run after the batches defined here.
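The ExperimentalMethods rules mentioned above are the ones registered through SparkSession.experimental. A hedged sketch of adding one, assuming a SparkSession named `spark` is in scope (the rule itself is a deliberate no-op used only for illustration):

```scala
import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
import org.apache.spark.sql.catalyst.rules.Rule

// A deliberately trivial rule, for illustration only.
object NoopRule extends Rule[LogicalPlan] {
  override def apply(plan: LogicalPlan): LogicalPlan = plan
}

// Rules registered this way run in the user-provided-optimizer batch,
// i.e. after the batches returned by postHocOptimizationBatches.
spark.experimental.extraOptimizations ++= Seq(NoopRule)
```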
- def preCBORules: Seq[Rule[LogicalPlan]]
- Definition Classes
- SparkOptimizer → Optimizer
- def preOptimizationBatches: Seq[Batch]
Optimization batches that are executed before the regular optimization batches (also before the finish analysis batch).
- final def synchronized[T0](arg0: => T0): T0
- Definition Classes
- AnyRef
- def toString(): String
- Definition Classes
- AnyRef → Any
- final def wait(): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
- final def wait(arg0: Long, arg1: Int): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
- final def wait(arg0: Long): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException]) @native()
- case object Once extends Strategy with Product with Serializable
- Definition Classes
- RuleExecutor
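Once and its FixedPoint counterpart are the two batch strategies a RuleExecutor understands: a Once batch runs a single pass, while a FixedPoint batch repeats until the plan stops changing or the iteration limit is hit. A minimal sketch of a standalone executor using both (MyExecutor and NoopRule are hypothetical names):

```scala
import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
import org.apache.spark.sql.catalyst.rules.{Rule, RuleExecutor}

// A deliberately trivial rule, for illustration only.
object NoopRule extends Rule[LogicalPlan] {
  override def apply(plan: LogicalPlan): LogicalPlan = plan
}

// Sketch: declaring batches with the Once and FixedPoint strategies.
object MyExecutor extends RuleExecutor[LogicalPlan] {
  override def batches: Seq[Batch] = Seq(
    Batch("Run exactly once", Once, NoopRule),
    Batch("Run to fixed point", FixedPoint(100), NoopRule))
}
```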