Packages

  • package sql

    Allows the execution of relational queries, including those expressed in SQL using Spark.

    Definition Classes
    spark
  • package execution

    The physical execution component of Spark SQL. Note that this is a private package. All classes in this package are considered an internal API to Spark SQL and are subject to change between minor releases.

    Definition Classes
    sql
  • package debug

    Contains methods for debugging query execution.

    Usage:

    import org.apache.spark.sql.execution.debug._
    sql("SELECT 1").debug()
    sql("SELECT 1").debugCodegen()

    or, for the streaming case (Structured Streaming):

    import org.apache.spark.sql.execution.debug._
    val query = df.writeStream.<...>.start()
    query.debugCodegen()

    Note that debug() is not supported in Structured Streaming, because it doesn't make sense for a streaming query to execute a batch once while the main query is running concurrently.

    Definition Classes
    execution
  • DebugExec
  • DebugQuery
  • DebugStreamQuery

case class DebugExec(child: SparkPlan) extends SparkPlan with UnaryExecNode with CodegenSupport with Product with Serializable
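
DebugExec wraps a single physical operator and records, as rows flow through it, how many tuples the operator produced (tupleCount) and which data types appeared in each output column (columnStats). It is normally not constructed by hand; the debug() helper rewrites an existing plan to insert it. A hedged sketch of that rewrite, which mirrors (but is not guaranteed to match) the internal implementation:

    import scala.collection.mutable
    import org.apache.spark.sql.catalyst.trees.TreeNodeRef
    import org.apache.spark.sql.execution.SparkPlan
    import org.apache.spark.sql.execution.debug.DebugExec

    // Wrap every node of a physical plan in DebugExec. The visited set keeps
    // transform from re-matching the children of freshly inserted wrappers,
    // which would otherwise wrap the same node over and over.
    def instrument(plan: SparkPlan): SparkPlan = {
      val visited = mutable.HashSet[TreeNodeRef]()
      plan transform {
        case s: SparkPlan if !visited.contains(new TreeNodeRef(s)) =>
          visited += new TreeNodeRef(s)
          DebugExec(s)
      }
    }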

Linear Supertypes
CodegenSupport, UnaryExecNode, UnaryLike[SparkPlan], SparkPlan, Serializable, Logging, QueryPlan[SparkPlan], SQLConfHelper, TreeNode[SparkPlan], TreePatternBits, Product, Equals, AnyRef, Any

Instance Constructors

  1. new DebugExec(child: SparkPlan)

Type Members

  1. case class ColumnMetrics() extends Product with Serializable

    A collection of metrics for each column of output.

  2. class SetAccumulator[T] extends AccumulatorV2[T, Set[T]]
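
    A minimal sketch of how an accumulator with this shape can satisfy the AccumulatorV2 contract (illustrative only; not DebugExec's actual implementation, which may differ in details such as thread safety):

    import org.apache.spark.util.AccumulatorV2

    class SetAccumulatorSketch[T] extends AccumulatorV2[T, Set[T]] {
      private var _set = Set.empty[T]
      // True when nothing has been accumulated yet.
      override def isZero: Boolean = _set.isEmpty
      override def copy(): AccumulatorV2[T, Set[T]] = {
        val acc = new SetAccumulatorSketch[T]
        acc._set = _set
        acc
      }
      override def reset(): Unit = _set = Set.empty[T]
      // Called on executors for each observed value.
      override def add(v: T): Unit = _set += v
      // Called when combining per-task results.
      override def merge(other: AccumulatorV2[T, Set[T]]): Unit = _set ++= other.value
      override def value: Set[T] = _set
    }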

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##: Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. lazy val allAttributes: AttributeSeq
    Definition Classes
    QueryPlan
  5. def apply(number: Int): TreeNode[_]
    Definition Classes
    TreeNode
  6. def argString(maxFields: Int): String
    Definition Classes
    TreeNode
  7. def asCode: String
    Definition Classes
    TreeNode
  8. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  9. def canCheckLimitNotReached: Boolean

    Check if the node is supposed to produce limit-not-reached checks.

    Attributes
    protected
    Definition Classes
    CodegenSupport
  10. final lazy val canonicalized: SparkPlan
    Definition Classes
    QueryPlan
    Annotations
    @transient()
  11. val child: SparkPlan
    Definition Classes
    DebugExec → UnaryLike
  12. final lazy val children: Seq[SparkPlan]
    Definition Classes
    UnaryLike
    Annotations
    @transient()
  13. def cleanupResources(): Unit

    Cleans up the resources used by the physical operator (if any). In general, all the resources should be cleaned up when the task finishes but operators like SortMergeJoinExec and LimitExec may want eager cleanup to free up tight resources (e.g., memory).

    Attributes
    protected[sql]
    Definition Classes
    SparkPlan
  14. def clone(): SparkPlan
    Definition Classes
    TreeNode → AnyRef
  15. def collect[B](pf: PartialFunction[SparkPlan, B]): Seq[B]
    Definition Classes
    TreeNode
  16. def collectFirst[B](pf: PartialFunction[SparkPlan, B]): Option[B]
    Definition Classes
    TreeNode
  17. def collectLeaves(): Seq[SparkPlan]
    Definition Classes
    TreeNode
  18. def collectWithSubqueries[B](f: PartialFunction[SparkPlan, B]): Seq[B]
    Definition Classes
    QueryPlan
  19. val columnStats: Array[ColumnMetrics]
  20. def conf: SQLConf
    Definition Classes
    SparkPlan → SQLConfHelper
  21. final def consume(ctx: CodegenContext, outputVars: Seq[ExprCode], row: String = null): String

    Consume the generated columns or row from the current SparkPlan and call its parent's doConsume().

    Note that outputVars and row can't both be null.

    Definition Classes
    CodegenSupport
  22. final def containsAllPatterns(patterns: TreePattern*): Boolean
    Definition Classes
    TreePatternBits
  23. final def containsAnyPattern(patterns: TreePattern*): Boolean
    Definition Classes
    TreePatternBits
  24. lazy val containsChild: Set[TreeNode[_]]
    Definition Classes
    TreeNode
  25. final def containsPattern(t: TreePattern): Boolean
    Definition Classes
    TreePatternBits
    Annotations
    @inline()
  26. def copyTagsFrom(other: SparkPlan): Unit
    Definition Classes
    TreeNode
  27. lazy val deterministic: Boolean
    Definition Classes
    QueryPlan
  28. def doCanonicalize(): SparkPlan
    Attributes
    protected
    Definition Classes
    QueryPlan
  29. def doConsume(ctx: CodegenContext, input: Seq[ExprCode], row: ExprCode): String

    Generate the Java source code to process the rows from the child SparkPlan. This should only be called from consume, and should be overridden by subclasses to support codegen.

    Note: The operator should not assume the existence of an outer processing loop, which it can jump from with "continue;"!

    For example, a filter could generate this:

      # code to evaluate the predicate expression, result is isNull1 and value2
      if (!isNull1 && value2) {
        # call consume(), which will call parent.doConsume()
      }

    Note: A plan can either consume the rows as UnsafeRow (row), or a list of variables (input). When consuming as a list of variables, the code to produce the input is already generated and CodegenContext.currentVars is already set. When consuming as UnsafeRow, implementations need to put row.code in the generated code and set CodegenContext.INPUT_ROW manually. Some plans may need more tweaks as they have different inputs (join build side, aggregate buffer, etc.), or other special cases.

    Definition Classes
    DebugExec → CodegenSupport
  30. def doExecute(): RDD[InternalRow]

    Produces the result of the query as an RDD[InternalRow].

    Overridden by concrete implementations of SparkPlan.

    Attributes
    protected
    Definition Classes
    DebugExec → SparkPlan
  31. def doExecuteBroadcast[T](): Broadcast[T]

    Produces the result of the query as a broadcast variable.

    Overridden by concrete implementations of SparkPlan.

    Definition Classes
    DebugExec → SparkPlan
  32. def doExecuteColumnar(): RDD[ColumnarBatch]

    Produces the result of the query as an RDD[ColumnarBatch] if supportsColumnar returns true. By convention the executor that creates a ColumnarBatch is responsible for closing it when it is no longer needed. This allows input formats to reuse batches if needed.

    Definition Classes
    DebugExec → SparkPlan
  33. def doPrepare(): Unit

    Overridden by concrete implementations of SparkPlan. It is guaranteed to run before any execute of SparkPlan. This is helpful if we want to set up some state before executing the query, e.g., BroadcastHashJoin uses it to broadcast asynchronously.

    Attributes
    protected
    Definition Classes
    SparkPlan
    Note

    prepare method has already walked down the tree, so the implementation doesn't have to call children's prepare methods. This will only be called once, protected by this.

  34. def doProduce(ctx: CodegenContext): String

    Generate the Java source code to process rows; should be overridden by subclasses to support codegen.

    doProduce() usually generates the framework; for example, aggregation could generate this:

      if (!initialized) {
        # create a hash map, then build the aggregation hash map
        # call child.produce()
        initialized = true;
      }
      while (hashmap.hasNext()) {
        row = hashmap.next();
        # build the aggregation results
        # create variables for results
        # call consume(), which will call parent.doConsume()
        if (shouldStop()) return;
      }

    Definition Classes
    DebugExec → CodegenSupport
  35. def dumpStats(): Unit
  36. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  37. def evaluateNondeterministicVariables(attributes: Seq[Attribute], variables: Seq[ExprCode], expressions: Seq[NamedExpression]): String

    Returns source code to evaluate the variables for non-deterministic expressions, and clear the code of evaluated variables, to prevent them from being evaluated twice.

    Attributes
    protected
    Definition Classes
    CodegenSupport
  38. def evaluateRequiredVariables(attributes: Seq[Attribute], variables: Seq[ExprCode], required: AttributeSet): String

    Returns source code to evaluate the variables for required attributes, and clear the code of evaluated variables, to prevent them from being evaluated twice.

    Attributes
    protected
    Definition Classes
    CodegenSupport
  39. def evaluateVariables(variables: Seq[ExprCode]): String

    Returns source code to evaluate all the variables, and clear their code, to prevent them from being evaluated twice.

    Attributes
    protected
    Definition Classes
    CodegenSupport
  40. final def execute(): RDD[InternalRow]

    Returns the result of this query as an RDD[InternalRow] by delegating to doExecute after preparations.

    Concrete implementations of SparkPlan should override doExecute.

    Definition Classes
    SparkPlan
  41. final def executeBroadcast[T](): Broadcast[T]

    Returns the result of this query as a broadcast variable by delegating to doExecuteBroadcast after preparations.

    Concrete implementations of SparkPlan should override doExecuteBroadcast.

    Definition Classes
    SparkPlan
  42. def executeCollect(): Array[InternalRow]

    Runs this query returning the result as an array.

    Definition Classes
    SparkPlan
  43. def executeCollectPublic(): Array[Row]

    Runs this query returning the result as an array, using external Row format.

    Definition Classes
    SparkPlan
  44. final def executeColumnar(): RDD[ColumnarBatch]

    Returns the result of this query as an RDD[ColumnarBatch] by delegating to doExecuteColumnar after preparations.

    Concrete implementations of SparkPlan should override doExecuteColumnar if supportsColumnar returns true.

    Definition Classes
    SparkPlan
  45. final def executeQuery[T](query: => T): T

    Executes a query after preparing the query and adding query plan information to created RDDs for visualization.

    Attributes
    protected
    Definition Classes
    SparkPlan
  46. def executeTail(n: Int): Array[InternalRow]

    Runs this query returning the last n rows as an array.

    This is modeled after RDD.take but never runs any job locally on the driver.

    Definition Classes
    SparkPlan
  47. def executeTake(n: Int): Array[InternalRow]

    Runs this query returning the first n rows as an array.

    This is modeled after RDD.take but never runs any job locally on the driver.

    Definition Classes
    SparkPlan
  48. def executeToIterator(): Iterator[InternalRow]

    Runs this query returning the result as an iterator of InternalRow.

    Definition Classes
    SparkPlan
    Note

    Triggers multiple jobs (one for each partition).

  49. def exists(f: (SparkPlan) => Boolean): Boolean
    Definition Classes
    TreeNode
  50. final def expressions: Seq[Expression]
    Definition Classes
    QueryPlan
  51. def fastEquals(other: TreeNode[_]): Boolean
    Definition Classes
    TreeNode
  52. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.Throwable])
  53. def find(f: (SparkPlan) => Boolean): Option[SparkPlan]
    Definition Classes
    TreeNode
  54. def flatMap[A](f: (SparkPlan) => TraversableOnce[A]): Seq[A]
    Definition Classes
    TreeNode
  55. def foreach(f: (SparkPlan) => Unit): Unit
    Definition Classes
    TreeNode
  56. def foreachUp(f: (SparkPlan) => Unit): Unit
    Definition Classes
    TreeNode
  57. def formattedNodeName: String
    Attributes
    protected
    Definition Classes
    QueryPlan
  58. def generateTreeString(depth: Int, lastChildren: Seq[Boolean], append: (String) => Unit, verbose: Boolean, prefix: String, addSuffix: Boolean, maxFields: Int, printNodeId: Boolean, indent: Int): Unit
    Definition Classes
    TreeNode
  59. final def getClass(): Class[_ <: AnyRef]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  60. def getDefaultTreePatternBits: BitSet
    Attributes
    protected
    Definition Classes
    TreeNode
  61. def getTagValue[T](tag: TreeNodeTag[T]): Option[T]
    Definition Classes
    TreeNode
  62. def hashCode(): Int
    Definition Classes
    TreeNode → AnyRef → Any
  63. val id: Int
    Definition Classes
    SparkPlan
  64. def initializeLogIfNecessary(isInterpreter: Boolean, silent: Boolean): Boolean
    Attributes
    protected
    Definition Classes
    Logging
  65. def initializeLogIfNecessary(isInterpreter: Boolean): Unit
    Attributes
    protected
    Definition Classes
    Logging
  66. def innerChildren: Seq[QueryPlan[_]]
    Definition Classes
    QueryPlan → TreeNode
  67. def inputRDDs(): Seq[RDD[InternalRow]]

    Returns all the RDDs of InternalRow that generate the input rows.

    Definition Classes
    DebugExec → CodegenSupport
    Note

    Right now we support up to two RDDs

  68. def inputSet: AttributeSet
    Definition Classes
    QueryPlan
  69. def isCanonicalizedPlan: Boolean
    Attributes
    protected
    Definition Classes
    QueryPlan
  70. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  71. def isRuleIneffective(ruleId: RuleId): Boolean
    Attributes
    protected
    Definition Classes
    TreeNode
  72. def isTraceEnabled(): Boolean
    Attributes
    protected
    Definition Classes
    Logging
  73. def jsonFields: List[JField]
    Attributes
    protected
    Definition Classes
    TreeNode
  74. final def legacyWithNewChildren(newChildren: Seq[SparkPlan]): SparkPlan
    Attributes
    protected
    Definition Classes
    TreeNode
  75. def limitNotReachedChecks: Seq[String]

    A sequence of checks which evaluate to true if the downstream Limit operators have not yet received enough records to reach the limit. If the current node is a data-producing node, it can leverage this information to stop producing data and complete the data flow earlier. Common data-producing nodes are leaf nodes like Range and Scan, and blocking nodes like Sort and Aggregate. These checks should be put into the loop condition of the data-producing loop.

    Definition Classes
    CodegenSupport
  76. final def limitNotReachedCond: String

    A helper method to generate the data producing loop condition according to the limit-not-reached checks.

    Definition Classes
    CodegenSupport
  77. def log: Logger
    Attributes
    protected
    Definition Classes
    Logging
  78. def logDebug(msg: => String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  79. def logDebug(msg: => String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  80. def logError(msg: => String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  81. def logError(msg: => String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  82. def logInfo(msg: => String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  83. def logInfo(msg: => String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  84. def logName: String
    Attributes
    protected
    Definition Classes
    Logging
  85. def logTrace(msg: => String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  86. def logTrace(msg: => String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  87. def logWarning(msg: => String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  88. def logWarning(msg: => String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  89. def logicalLink: Option[LogicalPlan]

    returns

    The logical plan this plan is linked to.

    Definition Classes
    SparkPlan
  90. def longMetric(name: String): SQLMetric

    returns

    SQLMetric for the name.

    Definition Classes
    SparkPlan
  91. def makeCopy(newArgs: Array[AnyRef]): SparkPlan

    Overridden makeCopy that also propagates sqlContext to the copied plan.

    Definition Classes
    SparkPlan → TreeNode
  92. def map[A](f: (SparkPlan) => A): Seq[A]
    Definition Classes
    TreeNode
  93. final def mapChildren(f: (SparkPlan) => SparkPlan): SparkPlan
    Definition Classes
    UnaryLike
  94. def mapExpressions(f: (Expression) => Expression): DebugExec.this.type
    Definition Classes
    QueryPlan
  95. def mapProductIterator[B](f: (Any) => B)(implicit arg0: ClassTag[B]): Array[B]
    Attributes
    protected
    Definition Classes
    TreeNode
  96. def markRuleAsIneffective(ruleId: RuleId): Unit
    Attributes
    protected
    Definition Classes
    TreeNode
  97. def metricTerm(ctx: CodegenContext, name: String): String

    Creates a metric using the specified name.

    returns

    name of the variable representing the metric

    Definition Classes
    CodegenSupport
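
    A hedged sketch of the typical usage pattern inside a codegen operator (the metric and variable names here are illustrative):

    // Inside doProduce()/doConsume(): resolve the generated-code variable that
    // refers to this operator's "numOutputRows" SQLMetric, then emit Java
    // source that bumps it once per produced row.
    val numOutput = metricTerm(ctx, "numOutputRows")
    val updateMetric = s"$numOutput.add(1);"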
  98. def metrics: Map[String, SQLMetric]

    returns

    All the metrics of this SparkPlan, keyed by metric name.

    Definition Classes
    SparkPlan
  99. final def missingInput: AttributeSet
    Definition Classes
    QueryPlan
  100. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  101. def needCopyResult: Boolean

    Whether or not the result rows of this operator should be copied before putting into a buffer.

    If any operator inside WholeStageCodegen generates multiple rows from a single row (for example, Join), this should be true.

    If an operator starts a new pipeline, this should be false.

    Definition Classes
    CodegenSupport
  102. def needStopCheck: Boolean

    Whether or not the children of this operator should generate a stop check when consuming input rows. This is used to suppress shouldStop() in a loop of WholeStageCodegen.

    This should be false if an operator starts a new pipeline, which means it consumes all rows produced by children but doesn't output rows to the buffer by calling append(), so the children don't require shouldStop() in their row-producing loop.

    Definition Classes
    CodegenSupport
  103. def nodeName: String
    Definition Classes
    TreeNode
  104. val nodePatterns: Seq[TreePattern]
    Attributes
    protected
    Definition Classes
    TreeNode
  105. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  106. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  107. val numColumns: Int
  108. def numberedTreeString: String
    Definition Classes
    TreeNode
  109. val origin: Origin
    Definition Classes
    TreeNode
  110. def otherCopyArgs: Seq[AnyRef]
    Attributes
    protected
    Definition Classes
    TreeNode
  111. def output: Seq[Attribute]
    Definition Classes
    DebugExec → QueryPlan
  112. def outputOrdering: Seq[SortOrder]

    Specifies how data is ordered in each partition.

    Definition Classes
    SparkPlan
  113. def outputPartitioning: Partitioning

    Specifies how data is partitioned across different nodes in the cluster. Note this method may fail if it is invoked before EnsureRequirements is applied, since PartitioningCollection requires all its partitionings to have the same number of partitions.

    Definition Classes
    DebugExec → SparkPlan
  114. lazy val outputSet: AttributeSet
    Definition Classes
    QueryPlan
    Annotations
    @transient()
  115. def p(number: Int): SparkPlan
    Definition Classes
    TreeNode
  116. val parent: CodegenSupport

    Which SparkPlan is calling produce() of this one. It's itself for the first SparkPlan.

    Attributes
    protected
    Definition Classes
    CodegenSupport
  117. final def prepare(): Unit

    Prepares this SparkPlan for execution. It's idempotent.

    Definition Classes
    SparkPlan
  118. def prepareSubqueries(): Unit

    Finds scalar subquery expressions in this plan node and starts evaluating them.

    Attributes
    protected
    Definition Classes
    SparkPlan
  119. def prettyJson: String
    Definition Classes
    TreeNode
  120. def printSchema(): Unit
    Definition Classes
    QueryPlan
  121. final def produce(ctx: CodegenContext, parent: CodegenSupport): String

    Returns Java source code to process the rows from the input RDD.

    Definition Classes
    CodegenSupport
  122. def producedAttributes: AttributeSet
    Definition Classes
    QueryPlan
  123. def productElementNames: Iterator[String]
    Definition Classes
    Product
  124. lazy val references: AttributeSet
    Definition Classes
    QueryPlan
    Annotations
    @transient()
  125. def requiredChildDistribution: Seq[Distribution]

    Specifies the data distribution requirements of all the children for this operator. By default it's UnspecifiedDistribution for each child, which means each child can have any distribution.

    If an operator overwrites this method and specifies distribution requirements (excluding UnspecifiedDistribution and BroadcastDistribution) for more than one child, Spark guarantees that the outputs of these children will have the same number of partitions, so that the operator can safely zip partitions of these children's result RDDs. Some operators can leverage this guarantee to satisfy interesting requirements, e.g., a non-broadcast join can specify HashClusteredDistribution(a,b) for its left child and HashClusteredDistribution(c,d) for its right child; it is then guaranteed that the left and right children are co-partitioned by a,b / c,d, meaning tuples with the same values are in partitions with the same index, e.g., (a=1,b=2) and (c=1,d=2) are both in the second partition of the left and right children.

    Definition Classes
    SparkPlan
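
    For illustration, a hedged sketch of a hypothetical join-like operator that requires both children to be clustered on its keys (ClusteredDistribution is used here for portability; the operator, its fields, and the trivial doExecute are illustrative, not a real Spark operator):

    import org.apache.spark.rdd.RDD
    import org.apache.spark.sql.catalyst.InternalRow
    import org.apache.spark.sql.catalyst.expressions.{Attribute, Expression}
    import org.apache.spark.sql.catalyst.plans.physical.{ClusteredDistribution, Distribution}
    import org.apache.spark.sql.execution.{BinaryExecNode, SparkPlan}

    case class MyJoinExec(
        leftKeys: Seq[Expression],
        rightKeys: Seq[Expression],
        left: SparkPlan,
        right: SparkPlan) extends BinaryExecNode {

      // Requiring a distribution for both children makes Spark co-partition
      // them, so their result RDDs have the same number of partitions.
      override def requiredChildDistribution: Seq[Distribution] =
        ClusteredDistribution(leftKeys) :: ClusteredDistribution(rightKeys) :: Nil

      override def output: Seq[Attribute] = left.output ++ right.output

      // A real operator would join the co-partitioned iterators here; this
      // placeholder just concatenates them to keep the sketch compilable.
      override protected def doExecute(): RDD[InternalRow] =
        left.execute().zipPartitions(right.execute()) { (l, r) => l ++ r }

      override protected def withNewChildrenInternal(
          newLeft: SparkPlan, newRight: SparkPlan): MyJoinExec =
        copy(left = newLeft, right = newRight)
    }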
  126. def requiredChildOrdering: Seq[Seq[SortOrder]]

    Specifies the required sort order, per partition, of the input data for this operator.

    Definition Classes
    SparkPlan
  127. def resetMetrics(): Unit

    Resets all the metrics.

    Definition Classes
    SparkPlan
  128. def rewriteAttrs(attrMap: AttributeMap[Attribute]): SparkPlan
    Definition Classes
    QueryPlan
  129. final def sameResult(other: SparkPlan): Boolean
    Definition Classes
    QueryPlan
  130. lazy val schema: StructType
    Definition Classes
    QueryPlan
  131. def schemaString: String
    Definition Classes
    QueryPlan
  132. final def semanticHash(): Int
    Definition Classes
    QueryPlan
  133. final val session: SparkSession
    Definition Classes
    SparkPlan
  134. def setLogicalLink(logicalPlan: LogicalPlan): Unit

    Set logical plan link recursively if unset.

    Definition Classes
    SparkPlan
  135. def setTagValue[T](tag: TreeNodeTag[T], value: T): Unit
    Definition Classes
    TreeNode
  136. def shouldStopCheckCode: String

    Helper default shouldStop() check code.

    Definition Classes
    CodegenSupport
  137. def simpleString(maxFields: Int): String
    Definition Classes
    QueryPlan → TreeNode
  138. def simpleStringWithNodeId(): String
    Definition Classes
    QueryPlan → TreeNode
  139. def sparkContext: SparkContext
    Attributes
    protected
    Definition Classes
    SparkPlan
  140. def statePrefix: String
    Attributes
    protected
    Definition Classes
    QueryPlan
  141. def stringArgs: Iterator[Any]
    Attributes
    protected
    Definition Classes
    TreeNode
  142. lazy val subqueries: Seq[SparkPlan]
    Definition Classes
    QueryPlan
    Annotations
    @transient()
  143. def subqueriesAll: Seq[SparkPlan]
    Definition Classes
    QueryPlan
  144. def supportCodegen: Boolean

    Whether this SparkPlan supports whole stage codegen or not.

    Definition Classes
    CodegenSupport
  145. def supportsColumnar: Boolean

    Return true if this stage of the plan supports columnar execution. A plan can also support row-based execution (see supportsRowBased). Spark will decide which execution to call during query planning.

    Definition Classes
    DebugExec → SparkPlan
  146. def supportsRowBased: Boolean

    Return true if this stage of the plan supports row-based execution. A plan can also support columnar execution (see supportsColumnar). Spark will decide which execution to call during query planning.

    Definition Classes
    SparkPlan
  147. final def synchronized[T0](arg0: => T0): T0
    Definition Classes
    AnyRef
  148. def toJSON: String
    Definition Classes
    TreeNode
  149. def toRowBased: SparkPlan

    Converts the output of this plan to row-based if it is a columnar plan.

    Definition Classes
    SparkPlan
  150. def toString(): String
    Definition Classes
    TreeNode → AnyRef → Any
  151. def transform(rule: PartialFunction[SparkPlan, SparkPlan]): SparkPlan
    Definition Classes
    TreeNode
  152. def transformAllExpressions(rule: PartialFunction[Expression, Expression]): DebugExec.this.type
    Definition Classes
    QueryPlan
  153. def transformAllExpressionsWithPruning(cond: (TreePatternBits) => Boolean, ruleId: RuleId)(rule: PartialFunction[Expression, Expression]): DebugExec.this.type
    Definition Classes
    QueryPlan
  154. def transformDown(rule: PartialFunction[SparkPlan, SparkPlan]): SparkPlan
    Definition Classes
    TreeNode
  155. def transformDownWithPruning(cond: (TreePatternBits) => Boolean, ruleId: RuleId)(rule: PartialFunction[SparkPlan, SparkPlan]): SparkPlan
    Definition Classes
    TreeNode
  156. def transformDownWithSubqueries(f: PartialFunction[SparkPlan, SparkPlan]): SparkPlan
    Definition Classes
    QueryPlan
  157. def transformDownWithSubqueriesAndPruning(cond: (TreePatternBits) => Boolean, ruleId: RuleId)(f: PartialFunction[SparkPlan, SparkPlan]): SparkPlan
    Definition Classes
    QueryPlan
  158. def transformExpressions(rule: PartialFunction[Expression, Expression]): DebugExec.this.type
    Definition Classes
    QueryPlan
  159. def transformExpressionsDown(rule: PartialFunction[Expression, Expression]): DebugExec.this.type
    Definition Classes
    QueryPlan
  160. def transformExpressionsDownWithPruning(cond: (TreePatternBits) => Boolean, ruleId: RuleId)(rule: PartialFunction[Expression, Expression]): DebugExec.this.type
    Definition Classes
    QueryPlan
  161. def transformExpressionsUp(rule: PartialFunction[Expression, Expression]): DebugExec.this.type
    Definition Classes
    QueryPlan
  162. def transformExpressionsUpWithPruning(cond: (TreePatternBits) => Boolean, ruleId: RuleId)(rule: PartialFunction[Expression, Expression]): DebugExec.this.type
    Definition Classes
    QueryPlan
  163. def transformExpressionsWithPruning(cond: (TreePatternBits) => Boolean, ruleId: RuleId)(rule: PartialFunction[Expression, Expression]): DebugExec.this.type
    Definition Classes
    QueryPlan
  164. def transformUp(rule: PartialFunction[SparkPlan, SparkPlan]): SparkPlan
    Definition Classes
    TreeNode
  165. def transformUpWithBeforeAndAfterRuleOnChildren(cond: (SparkPlan) => Boolean, ruleId: RuleId)(rule: PartialFunction[(SparkPlan, SparkPlan), SparkPlan]): SparkPlan
    Definition Classes
    TreeNode
  166. def transformUpWithNewOutput(rule: PartialFunction[SparkPlan, (SparkPlan, Seq[(Attribute, Attribute)])], skipCond: (SparkPlan) => Boolean, canGetOutput: (SparkPlan) => Boolean): SparkPlan
    Definition Classes
    QueryPlan
  167. def transformUpWithPruning(cond: (TreePatternBits) => Boolean, ruleId: RuleId)(rule: PartialFunction[SparkPlan, SparkPlan]): SparkPlan
    Definition Classes
    TreeNode
  168. def transformUpWithSubqueries(f: PartialFunction[SparkPlan, SparkPlan]): SparkPlan
    Definition Classes
    QueryPlan
  169. def transformWithPruning(cond: (TreePatternBits) => Boolean, ruleId: RuleId)(rule: PartialFunction[SparkPlan, SparkPlan]): SparkPlan
    Definition Classes
    TreeNode
  170. def transformWithSubqueries(f: PartialFunction[SparkPlan, SparkPlan]): SparkPlan
    Definition Classes
    QueryPlan
  171. lazy val treePatternBits: BitSet
    Definition Classes
    QueryPlan → TreeNode → TreePatternBits
  172. def treeString(append: (String) => Unit, verbose: Boolean, addSuffix: Boolean, maxFields: Int, printOperatorId: Boolean): Unit
    Definition Classes
    TreeNode
  173. final def treeString(verbose: Boolean, addSuffix: Boolean, maxFields: Int, printOperatorId: Boolean): String
    Definition Classes
    TreeNode
  174. final def treeString: String
    Definition Classes
    TreeNode
  175. val tupleCount: LongAccumulator
  176. def unsetTagValue[T](tag: TreeNodeTag[T]): Unit
    Definition Classes
    TreeNode
  177. def updateOuterReferencesInSubquery(plan: SparkPlan, attrMap: AttributeMap[Attribute]): SparkPlan
    Attributes
    protected
    Definition Classes
    QueryPlan
  178. def usedInputs: AttributeSet

    The subset of inputSet that should be evaluated before this plan.

    We will use this to insert some code to access those columns that are actually used by the current plan before calling doConsume().

    Definition Classes
    CodegenSupport
  179. def vectorTypes: Option[Seq[String]]

    The exact Java types of the columns that are output in columnar processing mode. This is a performance optimization for code generation and is optional.

    Definition Classes
    SparkPlan
  180. def verboseString(maxFields: Int): String
    Definition Classes
    QueryPlan → TreeNode
  181. def verboseStringWithOperatorId(): String
    Definition Classes
    UnaryExecNode → QueryPlan
  182. def verboseStringWithSuffix(maxFields: Int): String
    Definition Classes
    TreeNode
  183. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException])
  184. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException])
  185. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException]) @native()
  186. def waitForSubqueries(): Unit

    Blocks the thread until all subqueries finish evaluation and update the results.

    Attributes
    protected
    Definition Classes
    SparkPlan
  187. def withNewChildInternal(newChild: SparkPlan): DebugExec
    Attributes
    protected
    Definition Classes
    DebugExec → UnaryLike
  188. final def withNewChildren(newChildren: Seq[SparkPlan]): SparkPlan
    Definition Classes
    TreeNode
  189. final def withNewChildrenInternal(newChildren: IndexedSeq[SparkPlan]): SparkPlan
    Definition Classes
    UnaryLike
