org.apache.spark.sql.execution.aggregate

HashAggregateExec

Related Docs: object HashAggregateExec | package aggregate

Permalink

case class HashAggregateExec(requiredChildDistributionExpressions: Option[Seq[Expression]], groupingExpressions: Seq[NamedExpression], aggregateExpressions: Seq[AggregateExpression], aggregateAttributes: Seq[Attribute], initialInputBufferOffset: Int, __resultExpressions: Seq[NamedExpression], child: SparkPlan) extends SparkPlan with UnaryExecNode with CodegenSupport with Product with Serializable

Hash-based aggregate operator that can also fall back to sort-based aggregation when the data exceeds the available memory.
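
For example, the following grouped aggregation is planned with HashAggregateExec nodes, which explain() renders as HashAggregate. This is a minimal sketch that assumes a Spark 2.x SparkSession named spark; column IDs in the output are elided and the exact formatting varies by version.

    import org.apache.spark.sql.functions._

    val df = spark.range(0, 1000000)
      .selectExpr("id % 100 AS k", "id AS v")
      .groupBy("k")
      .agg(sum("v"), avg("v"))

    df.explain()
    // == Physical Plan ==
    // *HashAggregate(keys=[k#..], functions=[sum(v#..), avg(v#..)])
    // +- Exchange hashpartitioning(k#.., 200)
    //    +- *HashAggregate(keys=[k#..], functions=[partial_sum(v#..), partial_avg(v#..)])
    //       +- *Project [(id#.. % 100) AS k#.., id#.. AS v#..]
    //          +- *Range (0, 1000000, splits=..)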

Linear Supertypes
CodegenSupport, UnaryExecNode, SparkPlan, Serializable, Serializable, Logging, QueryPlan[SparkPlan], TreeNode[SparkPlan], Product, Equals, AnyRef, Any

Instance Constructors

  1. new HashAggregateExec(requiredChildDistributionExpressions: Option[Seq[Expression]], groupingExpressions: Seq[NamedExpression], aggregateExpressions: Seq[AggregateExpression], aggregateAttributes: Seq[Attribute], initialInputBufferOffset: Int, __resultExpressions: Seq[NamedExpression], child: SparkPlan)

    Permalink

Value Members

  1. final def !=(arg0: Any): Boolean

    Permalink
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Permalink
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Permalink
    Definition Classes
    AnyRef → Any
  4. val __resultExpressions: Seq[NamedExpression]

    Permalink
  5. val aggregateAttributes: Seq[Attribute]

    Permalink
  6. val aggregateExpressions: Seq[AggregateExpression]

    Permalink
  7. lazy val allAttributes: AttributeSeq

    Permalink
    Definition Classes
    HashAggregateExec → QueryPlan
  8. def apply(number: Int): SparkPlan

    Permalink
    Definition Classes
    TreeNode
  9. def argString: String

    Permalink
    Definition Classes
    TreeNode
  10. def asCode: String

    Permalink
    Definition Classes
    TreeNode
  11. final def asInstanceOf[T0]: T0

    Permalink
    Definition Classes
    Any
  12. lazy val canonicalized: SparkPlan

    Permalink
    Attributes
    protected
    Definition Classes
    QueryPlan
  13. val child: SparkPlan

    Permalink
    Definition Classes
    HashAggregateExec → UnaryExecNode
  14. def children: Seq[SparkPlan]

    Permalink
    Definition Classes
    UnaryExecNode → TreeNode
  15. lazy val cleanArgs: Seq[Any]

    Permalink
    Attributes
    protected
    Definition Classes
    QueryPlan
  16. def clone(): AnyRef

    Permalink
    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  17. def collect[B](pf: PartialFunction[SparkPlan, B]): Seq[B]

    Permalink
    Definition Classes
    TreeNode
  18. def collectFirst[B](pf: PartialFunction[SparkPlan, B]): Option[B]

    Permalink
    Definition Classes
    TreeNode
  19. lazy val constraints: ExpressionSet

    Permalink
    Definition Classes
    QueryPlan
  20. final def consume(ctx: CodegenContext, outputVars: Seq[ExprCode], row: String = null): String

    Permalink

    Consume the generated columns or row from the current SparkPlan and call its parent's doConsume(). (A simplified sketch of this produce/consume handshake appears after the member list below.)

    Definition Classes
    CodegenSupport
  21. lazy val containsChild: Set[TreeNode[_]]

    Permalink
    Definition Classes
    TreeNode
  22. def createHashMap(): UnsafeFixedWidthAggregationMap

    Permalink

    This is called by the generated Java class, so it should be public.

  23. def createUnsafeJoiner(): UnsafeRowJoiner

    Permalink

    This is called by the generated Java class, so it should be public.

  24. def doConsume(ctx: CodegenContext, input: Seq[ExprCode], row: ExprCode): String

    Permalink

    Generate the Java source code to process the rows from the child SparkPlan.

    This should be overridden by subclasses to support codegen.

    For example, Filter will generate code like this:

      # code to evaluate the predicate expression, result is isNull1 and value2
      if (isNull1 || !value2) continue;
      # call consume(), which will call parent.doConsume()

    Note: A plan can either consume the rows as UnsafeRow (row), or a list of variables (input).

    Definition Classes
    HashAggregateExec → CodegenSupport
  25. def doExecute(): RDD[InternalRow]

    Permalink

    Overridden by concrete implementations of SparkPlan. Produces the result of the query as an RDD[InternalRow].

    Attributes
    protected
    Definition Classes
    HashAggregateExec → SparkPlan
  26. def doExecuteBroadcast[T](): Broadcast[T]

    Permalink

    Overridden by concrete implementations of SparkPlan. Produces the result of the query as a broadcast variable.

    Attributes
    protected[org.apache.spark.sql]
    Definition Classes
    SparkPlan
  27. def doPrepare(): Unit

    Permalink

    Overridden by concrete implementations of SparkPlan. It is guaranteed to run before any execute of SparkPlan. This is helpful if we want to set up some state before executing the query, e.g., BroadcastHashJoin uses it to broadcast asynchronously.

    Note: the prepare method has already walked down the tree, so the implementation doesn't need to call children's prepare methods.

    This will only be called once, protected by this.

    Attributes
    protected
    Definition Classes
    SparkPlan
  28. def doProduce(ctx: CodegenContext): String

    Permalink

    Generate the Java source code to process rows; this should be overridden by subclasses to support codegen.

    doProduce() usually generates the framework; for example, aggregation could generate this:

      if (!initialized) {
        # create a hash map, then build the aggregation hash map
        # call child.produce()
        initialized = true;
      }
      while (hashmap.hasNext()) {
        row = hashmap.next();
        # build the aggregation results
        # create variables for results
        # call consume(), which will call parent.doConsume()
        if (shouldStop()) return;
      }

    Attributes
    protected
    Definition Classes
    HashAggregateExec → CodegenSupport
  29. final def eq(arg0: AnyRef): Boolean

    Permalink
    Definition Classes
    AnyRef
  30. def evaluateRequiredVariables(attributes: Seq[Attribute], variables: Seq[ExprCode], required: AttributeSet): String

    Permalink

    Returns source code to evaluate the variables for the required attributes, and clears the code of the evaluated variables to prevent them from being evaluated twice.

    Attributes
    protected
    Definition Classes
    CodegenSupport
  31. def evaluateVariables(variables: Seq[ExprCode]): String

    Permalink

    Returns source code to evaluate all the variables, and clears their code to prevent them from being evaluated twice.

    Attributes
    protected
    Definition Classes
    CodegenSupport
  32. final def execute(): RDD[InternalRow]

    Permalink

    Returns the result of this query as an RDD[InternalRow] by delegating to doExecute after preparations.

    Concrete implementations of SparkPlan should override doExecute.

    Definition Classes
    SparkPlan
  33. final def executeBroadcast[T](): Broadcast[T]

    Permalink

    Returns the result of this query as a broadcast variable by delegating to doExecuteBroadcast after preparations.

    Concrete implementations of SparkPlan should override doExecuteBroadcast.

    Definition Classes
    SparkPlan
  34. def executeCollect(): Array[InternalRow]

    Permalink

    Runs this query returning the result as an array.

    Definition Classes
    SparkPlan
  35. def executeCollectPublic(): Array[Row]

    Permalink

    Runs this query returning the result as an array, using external Row format.

    Definition Classes
    SparkPlan
  36. final def executeQuery[T](query: ⇒ T): T

    Permalink

    Execute a query after preparing the query and adding query plan information to created RDDs for visualization.

    Attributes
    protected
    Definition Classes
    SparkPlan
  37. def executeTake(n: Int): Array[InternalRow]

    Permalink

    Runs this query returning the first n rows as an array.

    This is modeled after RDD.take but never runs any job locally on the driver.

    Definition Classes
    SparkPlan
  38. def executeToIterator(): Iterator[InternalRow]

    Permalink

    Runs this query returning the result as an iterator of InternalRow.

    Note: this will trigger multiple jobs (one for each partition).

    Definition Classes
    SparkPlan
  39. final def expressions: Seq[Expression]

    Permalink
    Definition Classes
    QueryPlan
  40. def fastEquals(other: TreeNode[_]): Boolean

    Permalink
    Definition Classes
    TreeNode
  41. def finalize(): Unit

    Permalink
    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  42. def find(f: (SparkPlan) ⇒ Boolean): Option[SparkPlan]

    Permalink
    Definition Classes
    TreeNode
  43. def finishAggregate(hashMap: UnsafeFixedWidthAggregationMap, sorter: UnsafeKVExternalSorter, peakMemory: SQLMetric, spillSize: SQLMetric): KVIterator[UnsafeRow, UnsafeRow]

    Permalink

    Called by the generated Java class to finish the aggregate and return a KVIterator.

  44. def flatMap[A](f: (SparkPlan) ⇒ TraversableOnce[A]): Seq[A]

    Permalink
    Definition Classes
    TreeNode
  45. def foreach(f: (SparkPlan) ⇒ Unit): Unit

    Permalink
    Definition Classes
    TreeNode
  46. def foreachUp(f: (SparkPlan) ⇒ Unit): Unit

    Permalink
    Definition Classes
    TreeNode
  47. def generateTreeString(depth: Int, lastChildren: Seq[Boolean], builder: StringBuilder, verbose: Boolean, prefix: String): StringBuilder

    Permalink
    Definition Classes
    TreeNode
  48. final def getClass(): Class[_]

    Permalink
    Definition Classes
    AnyRef → Any
  49. def getNodeNumbered(number: MutableInt): SparkPlan

    Permalink
    Attributes
    protected
    Definition Classes
    TreeNode
  50. def getRelevantConstraints(constraints: Set[Expression]): Set[Expression]

    Permalink
    Attributes
    protected
    Definition Classes
    QueryPlan
  51. val groupingExpressions: Seq[NamedExpression]

    Permalink
  52. def hashCode(): Int

    Permalink
    Definition Classes
    TreeNode → AnyRef → Any
  53. val initialInputBufferOffset: Int

    Permalink
  54. def initializeLogIfNecessary(isInterpreter: Boolean): Unit

    Permalink
    Attributes
    protected
    Definition Classes
    Logging
  55. def innerChildren: Seq[QueryPlan[_]]

    Permalink
    Attributes
    protected
    Definition Classes
    QueryPlan → TreeNode
  56. def inputRDDs(): Seq[RDD[InternalRow]]

    Permalink

    Returns all the RDDs of InternalRow which generate the input rows.

    Note: right now we support up to two RDDs.

    Definition Classes
    HashAggregateExec → CodegenSupport
  57. def inputSet: AttributeSet

    Permalink
    Definition Classes
    QueryPlan
  58. final def isInstanceOf[T0]: Boolean

    Permalink
    Definition Classes
    Any
  59. def isTraceEnabled(): Boolean

    Permalink
    Attributes
    protected
    Definition Classes
    Logging
  60. def jsonFields: List[JField]

    Permalink
    Attributes
    protected
    Definition Classes
    TreeNode
  61. def log: Logger

    Permalink
    Attributes
    protected
    Definition Classes
    Logging
  62. def logDebug(msg: ⇒ String, throwable: Throwable): Unit

    Permalink
    Attributes
    protected
    Definition Classes
    Logging
  63. def logDebug(msg: ⇒ String): Unit

    Permalink
    Attributes
    protected
    Definition Classes
    Logging
  64. def logError(msg: ⇒ String, throwable: Throwable): Unit

    Permalink
    Attributes
    protected
    Definition Classes
    Logging
  65. def logError(msg: ⇒ String): Unit

    Permalink
    Attributes
    protected
    Definition Classes
    Logging
  66. def logInfo(msg: ⇒ String, throwable: Throwable): Unit

    Permalink
    Attributes
    protected
    Definition Classes
    Logging
  67. def logInfo(msg: ⇒ String): Unit

    Permalink
    Attributes
    protected
    Definition Classes
    Logging
  68. def logName: String

    Permalink
    Attributes
    protected
    Definition Classes
    Logging
  69. def logTrace(msg: ⇒ String, throwable: Throwable): Unit

    Permalink
    Attributes
    protected
    Definition Classes
    Logging
  70. def logTrace(msg: ⇒ String): Unit

    Permalink
    Attributes
    protected
    Definition Classes
    Logging
  71. def logWarning(msg: ⇒ String, throwable: Throwable): Unit

    Permalink
    Attributes
    protected
    Definition Classes
    Logging
  72. def logWarning(msg: ⇒ String): Unit

    Permalink
    Attributes
    protected
    Definition Classes
    Logging
  73. def makeCopy(newArgs: Array[AnyRef]): SparkPlan

    Permalink

    Overridden makeCopy also propagates sqlContext to the copied plan.

    Definition Classes
    SparkPlan → TreeNode
  74. def map[A](f: (SparkPlan) ⇒ A): Seq[A]

    Permalink
    Definition Classes
    TreeNode
  75. def mapChildren(f: (SparkPlan) ⇒ SparkPlan): SparkPlan

    Permalink
    Definition Classes
    TreeNode
  76. def mapProductIterator[B](f: (Any) ⇒ B)(implicit arg0: ClassTag[B]): Array[B]

    Permalink
    Attributes
    protected
    Definition Classes
    TreeNode
  77. def metricTerm(ctx: CodegenContext, name: String): String

    Permalink

    Creates a metric using the specified name.

    returns

    name of the variable representing the metric

    Definition Classes
    CodegenSupport
  78. def missingInput: AttributeSet

    Permalink
    Definition Classes
    QueryPlan
  79. final def ne(arg0: AnyRef): Boolean

    Permalink
    Definition Classes
    AnyRef
  80. def newMutableProjection(expressions: Seq[Expression], inputSchema: Seq[Attribute], useSubexprElimination: Boolean = false): MutableProjection

    Permalink
    Attributes
    protected
    Definition Classes
    SparkPlan
  81. def newNaturalAscendingOrdering(dataTypes: Seq[DataType]): Ordering[InternalRow]

    Permalink

    Creates a row ordering for the given schema, in natural ascending order.

    Attributes
    protected
    Definition Classes
    SparkPlan
  82. def newOrdering(order: Seq[SortOrder], inputSchema: Seq[Attribute]): Ordering[InternalRow]

    Permalink
    Attributes
    protected
    Definition Classes
    SparkPlan
  83. def newPredicate(expression: Expression, inputSchema: Seq[Attribute]): (InternalRow) ⇒ Boolean

    Permalink
    Attributes
    protected
    Definition Classes
    SparkPlan
  84. def nodeName: String

    Permalink
    Definition Classes
    TreeNode
  85. final def notify(): Unit

    Permalink
    Definition Classes
    AnyRef
  86. final def notifyAll(): Unit

    Permalink
    Definition Classes
    AnyRef
  87. def numberedTreeString: String

    Permalink
    Definition Classes
    TreeNode
  88. val origin: Origin

    Permalink
    Definition Classes
    TreeNode
  89. def otherCopyArgs: Seq[AnyRef]

    Permalink
    Attributes
    protected
    Definition Classes
    TreeNode
  90. def output: Seq[Attribute]

    Permalink
    Definition Classes
    HashAggregateExec → QueryPlan
  91. def outputOrdering: Seq[SortOrder]

    Permalink

    Specifies how data is ordered in each partition.

    Definition Classes
    SparkPlan
  92. def outputPartitioning: Partitioning

    Permalink

    Specifies how data is partitioned across different nodes in the cluster.

    Definition Classes
    UnaryExecNode → SparkPlan
  93. def outputSet: AttributeSet

    Permalink
    Definition Classes
    QueryPlan
  94. var parent: CodegenSupport

    Permalink

    The SparkPlan that is calling produce() of this one; for the first SparkPlan, it is the plan itself.

    Attributes
    protected
    Definition Classes
    CodegenSupport
  95. final def prepare(): Unit

    Permalink

    Prepare a SparkPlan for execution. It's idempotent.

    Definition Classes
    SparkPlan
  96. def prepareSubqueries(): Unit

    Permalink

    Finds scalar subquery expressions in this plan node and starts evaluating them. The subqueries are added to subqueryResults.

    Attributes
    protected
    Definition Classes
    SparkPlan
  97. def prettyJson: String

    Permalink
    Definition Classes
    TreeNode
  98. def printSchema(): Unit

    Permalink
    Definition Classes
    QueryPlan
  99. final def produce(ctx: CodegenContext, parent: CodegenSupport): String

    Permalink

    Returns Java source code to process the rows from the input RDD.

    Definition Classes
    CodegenSupport
  100. def producedAttributes: AttributeSet

    Permalink
    Definition Classes
    HashAggregateExec → QueryPlan
  101. def references: AttributeSet

    Permalink
    Definition Classes
    QueryPlan
  102. def requiredChildDistribution: List[Distribution]

    Permalink

    Specifies any partition requirements on the input data for this operator.

    Definition Classes
    HashAggregateExec → SparkPlan
  103. val requiredChildDistributionExpressions: Option[Seq[Expression]]

    Permalink
  104. def requiredChildOrdering: Seq[Seq[SortOrder]]

    Permalink

    Specifies the required sort order, within each partition, of the input data for this operator.

    Definition Classes
    SparkPlan
  105. lazy val resultExpressions: Seq[NamedExpression]

    Permalink
  106. def sameResult(plan: SparkPlan): Boolean

    Permalink
    Definition Classes
    QueryPlan
  107. lazy val schema: StructType

    Permalink
    Definition Classes
    QueryPlan
  108. def schemaString: String

    Permalink
    Definition Classes
    QueryPlan
  109. def simpleString: String

    Permalink
    Definition Classes
    HashAggregateExec → QueryPlan → TreeNode
  110. def sparkContext: SparkContext

    Permalink
    Attributes
    protected
    Definition Classes
    SparkPlan
  111. final val sqlContext: SQLContext

    Permalink

    A handle to the SQL Context that was used to create this plan. Since many operators need access to the sqlContext for RDD operations or configuration, this field is automatically populated by the query planning infrastructure.

    Definition Classes
    SparkPlan
  112. def statePrefix: String

    Permalink
    Attributes
    protected
    Definition Classes
    QueryPlan
  113. def stringArgs: Iterator[Any]

    Permalink
    Attributes
    protected
    Definition Classes
    TreeNode
  114. val subexpressionEliminationEnabled: Boolean

    Permalink
    Definition Classes
    SparkPlan
  115. def subqueries: Seq[SparkPlan]

    Permalink
    Definition Classes
    QueryPlan
  116. def supportCodegen: Boolean

    Permalink

    Whether this SparkPlan supports whole-stage codegen or not.

    Definition Classes
    HashAggregateExec → CodegenSupport
  117. final def synchronized[T0](arg0: ⇒ T0): T0

    Permalink
    Definition Classes
    AnyRef
  118. def toJSON: String

    Permalink
    Definition Classes
    TreeNode
  119. def toString(): String

    Permalink
    Definition Classes
    TreeNode → AnyRef → Any
  120. def transform(rule: PartialFunction[SparkPlan, SparkPlan]): SparkPlan

    Permalink
    Definition Classes
    TreeNode
  121. def transformAllExpressions(rule: PartialFunction[Expression, Expression]): HashAggregateExec.this.type

    Permalink
    Definition Classes
    QueryPlan
  122. def transformChildren(rule: PartialFunction[SparkPlan, SparkPlan], nextOperation: (SparkPlan, PartialFunction[SparkPlan, SparkPlan]) ⇒ SparkPlan): SparkPlan

    Permalink
    Attributes
    protected
    Definition Classes
    TreeNode
  123. def transformDown(rule: PartialFunction[SparkPlan, SparkPlan]): SparkPlan

    Permalink
    Definition Classes
    TreeNode
  124. def transformExpressions(rule: PartialFunction[Expression, Expression]): HashAggregateExec.this.type

    Permalink
    Definition Classes
    QueryPlan
  125. def transformExpressionsDown(rule: PartialFunction[Expression, Expression]): HashAggregateExec.this.type

    Permalink
    Definition Classes
    QueryPlan
  126. def transformExpressionsUp(rule: PartialFunction[Expression, Expression]): HashAggregateExec.this.type

    Permalink
    Definition Classes
    QueryPlan
  127. def transformUp(rule: PartialFunction[SparkPlan, SparkPlan]): SparkPlan

    Permalink
    Definition Classes
    TreeNode
  128. def treeString(verbose: Boolean): String

    Permalink
    Definition Classes
    TreeNode
  129. def treeString: String

    Permalink
    Definition Classes
    TreeNode
  130. def usedInputs: AttributeSet

    Permalink

    The subset of inputSet that should be evaluated before this plan.

    We will use this to insert some code to access those columns that are actually used by the current plan before calling doConsume().

    Definition Classes
    HashAggregateExec → CodegenSupport
  131. def validConstraints: Set[Expression]

    Permalink
    Attributes
    protected
    Definition Classes
    QueryPlan
  132. def verboseString: String

    Permalink
    Definition Classes
    HashAggregateExec → QueryPlan → TreeNode
  133. final def wait(): Unit

    Permalink
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  134. final def wait(arg0: Long, arg1: Int): Unit

    Permalink
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  135. final def wait(arg0: Long): Unit

    Permalink
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  136. def waitForSubqueries(): Unit

    Permalink

    Blocks the thread until all subqueries finish evaluation and update the results.

    Attributes
    protected
    Definition Classes
    SparkPlan
  137. def withNewChildren(newChildren: Seq[SparkPlan]): SparkPlan

    Permalink
    Definition Classes
    TreeNode
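
The produce()/consume() members above implement the whole-stage-codegen handshake between a plan and the operator that consumes its output: produce() asks a node for the code of its data-producing loop, and each produced row is pushed upward through consume(), which delegates to the parent's doConsume(). The sketch below only mirrors that call pattern; MiniCodegen, MiniScan, MiniFilter and MiniDemo are hypothetical names, not Spark's actual API.

    // Hypothetical, heavily simplified illustration of the produce()/consume()
    // call pattern; none of these types exist in Spark.
    trait MiniCodegen {
      protected var parent: MiniCodegen = null

      // The parent asks this node to generate its processing loop.
      final def produce(parent: MiniCodegen): String = {
        this.parent = parent
        doProduce()
      }

      // This node pushes one generated "row" of code up to its parent.
      final def consume(row: String): String = parent.doConsume(row)

      protected def doProduce(): String
      protected def doConsume(row: String): String = row
    }

    // A leaf operator: generates a scan loop and pushes each row upward.
    class MiniScan extends MiniCodegen {
      protected def doProduce(): String =
        "for (row <- input) {\n  " + consume("row") + "\n}"
    }

    // A filter operator: wraps the consumed row in a predicate check.
    class MiniFilter(child: MiniCodegen) extends MiniCodegen {
      protected def doProduce(): String = child.produce(this)
      override protected def doConsume(row: String): String =
        "if (pred(" + row + ")) {\n  " + consume(row) + "\n}"
    }

    object MiniDemo extends App {
      // The root simply emits the row; printing the result shows the fused
      // loop structure, analogous to what WholeStageCodegenExec assembles.
      val root = new MiniCodegen {
        protected def doProduce(): String = ""
        override protected def doConsume(row: String): String = "emit(" + row + ")"
      }
      println(new MiniFilter(new MiniScan).produce(root))
    }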

Inherited from CodegenSupport

Inherited from UnaryExecNode

Inherited from SparkPlan

Inherited from Serializable

Inherited from Serializable

Inherited from Logging

Inherited from QueryPlan[SparkPlan]

Inherited from TreeNode[SparkPlan]

Inherited from Product

Inherited from Equals

Inherited from AnyRef

Inherited from Any
