org.apache.spark.sql.execution.python

FlatMapGroupsInPandasWithStateExec

case class FlatMapGroupsInPandasWithStateExec(functionExpr: Expression, groupingAttributes: Seq[Attribute], outAttributes: Seq[Attribute], stateType: StructType, stateInfo: Option[StatefulOperatorStateInfo], stateFormatVersion: Int, outputMode: OutputMode, timeoutConf: GroupStateTimeout, batchTimestampMs: Option[Long], eventTimeWatermarkForLateEvents: Option[Long], eventTimeWatermarkForEviction: Option[Long], child: SparkPlan) extends SparkPlan with UnaryExecNode with FlatMapGroupsWithStateExecBase with Product with Serializable

Physical operator for executing org.apache.spark.sql.catalyst.plans.logical.FlatMapGroupsInPandasWithState

functionExpr: function called on each group
groupingAttributes: used to group the data
outAttributes: used to define the output rows
stateType: used to serialize/deserialize state before calling functionExpr
stateInfo: StatefulOperatorStateInfo to identify the state store for a given operator
stateFormatVersion: the version of the state format
outputMode: the output mode of functionExpr
timeoutConf: used to time out groups that have not received data in a while
batchTimestampMs: processing timestamp of the current batch
eventTimeWatermarkForLateEvents: event time watermark for filtering late events
eventTimeWatermarkForEviction: event time watermark for state eviction
child: physical plan of the underlying data
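This operator is reached from PySpark's GroupedData.applyInPandasWithState (Spark 3.4+); there is no direct Scala entry point, since the grouped function is a pandas UDF. Below is a minimal sketch, assuming a rate source and illustrative column names; the outputStructType, stateStructType, outputMode, and timeoutConf arguments correspond to the outAttributes, stateType, outputMode, and timeoutConf parameters above.

    import pandas as pd
    from pyspark.sql import SparkSession
    from pyspark.sql.streaming.state import GroupStateTimeout

    spark = SparkSession.builder.getOrCreate()
    # Illustrative streaming source; any streaming DataFrame with an "id" column works
    events = spark.readStream.format("rate").load().withColumnRenamed("value", "id")

    def count_events(key, pdf_iter, state):
        # state holds one row shaped like stateStructType ("count long"), as a tuple
        running = state.get[0] if state.exists else 0
        for pdf in pdf_iter:
            running += len(pdf)
        state.update((running,))
        yield pd.DataFrame({"id": [key[0]], "count": [running]})

    counts = events.groupBy("id").applyInPandasWithState(
        count_events,
        outputStructType="id long, count long",   # -> outAttributes
        stateStructType="count long",             # -> stateType
        outputMode="update",                      # -> outputMode
        timeoutConf=GroupStateTimeout.NoTimeout,  # -> timeoutConf
    )

Calling counts.explain() should show the logical FlatMapGroupsInPandasWithState node that this physical operator executes.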

Linear Supertypes
FlatMapGroupsWithStateExecBase, WatermarkSupport, StateStoreWriter, PythonSQLMetrics, StatefulOperator, UnaryExecNode, UnaryLike[SparkPlan], SparkPlan, Serializable, Logging, QueryPlan[SparkPlan], SQLConfHelper, TreeNode[SparkPlan], TreePatternBits, Product, Equals, AnyRef, Any

Instance Constructors

  1. new FlatMapGroupsInPandasWithStateExec(functionExpr: Expression, groupingAttributes: Seq[Attribute], outAttributes: Seq[Attribute], stateType: StructType, stateInfo: Option[StatefulOperatorStateInfo], stateFormatVersion: Int, outputMode: OutputMode, timeoutConf: GroupStateTimeout, batchTimestampMs: Option[Long], eventTimeWatermarkForLateEvents: Option[Long], eventTimeWatermarkForEviction: Option[Long], child: SparkPlan)

    See the parameter documentation above.

Type Members

  1. abstract class InputProcessor extends AnyRef

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##: Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. lazy val allAttributes: AttributeSeq
    Definition Classes
    QueryPlan
  5. def apply(number: Int): TreeNode[_]
    Definition Classes
    TreeNode
  6. def applyRemovingRowsOlderThanWatermark(iter: Iterator[InternalRow], predicateDropRowByWatermark: BasePredicate): Iterator[InternalRow]
    Attributes
    protected
    Definition Classes
    StateStoreWriter
  7. def argString(maxFields: Int): String
    Definition Classes
    TreeNode
  8. def asCode: String
    Definition Classes
    TreeNode
  9. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  10. val batchTimestampMs: Option[Long]
  11. final lazy val canonicalized: SparkPlan
    Definition Classes
    QueryPlan
    Annotations
    @transient()
  12. val child: SparkPlan
    Definition Classes
FlatMapGroupsInPandasWithStateExec → WatermarkSupport → UnaryLike
  13. final lazy val children: Seq[SparkPlan]
    Definition Classes
    UnaryLike
    Annotations
    @transient()
  14. def cleanupResources(): Unit

Cleans up the resources used by the physical operator (if any). In general, all the resources should be cleaned up when the task finishes, but operators like SortMergeJoinExec and LimitExec may want eager cleanup to free up tight resources (e.g., memory).

    Attributes
    protected[sql]
    Definition Classes
    SparkPlan
  15. def clone(): SparkPlan
    Definition Classes
    TreeNode → AnyRef
  16. def collect[B](pf: PartialFunction[SparkPlan, B]): Seq[B]
    Definition Classes
    TreeNode
  17. def collectFirst[B](pf: PartialFunction[SparkPlan, B]): Option[B]
    Definition Classes
    TreeNode
  18. def collectLeaves(): Seq[SparkPlan]
    Definition Classes
    TreeNode
  19. def collectWithSubqueries[B](f: PartialFunction[SparkPlan, B]): Seq[B]
    Definition Classes
    QueryPlan
  20. def conf: SQLConf
    Definition Classes
    SparkPlan → SQLConfHelper
  21. final def containsAllPatterns(patterns: TreePattern*): Boolean
    Definition Classes
    TreePatternBits
  22. final def containsAnyPattern(patterns: TreePattern*): Boolean
    Definition Classes
    TreePatternBits
  23. lazy val containsChild: Set[TreeNode[_]]
    Definition Classes
    TreeNode
  24. final def containsPattern(t: TreePattern): Boolean
    Definition Classes
    TreePatternBits
    Annotations
    @inline()
  25. def copyTagsFrom(other: SparkPlan): Unit
    Definition Classes
    TreeNode
  26. def createInputProcessor(store: StateStore): InputProcessor
  27. def customStatefulOperatorMetrics: Seq[StatefulOperatorCustomMetric]

Set of stateful operator custom metrics. These are captured as part of the generic key-value map StateOperatorProgress.customMetrics. Stateful operators can extend this method to provide their own unique custom metrics.

    Attributes
    protected
    Definition Classes
    StateStoreWriter
  28. lazy val deterministic: Boolean
    Definition Classes
    QueryPlan
  29. def doCanonicalize(): SparkPlan
    Attributes
    protected
    Definition Classes
    QueryPlan
  30. def doExecute(): RDD[InternalRow]

Produces the result of the query as an RDD[InternalRow].

    Overridden by concrete implementations of SparkPlan.

    Attributes
    protected
    Definition Classes
FlatMapGroupsWithStateExecBase → SparkPlan
  31. def doExecuteBroadcast[T](): Broadcast[T]

Produces the result of the query as a broadcast variable.

    Overridden by concrete implementations of SparkPlan.

    Attributes
    protected[sql]
    Definition Classes
    SparkPlan
  32. def doExecuteColumnar(): RDD[ColumnarBatch]

Produces the result of the query as an RDD[ColumnarBatch] if supportsColumnar returns true. By convention the executor that creates a ColumnarBatch is responsible for closing it when it is no longer needed. This allows input formats to reuse batches if needed.

    Attributes
    protected
    Definition Classes
    SparkPlan
  33. def doExecuteWrite(writeFilesSpec: WriteFilesSpec): RDD[WriterCommitMessage]

Produces the result of the writes as an RDD[WriterCommitMessage].

    Overridden by concrete implementations of SparkPlan.

    Attributes
    protected
    Definition Classes
    SparkPlan
  34. def doPrepare(): Unit

Overridden by concrete implementations of SparkPlan. It is guaranteed to run before any execute of SparkPlan. This is helpful if we want to set up some state before executing the query, e.g., BroadcastHashJoin uses it to broadcast asynchronously.

    Attributes
    protected
    Definition Classes
    SparkPlan
    Note

The prepare method has already walked down the tree, so the implementation does not have to call children's prepare methods. This will only be called once, protected by this.

  35. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  36. val eventTimeWatermarkForEviction: Option[Long]

The watermark value for closing aggregates and evicting state. It is different from the late events filtering watermark (consider chained aggregations agg1 -> agg2: agg1 evicts state which will be effectively late against the eviction watermark but should not be late for agg2's late record filtering watermark). Thus agg1 and agg2 use the current batch watermark for state eviction but the previous batch watermark for late record filtering.

    Definition Classes
FlatMapGroupsInPandasWithStateExec → FlatMapGroupsWithStateExecBase → WatermarkSupport
  37. val eventTimeWatermarkForLateEvents: Option[Long]

The watermark value for filtering late events/records. This should be the previous batch state eviction watermark.

    Definition Classes
FlatMapGroupsInPandasWithStateExec → FlatMapGroupsWithStateExecBase → WatermarkSupport
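
    Both watermark fields are populated by the streaming engine from the query's event-time watermark; user code only declares the watermark on the input. A sketch under the same assumptions as the earlier example (reusing its events stream and count_events function, with the rate source's timestamp column as event time):

        from pyspark.sql.streaming.state import GroupStateTimeout

        # Declaring a watermark feeds both fields above: late-event filtering
        # uses the previous micro-batch's watermark, while state eviction and
        # event-time timeouts use the current one
        with_watermark = events.withWatermark("timestamp", "10 minutes")

        result = with_watermark.groupBy("id").applyInPandasWithState(
            count_events,
            outputStructType="id long, count long",
            stateStructType="count long",
            outputMode="append",
            timeoutConf=GroupStateTimeout.EventTimeTimeout,  # timeouts advance with the watermark
        )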
  38. final def execute(): RDD[InternalRow]

Returns the result of this query as an RDD[InternalRow] by delegating to doExecute after preparations.

    Concrete implementations of SparkPlan should override doExecute.

    Definition Classes
    SparkPlan
  39. final def executeBroadcast[T](): Broadcast[T]

Returns the result of this query as a broadcast variable by delegating to doExecuteBroadcast after preparations.

    Concrete implementations of SparkPlan should override doExecuteBroadcast.

    Definition Classes
    SparkPlan
  40. def executeCollect(): Array[InternalRow]

Runs this query returning the result as an array.

    Definition Classes
    SparkPlan
  41. def executeCollectPublic(): Array[Row]

Runs this query returning the result as an array, using external Row format.

    Definition Classes
    SparkPlan
  42. final def executeColumnar(): RDD[ColumnarBatch]

Returns the result of this query as an RDD[ColumnarBatch] by delegating to doExecuteColumnar after preparations.

Concrete implementations of SparkPlan should override doExecuteColumnar if supportsColumnar returns true.

    Definition Classes
    SparkPlan
  43. final def executeQuery[T](query: => T): T

Executes a query after preparing the query and adding query plan information to created RDDs for visualization.

    Attributes
    protected
    Definition Classes
    SparkPlan
  44. def executeTail(n: Int): Array[InternalRow]

Runs this query returning the last n rows as an array.

    This is modeled after RDD.take but never runs any job locally on the driver.

    Definition Classes
    SparkPlan
  45. def executeTake(n: Int): Array[InternalRow]

Runs this query returning the first n rows as an array.

    This is modeled after RDD.take but never runs any job locally on the driver.

    Definition Classes
    SparkPlan
  46. def executeToIterator(): Iterator[InternalRow]

Runs this query returning the result as an iterator of InternalRow.

    Definition Classes
    SparkPlan
    Note

    Triggers multiple jobs (one for each partition).

  47. def executeWrite(writeFilesSpec: WriteFilesSpec): RDD[WriterCommitMessage]

Returns the result of writes as an RDD[WriterCommitMessage] variable by delegating to doExecuteWrite after preparations.

    Concrete implementations of SparkPlan should override doExecuteWrite.

    Definition Classes
    SparkPlan
  48. def exists(f: (SparkPlan) => Boolean): Boolean
    Definition Classes
    TreeNode
  49. final def expressions: Seq[Expression]
    Definition Classes
    QueryPlan
  50. def fastEquals(other: TreeNode[_]): Boolean
    Definition Classes
    TreeNode
  51. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.Throwable])
  52. def find(f: (SparkPlan) => Boolean): Option[SparkPlan]
    Definition Classes
    TreeNode
  53. def flatMap[A](f: (SparkPlan) => TraversableOnce[A]): Seq[A]
    Definition Classes
    TreeNode
  54. def foreach(f: (SparkPlan) => Unit): Unit
    Definition Classes
    TreeNode
  55. def foreachUp(f: (SparkPlan) => Unit): Unit
    Definition Classes
    TreeNode
  56. def formattedNodeName: String
    Attributes
    protected
    Definition Classes
    QueryPlan
  57. val functionExpr: Expression
  58. def generateTreeString(depth: Int, lastChildren: Seq[Boolean], append: (String) => Unit, verbose: Boolean, prefix: String, addSuffix: Boolean, maxFields: Int, printNodeId: Boolean, indent: Int): Unit
    Definition Classes
    TreeNode
  59. final def getClass(): Class[_ <: AnyRef]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  60. def getDefaultTreePatternBits: BitSet
    Attributes
    protected
    Definition Classes
    TreeNode
  61. def getProgress(): StateOperatorProgress

Get the progress made by this stateful operator after execution. This should be called in the driver after this SparkPlan has been executed and metrics have been updated.

    Definition Classes
    StateStoreWriter
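
    The returned StateOperatorProgress also appears in each StreamingQueryProgress. A sketch of reading it from PySpark, reusing the counts stream from the first example (the console sink is illustrative):

        query = (counts.writeStream
                 .format("console")
                 .outputMode("update")
                 .start())

        progress = query.lastProgress  # dict form of StreamingQueryProgress, or None before the first batch
        if progress:
            for op in progress["stateOperators"]:
                # operatorName comes from shortName; customMetrics carries the
                # customStatefulOperatorMetrics described elsewhere in this listing
                print(op["operatorName"], op["numRowsUpdated"], op.get("customMetrics"))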
  62. def getStateInfo: StatefulOperatorStateInfo
    Attributes
    protected
    Definition Classes
    StatefulOperator
  63. def getTagValue[T](tag: TreeNodeTag[T]): Option[T]
    Definition Classes
    TreeNode
  64. val groupingAttributes: Seq[Attribute]
  65. val hasInitialState: Boolean
    Attributes
    protected
    Definition Classes
FlatMapGroupsInPandasWithStateExec → FlatMapGroupsWithStateExecBase
  66. def hashCode(): Int
    Definition Classes
    TreeNode → AnyRef → Any
  67. val id: Int
    Definition Classes
    SparkPlan
  68. val initialState: SparkPlan
    Attributes
    protected
    Definition Classes
FlatMapGroupsInPandasWithStateExec → FlatMapGroupsWithStateExecBase
  69. val initialStateDataAttrs: Seq[Attribute]
    Attributes
    protected
    Definition Classes
FlatMapGroupsInPandasWithStateExec → FlatMapGroupsWithStateExecBase
  70. val initialStateDeserializer: Expression
    Attributes
    protected
    Definition Classes
FlatMapGroupsInPandasWithStateExec → FlatMapGroupsWithStateExecBase
  71. val initialStateGroupAttrs: Seq[Attribute]
    Attributes
    protected
    Definition Classes
FlatMapGroupsInPandasWithStateExec → FlatMapGroupsWithStateExecBase
  72. def initializeLogIfNecessary(isInterpreter: Boolean, silent: Boolean): Boolean
    Attributes
    protected
    Definition Classes
    Logging
  73. def initializeLogIfNecessary(isInterpreter: Boolean): Unit
    Attributes
    protected
    Definition Classes
    Logging
  74. def innerChildren: Seq[QueryPlan[_]]
    Definition Classes
    QueryPlan → TreeNode
  75. def inputSet: AttributeSet
    Definition Classes
    QueryPlan
  76. def isCanonicalizedPlan: Boolean
    Attributes
    protected
    Definition Classes
    QueryPlan
  77. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  78. def isRuleIneffective(ruleId: RuleId): Boolean
    Attributes
    protected
    Definition Classes
    TreeNode
  79. val isTimeoutEnabled: Boolean
    Attributes
    protected
    Definition Classes
    FlatMapGroupsWithStateExecBase
  80. def isTraceEnabled(): Boolean
    Attributes
    protected
    Definition Classes
    Logging
  81. def jsonFields: List[JField]
    Attributes
    protected
    Definition Classes
    TreeNode
  82. def keyExpressions: Seq[Attribute]

The keys that may have a watermark attribute.

    Definition Classes
FlatMapGroupsWithStateExecBase → WatermarkSupport
  83. final def legacyWithNewChildren(newChildren: Seq[SparkPlan]): SparkPlan
    Attributes
    protected
    Definition Classes
    TreeNode
  84. def log: Logger
    Attributes
    protected
    Definition Classes
    Logging
  85. def logDebug(msg: => String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  86. def logDebug(msg: => String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  87. def logError(msg: => String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  88. def logError(msg: => String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  89. def logInfo(msg: => String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  90. def logInfo(msg: => String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  91. def logName: String
    Attributes
    protected
    Definition Classes
    Logging
  92. def logTrace(msg: => String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  93. def logTrace(msg: => String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  94. def logWarning(msg: => String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  95. def logWarning(msg: => String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  96. def logicalLink: Option[LogicalPlan]

    returns

    The logical plan this plan is linked to.

    Definition Classes
    SparkPlan
  97. def longMetric(name: String): SQLMetric

    returns

    SQLMetric for the name.

    Definition Classes
    SparkPlan
  98. def makeCopy(newArgs: Array[AnyRef]): SparkPlan

Overridden makeCopy also propagates sqlContext to the copied plan.

    Definition Classes
    SparkPlan → TreeNode
  99. def map[A](f: (SparkPlan) => A): Seq[A]
    Definition Classes
    TreeNode
  100. final def mapChildren(f: (SparkPlan) => SparkPlan): SparkPlan
    Definition Classes
    UnaryLike
  101. def mapExpressions(f: (Expression) => Expression): FlatMapGroupsInPandasWithStateExec.this.type
    Definition Classes
    QueryPlan
  102. def mapProductIterator[B](f: (Any) => B)(implicit arg0: ClassTag[B]): Array[B]
    Attributes
    protected
    Definition Classes
    TreeNode
  103. def markRuleAsIneffective(ruleId: RuleId): Unit
    Attributes
    protected
    Definition Classes
    TreeNode
  104. lazy val metrics: Map[String, SQLMetric]

    returns

All the metrics of this SparkPlan.

    Definition Classes
    StateStoreWriter → PythonSQLMetrics → SparkPlan
  105. final def missingInput: AttributeSet
    Definition Classes
    QueryPlan
  106. def multiTransformDown(rule: PartialFunction[SparkPlan, Seq[SparkPlan]]): Stream[SparkPlan]
    Definition Classes
    TreeNode
  107. def multiTransformDownWithPruning(cond: (TreePatternBits) => Boolean, ruleId: RuleId)(rule: PartialFunction[SparkPlan, Seq[SparkPlan]]): Stream[SparkPlan]
    Definition Classes
    TreeNode
  108. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  109. def nodeName: String
    Definition Classes
    TreeNode
  110. val nodePatterns: Seq[TreePattern]
    Attributes
    protected
    Definition Classes
    TreeNode
  111. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  112. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  113. def numberedTreeString: String
    Definition Classes
    TreeNode
  114. val origin: Origin
    Definition Classes
    TreeNode
  115. def otherCopyArgs: Seq[AnyRef]
    Attributes
    protected
    Definition Classes
    TreeNode
  116. val outAttributes: Seq[Attribute]
  117. def output: Seq[Attribute]
    Definition Classes
    FlatMapGroupsInPandasWithStateExec → QueryPlan
  118. val outputMode: OutputMode
  119. def outputOrdering: Seq[SortOrder]
    Definition Classes
    QueryPlan
  120. def outputPartitioning: Partitioning

Specifies how data is partitioned across different nodes in the cluster. Note this method may fail if it is invoked before EnsureRequirements is applied, since PartitioningCollection requires all its partitionings to have the same number of partitions.

    Definition Classes
    SparkPlan
  121. lazy val outputSet: AttributeSet
    Definition Classes
    QueryPlan
    Annotations
    @transient()
  122. def p(number: Int): SparkPlan
    Definition Classes
    TreeNode
  123. final def prepare(): Unit

Prepares this SparkPlan for execution. It's idempotent.

    Definition Classes
    SparkPlan
  124. def prepareSubqueries(): Unit

Finds scalar subquery expressions in this plan node and starts evaluating them.

    Attributes
    protected
    Definition Classes
    SparkPlan
  125. def prettyJson: String
    Definition Classes
    TreeNode
  126. def printSchema(): Unit
    Definition Classes
    QueryPlan
  127. def processDataWithPartition(iter: Iterator[InternalRow], store: StateStore, processor: InputProcessor, initialStateIterOption: Option[Iterator[InternalRow]] = None): CompletionIterator[InternalRow, Iterator[InternalRow]]

Process data by applying the user-defined function on a per-partition basis.

iter: iterator of the data rows
store: associated state store for this partition
processor: handle to the input processor object
initialStateIterOption: optional initial state iterator

    Definition Classes
    FlatMapGroupsWithStateExecBase
  128. def producedAttributes: AttributeSet
    Definition Classes
    QueryPlan
  129. def productElementNames: Iterator[String]
    Definition Classes
    Product
  130. val pythonMetrics: Map[String, SQLMetric]
    Definition Classes
    PythonSQLMetrics
  131. lazy val references: AttributeSet
    Definition Classes
    QueryPlan
    Annotations
    @transient()
  132. def removeKeysOlderThanWatermark(storeManager: StreamingAggregationStateManager, store: StateStore): Unit
    Attributes
    protected
    Definition Classes
    WatermarkSupport
  133. def removeKeysOlderThanWatermark(store: StateStore): Unit
    Attributes
    protected
    Definition Classes
    WatermarkSupport
  134. def requiredChildDistribution: Seq[Distribution]

Distribute by grouping attributes. We need the underlying data and the initial state data to have the same grouping so that the data are co-located on the same task.

    Definition Classes
FlatMapGroupsInPandasWithStateExec → FlatMapGroupsWithStateExecBase → SparkPlan
  135. def requiredChildOrdering: Seq[Seq[SortOrder]]

Ordering needed for using GroupingIterator. We need the initial state to use the same ordering as the data so that we can co-locate the keys from the underlying data and the initial state.

    Definition Classes
FlatMapGroupsInPandasWithStateExec → FlatMapGroupsWithStateExecBase → SparkPlan
  136. def resetMetrics(): Unit

Resets all the metrics.

    Definition Classes
    SparkPlan
  137. def rewriteAttrs(attrMap: AttributeMap[Attribute]): SparkPlan
    Definition Classes
    QueryPlan
  138. final def sameResult(other: SparkPlan): Boolean
    Definition Classes
    QueryPlan
  139. lazy val schema: StructType
    Definition Classes
    QueryPlan
  140. def schemaString: String
    Definition Classes
    QueryPlan
  141. final def semanticHash(): Int
    Definition Classes
    QueryPlan
  142. final val session: SparkSession
    Definition Classes
    SparkPlan
  143. def setLogicalLink(logicalPlan: LogicalPlan): Unit

Set logical plan link recursively if unset.

    Definition Classes
    SparkPlan
  144. def setOperatorMetrics(numStateStoreInstances: Int = 1): Unit

Set the operator-level metrics.

    Attributes
    protected
    Definition Classes
    StateStoreWriter
  145. def setStoreMetrics(store: StateStore): Unit

Set the SQL metrics related to the state store. This should be called in the task after the store has been updated.

    Attributes
    protected
    Definition Classes
    StateStoreWriter
  146. def setTagValue[T](tag: TreeNodeTag[T], value: T): Unit
    Definition Classes
    TreeNode
  147. def shortName: String

Name to output in StateOperatorProgress to identify the operator type.

    Definition Classes
FlatMapGroupsInPandasWithStateExec → FlatMapGroupsWithStateExecBase → StateStoreWriter
  148. def shouldRunAnotherBatch(newMetadata: OffsetSeqMetadata): Boolean

Should the MicroBatchExecution run another batch based on this stateful operator and the current updated metadata.

    Definition Classes
FlatMapGroupsWithStateExecBase → StateStoreWriter
  149. def simpleString(maxFields: Int): String
    Definition Classes
    QueryPlan → TreeNode
  150. def simpleStringWithNodeId(): String
    Definition Classes
    QueryPlan → TreeNode
  151. def sparkContext: SparkContext
    Attributes
    protected
    Definition Classes
    SparkPlan
  152. val stateEncoder: ExpressionEncoder[Any]
    Attributes
    protected
    Definition Classes
FlatMapGroupsInPandasWithStateExec → FlatMapGroupsWithStateExecBase
  153. val stateFormatVersion: Int
  154. val stateInfo: Option[StatefulOperatorStateInfo]
  155. lazy val stateManager: StateManager
  156. def statePrefix: String
    Attributes
    protected
    Definition Classes
    QueryPlan
  157. val stateType: StructType
  158. def stringArgs: Iterator[Any]
    Attributes
    protected
    Definition Classes
    TreeNode
  159. lazy val subqueries: Seq[SparkPlan]
    Definition Classes
    QueryPlan
    Annotations
    @transient()
  160. def subqueriesAll: Seq[SparkPlan]
    Definition Classes
    QueryPlan
  161. def supportsColumnar: Boolean

Return true if this stage of the plan supports columnar execution. A plan can also support row-based execution (see supportsRowBased). Spark will decide which execution to call during query planning.

    Definition Classes
    SparkPlan
  162. def supportsRowBased: Boolean

Return true if this stage of the plan supports row-based execution. A plan can also support columnar execution (see supportsColumnar). Spark will decide which execution to call during query planning.

    Definition Classes
    SparkPlan
  163. final def synchronized[T0](arg0: => T0): T0
    Definition Classes
    AnyRef
  164. def timeTakenMs(body: => Unit): Long

Records the duration of running body for the next query progress update.

    Attributes
    protected
    Definition Classes
    StateStoreWriter
  165. val timeoutConf: GroupStateTimeout
  166. def toJSON: String
    Definition Classes
    TreeNode
  167. def toRowBased: SparkPlan

Converts the output of this plan to row-based if it is a columnar plan.

    Definition Classes
    SparkPlan
  168. def toString(): String
    Definition Classes
    TreeNode → AnyRef → Any
  169. def transform(rule: PartialFunction[SparkPlan, SparkPlan]): SparkPlan
    Definition Classes
    TreeNode
  170. def transformAllExpressions(rule: PartialFunction[Expression, Expression]): FlatMapGroupsInPandasWithStateExec.this.type
    Definition Classes
    QueryPlan
  171. def transformAllExpressionsWithPruning(cond: (TreePatternBits) => Boolean, ruleId: RuleId)(rule: PartialFunction[Expression, Expression]): FlatMapGroupsInPandasWithStateExec.this.type
    Definition Classes
    QueryPlan
  172. def transformDown(rule: PartialFunction[SparkPlan, SparkPlan]): SparkPlan
    Definition Classes
    TreeNode
  173. def transformDownWithPruning(cond: (TreePatternBits) => Boolean, ruleId: RuleId)(rule: PartialFunction[SparkPlan, SparkPlan]): SparkPlan
    Definition Classes
    TreeNode
  174. def transformDownWithSubqueries(f: PartialFunction[SparkPlan, SparkPlan]): SparkPlan
    Definition Classes
    QueryPlan
  175. def transformDownWithSubqueriesAndPruning(cond: (TreePatternBits) => Boolean, ruleId: RuleId)(f: PartialFunction[SparkPlan, SparkPlan]): SparkPlan
    Definition Classes
    QueryPlan
  176. def transformExpressions(rule: PartialFunction[Expression, Expression]): FlatMapGroupsInPandasWithStateExec.this.type
    Definition Classes
    QueryPlan
  177. def transformExpressionsDown(rule: PartialFunction[Expression, Expression]): FlatMapGroupsInPandasWithStateExec.this.type
    Definition Classes
    QueryPlan
  178. def transformExpressionsDownWithPruning(cond: (TreePatternBits) => Boolean, ruleId: RuleId)(rule: PartialFunction[Expression, Expression]): FlatMapGroupsInPandasWithStateExec.this.type
    Definition Classes
    QueryPlan
  179. def transformExpressionsUp(rule: PartialFunction[Expression, Expression]): FlatMapGroupsInPandasWithStateExec.this.type
    Definition Classes
    QueryPlan
  180. def transformExpressionsUpWithPruning(cond: (TreePatternBits) => Boolean, ruleId: RuleId)(rule: PartialFunction[Expression, Expression]): FlatMapGroupsInPandasWithStateExec.this.type
    Definition Classes
    QueryPlan
  181. def transformExpressionsWithPruning(cond: (TreePatternBits) => Boolean, ruleId: RuleId)(rule: PartialFunction[Expression, Expression]): FlatMapGroupsInPandasWithStateExec.this.type
    Definition Classes
    QueryPlan
  182. def transformUp(rule: PartialFunction[SparkPlan, SparkPlan]): SparkPlan
    Definition Classes
    TreeNode
  183. def transformUpWithBeforeAndAfterRuleOnChildren(cond: (SparkPlan) => Boolean, ruleId: RuleId)(rule: PartialFunction[(SparkPlan, SparkPlan), SparkPlan]): SparkPlan
    Definition Classes
    TreeNode
  184. def transformUpWithNewOutput(rule: PartialFunction[SparkPlan, (SparkPlan, Seq[(Attribute, Attribute)])], skipCond: (SparkPlan) => Boolean, canGetOutput: (SparkPlan) => Boolean): SparkPlan
    Definition Classes
    QueryPlan
  185. def transformUpWithPruning(cond: (TreePatternBits) => Boolean, ruleId: RuleId)(rule: PartialFunction[SparkPlan, SparkPlan]): SparkPlan
    Definition Classes
    TreeNode
  186. def transformUpWithSubqueries(f: PartialFunction[SparkPlan, SparkPlan]): SparkPlan
    Definition Classes
    QueryPlan
  187. def transformWithPruning(cond: (TreePatternBits) => Boolean, ruleId: RuleId)(rule: PartialFunction[SparkPlan, SparkPlan]): SparkPlan
    Definition Classes
    TreeNode
  188. def transformWithSubqueries(f: PartialFunction[SparkPlan, SparkPlan]): SparkPlan
    Definition Classes
    QueryPlan
  189. lazy val treePatternBits: BitSet
    Definition Classes
    QueryPlan → TreeNode → TreePatternBits
  190. def treeString(append: (String) => Unit, verbose: Boolean, addSuffix: Boolean, maxFields: Int, printOperatorId: Boolean): Unit
    Definition Classes
    TreeNode
  191. final def treeString(verbose: Boolean, addSuffix: Boolean, maxFields: Int, printOperatorId: Boolean): String
    Definition Classes
    TreeNode
  192. final def treeString: String
    Definition Classes
    TreeNode
  193. def unsetTagValue[T](tag: TreeNodeTag[T]): Unit
    Definition Classes
    TreeNode
  194. def updateOuterReferencesInSubquery(plan: SparkPlan, attrMap: AttributeMap[Attribute]): SparkPlan
    Attributes
    protected
    Definition Classes
    QueryPlan
  195. def vectorTypes: Option[Seq[String]]

The exact Java types of the columns that are output in columnar processing mode. This is a performance optimization for code generation and is optional.

    Definition Classes
    SparkPlan
  196. def verboseString(maxFields: Int): String
    Definition Classes
    QueryPlan → TreeNode
  197. def verboseStringWithOperatorId(): String
    Definition Classes
    UnaryExecNode → QueryPlan
  198. def verboseStringWithSuffix(maxFields: Int): String
    Definition Classes
    TreeNode
  199. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException])
  200. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException])
  201. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException]) @native()
  202. def waitForSubqueries(): Unit

Blocks the thread until all subqueries finish evaluation and update the results.

    Attributes
    protected
    Definition Classes
    SparkPlan
  203. lazy val watermarkExpressionForEviction: Option[Expression]

Generate an expression that matches data older than the state eviction watermark.

    Definition Classes
    WatermarkSupport
  204. lazy val watermarkExpressionForLateEvents: Option[Expression]

Generate an expression that matches data older than the late event filtering watermark.

    Definition Classes
    WatermarkSupport
  205. lazy val watermarkPredicateForDataForEviction: Option[BasePredicate]
    Definition Classes
    WatermarkSupport
  206. lazy val watermarkPredicateForDataForLateEvents: Option[BasePredicate]

Predicate based on the child output that matches data older than the watermark for late events filtering.

    Definition Classes
    WatermarkSupport
  207. lazy val watermarkPredicateForKeysForEviction: Option[BasePredicate]

Predicate based on keys that matches data older than the state eviction watermark.

    Definition Classes
    WatermarkSupport
  208. lazy val watermarkPredicateForKeysForLateEvents: Option[BasePredicate]

Predicate based on keys that matches data older than the late event filtering watermark.

    Definition Classes
    WatermarkSupport
  209. val watermarkPresent: Boolean
    Attributes
    protected
    Definition Classes
    FlatMapGroupsWithStateExecBase
  210. def withNewChildInternal(newChild: SparkPlan): FlatMapGroupsInPandasWithStateExec
    Attributes
    protected
    Definition Classes
    FlatMapGroupsInPandasWithStateExec → UnaryLike
  211. final def withNewChildren(newChildren: Seq[SparkPlan]): SparkPlan
    Definition Classes
    TreeNode
  212. final def withNewChildrenInternal(newChildren: IndexedSeq[SparkPlan]): SparkPlan
    Definition Classes
    UnaryLike
