org.apache.spark.sql.execution.streaming

IncrementalExecution

class IncrementalExecution extends QueryExecution with Logging

A variant of QueryExecution that allows the given LogicalPlan to be executed incrementally, possibly preserving state in between each execution.

Linear Supertypes
QueryExecution, Logging, AnyRef, Any

Instance Constructors

  1. new IncrementalExecution(sparkSession: SparkSession, logicalPlan: LogicalPlan, outputMode: OutputMode, checkpointLocation: String, queryId: UUID, runId: UUID, currentBatchId: Long, prevOffsetSeqMetadata: Option[OffsetSeqMetadata], offsetSeqMetadata: OffsetSeqMetadata, watermarkPropagator: WatermarkPropagator, isFirstBatch: Boolean)
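
IncrementalExecution is internal to Structured Streaming and is normally constructed by MicroBatchExecution once per micro-batch. The sketch below only illustrates how the constructor parameters listed above fit together; the planBatch helper, the placeholder UUIDs, and the WatermarkPropagator.noop() call are assumptions made for illustration, not how Spark itself wires the object.

    import java.util.UUID

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
    import org.apache.spark.sql.execution.streaming.{IncrementalExecution, OffsetSeqMetadata, WatermarkPropagator}
    import org.apache.spark.sql.streaming.OutputMode

    // Hypothetical helper: in a real query these values come from the running MicroBatchExecution.
    def planBatch(
        spark: SparkSession,
        batchPlan: LogicalPlan,            // logical plan with sources already replaced by this batch's data
        checkpointLocation: String,
        batchId: Long,
        metadata: OffsetSeqMetadata): IncrementalExecution = {
      new IncrementalExecution(
        spark,
        batchPlan,
        OutputMode.Append(),
        checkpointLocation,
        queryId = UUID.randomUUID(),       // stable per query in a real run
        runId = UUID.randomUUID(),         // regenerated on every restart in a real run
        currentBatchId = batchId,
        prevOffsetSeqMetadata = None,      // metadata of the previous batch, if one exists
        offsetSeqMetadata = metadata,
        watermarkPropagator = WatermarkPropagator.noop(),  // assumption: the no-op propagator factory
        isFirstBatch = batchId == 0L)
    }

The resulting object behaves like any other QueryExecution: forcing executedPlan applies the rules returned by preparations, including the state-related partial rules specific to IncrementalExecution.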

Type Members

  1. implicit class LogStringContext extends AnyRef
    Definition Classes
    Logging
  2. sealed trait SparkPlanPartialRule extends AnyRef

Value Members

  1. object debug

    A special namespace for commands that can be used to debug query execution.

    Definition Classes
    QueryExecution
  2. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  3. final def ##: Int
    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  5. lazy val analyzed: LogicalPlan
    Definition Classes
    QueryExecution
  6. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  7. def assertAnalyzed(): Unit

    No need to try-catch again, as this is already done once.

    Definition Classes
    IncrementalExecution → QueryExecution
  8. def assertCommandExecuted(): Unit
    Definition Classes
    QueryExecution
  9. def assertExecutedPlanPrepared(): Unit
    Definition Classes
    QueryExecution
  10. def assertOptimized(): Unit
    Definition Classes
    QueryExecution
  11. def assertSparkPlanPrepared(): Unit
    Definition Classes
    QueryExecution
  12. def assertSupported(): Unit

    No need to assert supported, as this check has already been done.

    Definition Classes
    IncrementalExecution → QueryExecution
  13. val checkpointLocation: String
  14. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.CloneNotSupportedException]) @IntrinsicCandidate() @native()
  15. lazy val commandExecuted: LogicalPlan
    Definition Classes
    QueryExecution
  16. val currentBatchId: Long
  17. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  18. def equals(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef → Any
  19. def executePhase[T](phase: String)(block: => T): T
    Attributes
    protected
    Definition Classes
    QueryExecution
  20. lazy val executedPlan: SparkPlan
    Definition Classes
    QueryExecution
  21. def explainString(mode: ExplainMode): String
    Definition Classes
    QueryExecution
  22. def extendedExplainInfo(append: (String) => Unit, plan: SparkPlan): Unit
    Definition Classes
    QueryExecution
  23. final def getClass(): Class[_ <: AnyRef]
    Definition Classes
    AnyRef → Any
    Annotations
    @IntrinsicCandidate() @native()
  24. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @IntrinsicCandidate() @native()
  25. val id: Long
    Definition Classes
    QueryExecution
  26. def initializeLogIfNecessary(isInterpreter: Boolean, silent: Boolean): Boolean
    Attributes
    protected
    Definition Classes
    Logging
  27. def initializeLogIfNecessary(isInterpreter: Boolean): Unit
    Attributes
    protected
    Definition Classes
    Logging
  28. val isFirstBatch: Boolean
  29. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  30. def isTraceEnabled(): Boolean
    Attributes
    protected
    Definition Classes
    Logging
  31. def log: Logger
    Attributes
    protected
    Definition Classes
    Logging
  32. def logDebug(msg: => String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  33. def logDebug(entry: LogEntry, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  34. def logDebug(entry: LogEntry): Unit
    Attributes
    protected
    Definition Classes
    Logging
  35. def logDebug(msg: => String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  36. def logError(msg: => String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  37. def logError(entry: LogEntry, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  38. def logError(entry: LogEntry): Unit
    Attributes
    protected
    Definition Classes
    Logging
  39. def logError(msg: => String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  40. def logInfo(msg: => String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  41. def logInfo(entry: LogEntry, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  42. def logInfo(entry: LogEntry): Unit
    Attributes
    protected
    Definition Classes
    Logging
  43. def logInfo(msg: => String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  44. def logName: String
    Attributes
    protected
    Definition Classes
    Logging
  45. def logTrace(msg: => String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  46. def logTrace(entry: LogEntry, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  47. def logTrace(entry: LogEntry): Unit
    Attributes
    protected
    Definition Classes
    Logging
  48. def logTrace(msg: => String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  49. def logWarning(msg: => String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  50. def logWarning(entry: LogEntry, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  51. def logWarning(entry: LogEntry): Unit
    Attributes
    protected
    Definition Classes
    Logging
  52. def logWarning(msg: => String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  53. val logical: LogicalPlan
    Definition Classes
    QueryExecution
  54. val mode: CommandExecutionMode.Value
    Definition Classes
    QueryExecution
  55. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  56. lazy val normalized: LogicalPlan
    Definition Classes
    QueryExecution
  57. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @IntrinsicCandidate() @native()
  58. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @IntrinsicCandidate() @native()
  59. def observedMetrics: Map[String, Row]

    Get the metrics observed during the execution of the query plan.

    Definition Classes
    QueryExecution
  60. val offsetSeqMetadata: OffsetSeqMetadata
  61. lazy val optimizedPlan: LogicalPlan

    See [SPARK-18339]: walk the optimized logical plan and replace CurrentBatchTimestamp with the desired literal.

    Definition Classes
    IncrementalExecution → QueryExecution
  62. val outputMode: OutputMode
  63. val planner: SparkPlanner
    Definition Classes
    IncrementalExecution → QueryExecution
  64. def preparations: Seq[Rule[SparkPlan]]
    Definition Classes
    IncrementalExecution → QueryExecution
  65. val prevOffsetSeqMetadata: Option[OffsetSeqMetadata]
  66. val queryId: UUID
  67. val runId: UUID
  68. def shouldRunAnotherBatch(newMetadata: OffsetSeqMetadata): Boolean

    Should the MicroBatchExecution run another batch based on this execution and the current updated metadata.

    This method simulates watermark propagation against the new batch (which has not been planned yet); this simulation is required to ask each stateful operator whether it needs another batch. (A caller-side sketch appears at the end of this page.)

  69. val shuffleCleanupMode: ShuffleCleanupMode
    Definition Classes
    QueryExecution
  70. def simpleString: String
    Definition Classes
    QueryExecution
  71. lazy val sparkPlan: SparkPlan
    Definition Classes
    QueryExecution
  72. val sparkSession: SparkSession
    Definition Classes
    QueryExecution
  73. val state: Rule[SparkPlan]
  74. def stringWithStats: String
    Definition Classes
    QueryExecution
  75. final def synchronized[T0](arg0: => T0): T0
    Definition Classes
    AnyRef
  76. lazy val toRdd: RDD[InternalRow]

    Internal version of the RDD. Avoids copies and has no schema. Note for callers: Spark may apply various optimizations, including object reuse, so a row is valid only for the iteration in which it is retrieved. Avoid storing rows and accessing them after the iteration (calling collect() is one known bad usage). If you want to store these rows in a collection, apply a converter or copy each row so that a new object is produced per iteration. Given that QueryExecution is not a public class, end users are discouraged from using this: please use Dataset.rdd instead, where the conversion will be applied. (A usage sketch follows this member list.)

    Definition Classes
    QueryExecution
  77. def toString(): String
    Definition Classes
    QueryExecution → AnyRef → Any
  78. val tracker: QueryPlanningTracker
    Definition Classes
    QueryExecution
  79. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException])
  80. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException]) @native()
  81. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException])
  82. val watermarkPropagator: WatermarkPropagator
  83. lazy val withCachedData: LogicalPlan
    Definition Classes
    QueryExecution
  84. def withLogContext(context: HashMap[String, String])(body: => Unit): Unit
    Attributes
    protected
    Definition Classes
    Logging
  85. object ConvertLocalLimitRule extends SparkPlanPartialRule
  86. object ShufflePartitionsRule extends SparkPlanPartialRule
  87. object StateOpIdRule extends SparkPlanPartialRule
  88. object WatermarkPropagationRule extends SparkPlanPartialRule
  89. object WriteStatefulOperatorMetadataRule extends SparkPlanPartialRule
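
A usage note for the inherited toRdd member above: end users should prefer Dataset.rdd, and any code that does reach for the internal RDD should copy each InternalRow before retaining it, because the same row object may be reused across iterations. The snippet below is a minimal, illustrative sketch of both points against an ordinary local DataFrame; the local[*] session setup and the sample data are assumptions.

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder().master("local[*]").getOrCreate()
    import spark.implicits._

    val df = Seq(("a", 1), ("b", 2)).toDF("key", "value")

    // Preferred for end users: Dataset.rdd applies the internal-to-external row conversion.
    val externalRows = df.rdd.collect()

    // If the internal RDD is used anyway, copy each InternalRow before retaining it,
    // because Spark may reuse the same row object across iterations.
    val internalRows = df.queryExecution.toRdd.map(_.copy()).collect()

    println(externalRows.length + " external rows, " + internalRows.length + " internal rows")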

Deprecated Value Members

  1. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.Throwable]) @Deprecated
    Deprecated

    (Since version 9)
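
A caller-side sketch for shouldRunAnotherBatch above: after a batch finishes, the driver builds the OffsetSeqMetadata the next batch would see and asks the last IncrementalExecution whether a stateful operator still needs another, possibly data-less, batch. This only illustrates the call shape; the updatedWatermarkMs value and the surrounding control flow are assumptions, not a reproduction of MicroBatchExecution's logic.

    import org.apache.spark.sql.execution.streaming.{IncrementalExecution, OffsetSeqMetadata}

    // Hypothetical inputs: the last completed execution and the watermark computed from it.
    def needsAnotherBatch(lastExecution: IncrementalExecution, updatedWatermarkMs: Long): Boolean = {
      val nextMetadata: OffsetSeqMetadata = lastExecution.offsetSeqMetadata.copy(
        batchWatermarkMs = updatedWatermarkMs,
        batchTimestampMs = System.currentTimeMillis())
      // True if any stateful operator (e.g. an aggregation in Append mode) still has state to emit or evict.
      lastExecution.shouldRunAnotherBatch(nextMetadata)
    }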
