Packages

org.apache.spark.sql.execution.streaming

IncrementalExecution

class IncrementalExecution extends QueryExecution with Logging

A variant of QueryExecution that allows the given LogicalPlan to be executed incrementally, possibly preserving state in between executions.
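A hedged sketch of how such an execution might be driven, one instance per micro-batch (the surrounding driver machinery is simplified; everything except the constructor parameters listed below is illustrative):

```scala
import java.util.UUID

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
import org.apache.spark.sql.execution.streaming.{IncrementalExecution, OffsetSeqMetadata}
import org.apache.spark.sql.streaming.OutputMode

// Sketch: a new IncrementalExecution is created for each micro-batch, reusing
// the same checkpoint location and runId so stateful operators can restore
// state written by the previous batch.
def planBatch(
    spark: SparkSession,
    batchPlan: LogicalPlan,     // logical plan with this batch's data substituted in
    batchId: Long,
    metadata: OffsetSeqMetadata): IncrementalExecution = {
  new IncrementalExecution(
    spark,
    batchPlan,
    OutputMode.Append(),
    "/tmp/checkpoints/demo",    // illustrative checkpoint location
    UUID.randomUUID(),          // queryId: in practice stable across restarts
    UUID.randomUUID(),          // runId: in practice fresh per run
    batchId,
    metadata)
}
```

Note that this class lives in `org.apache.spark.sql.execution.streaming` and is internal to Spark; end users would normally go through the `DataStreamWriter` API instead of constructing it directly.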

Linear Supertypes
Logging, QueryExecution, AnyRef, Any

Instance Constructors

  1. new IncrementalExecution(sparkSession: SparkSession, logicalPlan: LogicalPlan, outputMode: OutputMode, checkpointLocation: String, queryId: UUID, runId: UUID, currentBatchId: Long, offsetSeqMetadata: OffsetSeqMetadata)

Value Members

  1. object debug

    A special namespace for commands that can be used to debug query execution.

    Definition Classes
    QueryExecution
  2. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  3. final def ##(): Int
    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  5. lazy val analyzed: LogicalPlan
    Definition Classes
    QueryExecution
  6. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  7. def assertAnalyzed(): Unit
    Definition Classes
    QueryExecution
  8. def assertSupported(): Unit

    No need to assert supported, as this check has already been done.

    Definition Classes
    IncrementalExecution → QueryExecution
  9. val checkpointLocation: String
  10. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  11. val currentBatchId: Long
  12. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  13. def equals(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  14. def executePhase[T](phase: String)(block: ⇒ T): T
    Attributes
    protected
    Definition Classes
    QueryExecution
  15. lazy val executedPlan: SparkPlan
    Definition Classes
    QueryExecution
  16. def explainString(mode: ExplainMode): String
    Definition Classes
    QueryExecution
  17. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  18. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  19. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  20. def initializeLogIfNecessary(isInterpreter: Boolean, silent: Boolean): Boolean
    Attributes
    protected
    Definition Classes
    Logging
  21. def initializeLogIfNecessary(isInterpreter: Boolean): Unit
    Attributes
    protected
    Definition Classes
    Logging
  22. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  23. def isTraceEnabled(): Boolean
    Attributes
    protected
    Definition Classes
    Logging
  24. def log: Logger
    Attributes
    protected
    Definition Classes
    Logging
  25. def logDebug(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  26. def logDebug(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  27. def logError(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  28. def logError(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  29. def logInfo(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  30. def logInfo(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  31. def logName: String
    Attributes
    protected
    Definition Classes
    Logging
  32. def logTrace(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  33. def logTrace(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  34. def logWarning(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  35. def logWarning(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  36. val logical: LogicalPlan
    Definition Classes
    QueryExecution
  37. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  38. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  39. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  40. def observedMetrics: Map[String, Row]

    Get the metrics observed during the execution of the query plan.

    Definition Classes
    QueryExecution
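As an aside, these metrics come from observations registered on a Dataset. A minimal sketch, assuming a local SparkSession and using the public `Dataset.observe` API (available since Spark 3.0); the names `stats`, `rows`, and `total` are illustrative:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

// Metrics registered via Dataset.observe surface in
// QueryExecution.observedMetrics once the plan has executed.
val spark = SparkSession.builder().master("local[*]").appName("observe").getOrCreate()
import spark.implicits._

val ds = Seq(1, 2, 3, 4).toDS()
  .observe("stats", count(lit(1)).as("rows"), sum($"value").as("total"))

ds.collect()  // run the plan so the metrics are actually collected

// After execution, the map should contain a Row of (rows, total) under "stats".
val metrics = ds.queryExecution.observedMetrics
```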
  41. val offsetSeqMetadata: OffsetSeqMetadata
  42. lazy val optimizedPlan: LogicalPlan

    See [SPARK-18339]: walks the optimized logical plan and replaces CurrentBatchTimestamp with the desired literal.

    Definition Classes
    IncrementalExecution → QueryExecution
  43. val outputMode: OutputMode
  44. val planner: SparkPlanner
    Definition Classes
    IncrementalExecution → QueryExecution
  45. def preparations: Seq[Rule[SparkPlan]]
    Definition Classes
    IncrementalExecution → QueryExecution
  46. val queryId: UUID
  47. val runId: UUID
  48. def shouldRunAnotherBatch(newMetadata: OffsetSeqMetadata): Boolean

    Should the MicroBatchExecution run another batch based on this execution and the current updated metadata.

  49. def simpleString(formatted: Boolean): String
    Definition Classes
    QueryExecution
  50. def simpleString: String
    Definition Classes
    QueryExecution
  51. lazy val sparkPlan: SparkPlan
    Definition Classes
    QueryExecution
  52. val sparkSession: SparkSession
    Definition Classes
    QueryExecution
  53. val state: Rule[SparkPlan]

    Locates save/restore pairs surrounding aggregation.

  54. def stringWithStats: String
    Definition Classes
    QueryExecution
  55. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  56. lazy val toRdd: RDD[InternalRow]

    Internal version of the RDD. Avoids copies and has no schema. Note for callers: Spark may apply various optimizations, including object reuse, which means a row is valid only for the iteration in which it is retrieved. Avoid storing rows and accessing them after iteration (calling collect() is one known bad usage). If you want to store these rows in a collection, apply a converter or copy each row so that a new object is produced per iteration. Given that QueryExecution is not a public class, end users are discouraged from using this: please use Dataset.rdd instead, where the conversion will be applied.

    Definition Classes
    QueryExecution
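The caveat above can be illustrated with a short sketch, assuming a local SparkSession; `InternalRow.copy()` produces a fresh object per iteration, while `Dataset.rdd` is the public, already-converted alternative:

```scala
import org.apache.spark.sql.SparkSession

// toRdd rows may be reused across iterations, so copy them before storing;
// end users should prefer Dataset.rdd, which handles the conversion itself.
val spark = SparkSession.builder().master("local[*]").appName("toRdd").getOrCreate()
import spark.implicits._

val df = Seq(("a", 1), ("b", 2)).toDF("k", "v")

// Internal API: copy each InternalRow before collecting into a collection,
// otherwise all collected entries may point at the same reused object.
val internalRows = df.queryExecution.toRdd.map(_.copy()).collect()

// Public API: the safe equivalent for end users.
val rows = df.rdd.collect()
```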
  57. def toString(): String
    Definition Classes
    QueryExecution → AnyRef → Any
  58. val tracker: QueryPlanningTracker
    Definition Classes
    QueryExecution
  59. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  60. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  61. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  62. lazy val withCachedData: LogicalPlan
    Definition Classes
    QueryExecution

Inherited from Logging

Inherited from QueryExecution

Inherited from AnyRef

Inherited from Any
