org.apache.spark.sql.execution.streaming

IncrementalExecution

class IncrementalExecution extends QueryExecution with Logging

A variant of QueryExecution that allows the given LogicalPlan to be executed incrementally, possibly preserving state between executions.
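
The "incremental execution with carried state" idea can be sketched without Spark. This is a toy model only (`State`, `runBatch`, and the word-count logic are invented for illustration, not this class's API): each micro-batch runs against the state left by the previous batch and yields updated state alongside its output.

```scala
// Toy model of incremental execution: each batch folds its input into the
// state produced by the previous batch, so counts accumulate across batches.
case class State(counts: Map[String, Long])

def runBatch(state: State, batch: Seq[String]): (State, Map[String, Long]) = {
  val updated = batch.foldLeft(state.counts) { (m, k) =>
    m.updated(k, m.getOrElse(k, 0L) + 1L)
  }
  (State(updated), updated)
}

val (s1, out1) = runBatch(State(Map.empty), Seq("a", "b", "a"))
val (s2, out2) = runBatch(s1, Seq("b"))
// "b" reaches 2 in the second batch only because state carried over
```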

Linear Supertypes
Logging, QueryExecution, AnyRef, Any

Instance Constructors

  1. new IncrementalExecution(sparkSession: SparkSession, logicalPlan: LogicalPlan, outputMode: OutputMode, checkpointLocation: String, queryId: UUID, runId: UUID, currentBatchId: Long, offsetSeqMetadata: OffsetSeqMetadata)

Value Members

  1. object debug

    A special namespace for commands that can be used to debug query execution.


    Definition Classes
    QueryExecution
  2. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  3. final def ##(): Int
    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  5. lazy val analyzed: LogicalPlan
    Definition Classes
    QueryExecution
  6. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  7. def assertAnalyzed(): Unit
    Definition Classes
    QueryExecution
  8. def assertSupported(): Unit

    No need to assert the plan is supported, as this check has already been done.

    Definition Classes
    IncrementalExecution → QueryExecution
  9. val checkpointLocation: String
  10. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  11. val currentBatchId: Long
  12. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  13. def equals(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  14. lazy val executedPlan: SparkPlan
    Definition Classes
    QueryExecution
  15. def explainString(mode: ExplainMode): String
    Definition Classes
    QueryExecution
  16. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  17. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  18. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  19. def initializeLogIfNecessary(isInterpreter: Boolean, silent: Boolean): Boolean
    Attributes
    protected
    Definition Classes
    Logging
  20. def initializeLogIfNecessary(isInterpreter: Boolean): Unit
    Attributes
    protected
    Definition Classes
    Logging
  21. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  22. def isTraceEnabled(): Boolean
    Attributes
    protected
    Definition Classes
    Logging
  23. def log: Logger
    Attributes
    protected
    Definition Classes
    Logging
  24. def logDebug(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  25. def logDebug(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  26. def logError(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  27. def logError(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  28. def logInfo(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  29. def logInfo(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  30. def logName: String
    Attributes
    protected
    Definition Classes
    Logging
  31. def logTrace(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  32. def logTrace(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  33. def logWarning(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  34. def logWarning(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  35. val logical: LogicalPlan
    Definition Classes
    QueryExecution
  36. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  37. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  38. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  39. def observedMetrics: Map[String, Row]

    Get the metrics observed during the execution of the query plan.


    Definition Classes
    QueryExecution
  40. val offsetSeqMetadata: OffsetSeqMetadata
  41. lazy val optimizedPlan: LogicalPlan

    See [SPARK-18339]: walk the optimized logical plan and replace CurrentBatchTimestamp with the desired literal.

    Definition Classes
    IncrementalExecution → QueryExecution
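
The rewrite described above can be modelled with a tiny expression tree. This is a simplified sketch, not Spark's Catalyst API (`Expr`, `LiteralMs`, and `GreaterThan` are invented names): the transform walks the tree and substitutes the current batch's timestamp wherever a `CurrentBatchTimestampExpr` placeholder appears.

```scala
// Minimal model of replacing a CurrentBatchTimestamp placeholder with the
// literal timestamp of the batch being executed.
sealed trait Expr
case object CurrentBatchTimestampExpr extends Expr
final case class LiteralMs(ms: Long) extends Expr
final case class GreaterThan(left: Expr, right: Expr) extends Expr

def bind(e: Expr, batchTimestampMs: Long): Expr = e match {
  case CurrentBatchTimestampExpr => LiteralMs(batchTimestampMs)
  case GreaterThan(l, r)         => GreaterThan(bind(l, batchTimestampMs), bind(r, batchTimestampMs))
  case other                     => other
}

// e.g. a predicate "currentBatchTimestamp > 0" bound at batch time 42000 ms
val bound = bind(GreaterThan(CurrentBatchTimestampExpr, LiteralMs(0L)), 42000L)
```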
  42. val outputMode: OutputMode
  43. val planner: SparkPlanner
    Definition Classes
    IncrementalExecution → QueryExecution
  44. def preparations: Seq[Rule[SparkPlan]]
    Definition Classes
    IncrementalExecution → QueryExecution
  45. val queryId: UUID
  46. val runId: UUID
  47. def shouldRunAnotherBatch(newMetadata: OffsetSeqMetadata): Boolean

    Returns whether MicroBatchExecution should run another batch, based on this execution and the updated metadata.
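
One situation this kind of check covers is a watermark advance: stateful operators may need one more batch to evict timed-out state even when no new data arrived. The following is a simplified, Spark-free model of that single criterion (the case class and its fields mirror the shape of OffsetSeqMetadata but are local stand-ins, not the real implementation):

```scala
// Simplified model: request another batch when the event-time watermark
// in the updated metadata has advanced past the current one.
case class OffsetSeqMeta(batchWatermarkMs: Long, batchTimestampMs: Long)

def shouldRunAnotherBatch(current: OffsetSeqMeta, updated: OffsetSeqMeta): Boolean =
  updated.batchWatermarkMs > current.batchWatermarkMs

val prev = OffsetSeqMeta(batchWatermarkMs = 1000L, batchTimestampMs = 5000L)
val next = OffsetSeqMeta(batchWatermarkMs = 2000L, batchTimestampMs = 6000L)
// watermark advanced 1000 → 2000, so another batch is warranted
```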

  48. def simpleString(formatted: Boolean): String
    Definition Classes
    QueryExecution
  49. def simpleString: String
    Definition Classes
    QueryExecution
  50. lazy val sparkPlan: SparkPlan
    Definition Classes
    QueryExecution
  51. val sparkSession: SparkSession
    Definition Classes
    QueryExecution
  52. val state: Rule[SparkPlan]

    Locates save/restore pairs surrounding aggregation.
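    For the located stateful operators to reload their state across restarts, each must be addressed by stable coordinates. A hedged sketch of that bookkeeping (StateStoreCoords and assignCoords are invented names; the real rule carries more information):

```scala
// Each stateful operator gets stable state-store coordinates, so a restarted
// query can find the state written by the previous run at the same location.
final case class StateStoreCoords(checkpointLocation: String, operatorId: Long, batchId: Long)

def assignCoords(numStatefulOperators: Int,
                 checkpointLocation: String,
                 batchId: Long): Seq[StateStoreCoords] =
  (0L until numStatefulOperators.toLong).map { opId =>
    StateStoreCoords(checkpointLocation, opId, batchId)
  }

val coords = assignCoords(2, "/tmp/ckpt", 7L)
```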

  53. def stringWithStats: String
    Definition Classes
    QueryExecution
  54. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  55. lazy val toRdd: RDD[InternalRow]

    Internal version of the RDD. Avoids copies and has no schema. Note for callers: Spark may apply various optimizations, including object reuse, which means a row is valid only for the iteration in which it is retrieved. Avoid storing rows and accessing them after the iteration (calling collect() is one known bad usage). If you want to store these rows in a collection, apply a converter or copy each row so that a new object is produced per iteration. Since QueryExecution is not a public class, end users are discouraged from using this; please use Dataset.rdd instead, where the conversion will be applied.

    Definition Classes
    QueryExecution
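
The object-reuse hazard described above can be demonstrated without Spark. In this self-contained illustration (`MutableRow` and `rowIterator` are invented stand-ins for the reused internal row), the iterator hands back the same mutable object each step, so storing references without copying yields wrong results:

```scala
// One mutable row object is reused for every element of the iterator,
// mimicking Spark's internal-row reuse.
final class MutableRow(var value: Int) {
  def copy(): MutableRow = new MutableRow(value)
}

def rowIterator(values: Seq[Int]): Iterator[MutableRow] = {
  val shared = new MutableRow(0) // single object reused across iterations
  values.iterator.map { v => shared.value = v; shared }
}

// Storing references: all entries alias the same object, which ends at 3.
val stored = rowIterator(Seq(1, 2, 3)).toList.map(_.value)

// Copying per iteration preserves each row's value, as the note advises.
val copied = rowIterator(Seq(1, 2, 3)).map(_.copy()).toList.map(_.value)
```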
  56. def toString(): String
    Definition Classes
    QueryExecution → AnyRef → Any
  57. val tracker: QueryPlanningTracker
    Definition Classes
    QueryExecution
  58. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  59. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  60. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  61. lazy val withCachedData: LogicalPlan
    Definition Classes
    QueryExecution
