object SQLMetrics

Linear Supertypes
AnyRef, Any

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##: Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  5. val cachedSQLAccumIdentifier: Some[String]
  6. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.CloneNotSupportedException]) @IntrinsicCandidate() @native()
  7. def createAverageMetric(sc: SparkContext, name: String): SQLMetric

    Create a metric to report average information (including min, med, max), such as average hash probe. Because average metrics are double values, this kind of metric should only be set with the SQLMetric.set method, not with methods such as SQLMetric.add. The initial (zero) values of these metrics will be excluded afterwards. A usage sketch for the create* helpers follows this member list.

  8. def createMetric(sc: SparkContext, name: String): SQLMetric
  9. def createNanoTimingMetric(sc: SparkContext, name: String, initValue: Long = -1): SQLMetric
  10. def createSizeMetric(sc: SparkContext, name: String, initValue: Long = -1): SQLMetric

    Create a metric to report size information (including total, min, med, max), such as data size or spill size.

  11. def createTimingMetric(sc: SparkContext, name: String, initValue: Long = -1): SQLMetric
  12. def createV2CustomMetric(sc: SparkContext, customMetric: CustomMetric): SQLMetric

    Create a metric to report a data source v2 custom metric. A usage sketch follows this member list.

  13. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  14. def equals(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef → Any
  15. final def getClass(): Class[_ <: AnyRef]
    Definition Classes
    AnyRef → Any
    Annotations
    @IntrinsicCandidate() @native()
  16. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @IntrinsicCandidate() @native()
  17. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  18. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  19. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @IntrinsicCandidate() @native()
  20. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @IntrinsicCandidate() @native()
  21. def postDriverMetricUpdates(sc: SparkContext, executionId: String, metrics: Seq[SQLMetric]): Unit

    Updates metrics based on the driver-side value. This is useful for certain metrics that are only updated on the driver, e.g. subquery execution time or number of files. A usage sketch follows this member list.

  22. def postDriverMetricsUpdatedByValue(sc: SparkContext, executionId: String, accumUpdates: Seq[(Long, Long)]): Unit
  23. final def synchronized[T0](arg0: => T0): T0
    Definition Classes
    AnyRef
  24. def toString(): String
    Definition Classes
    AnyRef → Any
  25. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException])
  26. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException]) @native()
  27. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException])
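
Usage Sketches

The sketches below are not part of the generated listing above; they illustrate typical use of this object's members. All of them assume an active SparkContext named sc (or take one as a parameter), and every metric name, key, and helper function shown is hypothetical. The first sketch wires a few create* helpers into an operator-style metric map; counter-, size-, and timing-style metrics accumulate with SQLMetric.add, while average metrics are written with SQLMetric.set, as the createAverageMetric documentation requires.

  import org.apache.spark.SparkContext
  import org.apache.spark.sql.execution.metric.{SQLMetric, SQLMetrics}

  // Hypothetical helper that builds a metric map for a physical operator.
  // The metric keys and display names are illustrative, not part of this API.
  def exampleMetrics(sc: SparkContext): Map[String, SQLMetric] = Map(
    "numOutputRows" -> SQLMetrics.createMetric(sc, "number of output rows"),
    "dataSize"      -> SQLMetrics.createSizeMetric(sc, "data size"),
    "buildTime"     -> SQLMetrics.createTimingMetric(sc, "time to build"),
    "avgHashProbe"  -> SQLMetrics.createAverageMetric(sc, "avg hash probes per key")
  )

  // Counter-, size- and timing-style metrics accumulate with add:
  //   metrics("numOutputRows").add(1)
  //   metrics("dataSize").add(bytesWritten)
  // Average metrics hold double values, so they are written with set,
  // as noted on createAverageMetric above:
  //   metrics("avgHashProbe").set(avgProbesPerKey)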
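
createV2CustomMetric pairs a connector-declared CustomMetric (org.apache.spark.sql.connector.metric.CustomMetric) with a driver-side SQLMetric registered under the same name. A minimal sketch, assuming that interface; the SkippedRowsMetric class and its name are hypothetical.

  import org.apache.spark.SparkContext
  import org.apache.spark.sql.connector.metric.CustomMetric
  import org.apache.spark.sql.execution.metric.{SQLMetric, SQLMetrics}

  // Hypothetical DSv2 custom metric counting rows a data source skipped.
  class SkippedRowsMetric extends CustomMetric {
    override def name(): String = "skippedRows"
    override def description(): String = "number of skipped rows"
    // Folds the per-task values reported by the source into a display string.
    override def aggregateTaskMetrics(taskMetrics: Array[Long]): String =
      taskMetrics.sum.toString
  }

  // The driver-side SQLMetric is created from the custom metric's declaration.
  def registerSkippedRows(sc: SparkContext): SQLMetric =
    SQLMetrics.createV2CustomMetric(sc, new SkippedRowsMetric)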
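
postDriverMetricUpdates is for metrics whose values are produced on the driver rather than in tasks, so after setting such a metric the driver must post the update itself. A minimal sketch, assuming the SQL execution id is available as the usual local property (SQLExecution.EXECUTION_ID_KEY); the helper and parameter names are hypothetical.

  import org.apache.spark.SparkContext
  import org.apache.spark.sql.execution.SQLExecution
  import org.apache.spark.sql.execution.metric.{SQLMetric, SQLMetrics}

  // Hypothetical driver-side update: record time spent collecting a subquery
  // result and push the new value to the SQL UI for the current execution.
  def reportDriverMetric(sc: SparkContext, collectTime: SQLMetric, elapsedMs: Long): Unit = {
    collectTime.add(elapsedMs)
    // While a SQL query runs, its execution id is carried as a local property.
    val executionId = sc.getLocalProperty(SQLExecution.EXECUTION_ID_KEY)
    SQLMetrics.postDriverMetricUpdates(sc, executionId, Seq(collectTime))
  }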

Deprecated Value Members

  1. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.Throwable]) @Deprecated
    Deprecated

    (Since version 9)
