
object SQLMetrics

Linear Supertypes
AnyRef, Any

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##: Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  5. val cachedSQLAccumIdentifier: Some[String]
  6. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.CloneNotSupportedException]) @native()
  7. def createAverageMetric(sc: SparkContext, name: String): SQLMetric

    Create a metric to report average information (including min, med, max), such as average hash probe. Because average metrics are double values, this kind of metric should be set only with the SQLMetric.set method, not with methods like SQLMetric.add. The initial values (zeros) of this metric are excluded afterwards, when the results are aggregated.
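    A minimal sketch of using an average metric (illustrative only; assumes a live SparkContext `sc`, and that the overload of SQLMetric.set accepting a Double is available in your Spark version):

    ```scala
    import org.apache.spark.SparkContext
    import org.apache.spark.sql.execution.metric.{SQLMetric, SQLMetrics}

    // Average metrics hold double values, so use SQLMetric.set rather than add.
    val avgProbeMetric: SQLMetric = SQLMetrics.createAverageMetric(sc, "avg hash probe")
    avgProbeMetric.set(1.5)  // set the per-task average directly
    ```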

  8. def createMetric(sc: SparkContext, name: String): SQLMetric
  9. def createNanoTimingMetric(sc: SparkContext, name: String): SQLMetric
  10. def createSizeMetric(sc: SparkContext, name: String): SQLMetric

    Create a metric to report the size information (including total, min, med, max) like data size, spill size, etc.
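    A sketch contrasting size and timing metrics, which are accumulated with SQLMetric.add (names such as `bytesRead` and `elapsedMs` are illustrative; assumes a live SparkContext `sc`):

    ```scala
    import org.apache.spark.sql.execution.metric.SQLMetrics

    val dataSize = SQLMetrics.createSizeMetric(sc, "data size")
    val scanTime = SQLMetrics.createTimingMetric(sc, "scan time")

    dataSize.add(bytesRead)  // accumulated bytes, rendered with total/min/med/max
    scanTime.add(elapsedMs)  // timing metrics are accumulated in milliseconds
    ```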

  11. def createTimingMetric(sc: SparkContext, name: String): SQLMetric
  12. def createV2CustomMetric(sc: SparkContext, customMetric: CustomMetric): SQLMetric

    Create a metric to report data source v2 custom metric.
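    A hedged sketch of wiring a hypothetical data source v2 custom metric through this method (the `RowsSkippedMetric` class is invented for illustration; assumes a live SparkContext `sc`):

    ```scala
    import org.apache.spark.sql.connector.metric.CustomMetric
    import org.apache.spark.sql.execution.metric.SQLMetrics

    // A hypothetical data source v2 custom metric.
    class RowsSkippedMetric extends CustomMetric {
      override def name(): String = "rowsSkipped"
      override def description(): String = "number of rows skipped"
      override def aggregateTaskMetrics(taskMetrics: Array[Long]): String =
        taskMetrics.sum.toString
    }

    val metric = SQLMetrics.createV2CustomMetric(sc, new RowsSkippedMetric)
    ```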

  13. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  14. def equals(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef → Any
  15. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.Throwable])
  16. final def getClass(): Class[_ <: AnyRef]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  17. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  18. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  19. def metricNeedsMax(metricsType: String): Boolean
  20. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  21. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  22. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  23. def postDriverMetricUpdates(sc: SparkContext, executionId: String, metrics: Seq[SQLMetric]): Unit

    Updates metrics based on the driver-side value. This is useful for metrics that are only updated on the driver, e.g. subquery execution time or number of files.
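    A sketch of posting a driver-only metric value (illustrative; assumes a live SparkContext `sc`, an elapsed time `elapsedMs`, and an `executionId` obtained from the current SQL execution's local properties):

    ```scala
    import org.apache.spark.sql.execution.metric.SQLMetrics

    // After computing a driver-only value (e.g. subquery execution time),
    // push it to the UI for the given execution.
    val subqueryTime = SQLMetrics.createTimingMetric(sc, "subquery time")
    subqueryTime.set(elapsedMs)
    SQLMetrics.postDriverMetricUpdates(sc, executionId, Seq(subqueryTime))
    ```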

  24. def postDriverMetricsUpdatedByValue(sc: SparkContext, executionId: String, accumUpdates: Seq[(Long, Long)]): Unit
  25. def stringValue(metricsType: String, values: Array[Long], maxMetrics: Array[Long]): String

    Defines how the final accumulator results from all tasks are aggregated and represented as a string for a SQL physical operator.

  26. final def synchronized[T0](arg0: => T0): T0
    Definition Classes
    AnyRef
  27. def toString(): String
    Definition Classes
    AnyRef → Any
  28. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException])
  29. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException])
  30. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException]) @native()
