object SQLMetrics
- By Inheritance
- SQLMetrics
- AnyRef
- Any
Value Members
- final def !=(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def ##: Int
- Definition Classes
- AnyRef → Any
- final def ==(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def asInstanceOf[T0]: T0
- Definition Classes
- Any
- val cachedSQLAccumIdentifier: Some[String]
- def clone(): AnyRef
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.CloneNotSupportedException]) @native()
- def createAverageMetric(sc: SparkContext, name: String): SQLMetric
Create a metric to report average information (including min, med, max), such as avg hash probe. Because average metrics are double values, metrics of this kind should only be set with the SQLMetric.set method rather than with other methods such as SQLMetric.add. The initial zero values of these metrics are excluded from the aggregation.
- def createMetric(sc: SparkContext, name: String): SQLMetric
- def createNanoTimingMetric(sc: SparkContext, name: String): SQLMetric
- def createSizeMetric(sc: SparkContext, name: String): SQLMetric
Create a metric to report the size information (including total, min, med, max) like data size, spill size, etc.
- def createTimingMetric(sc: SparkContext, name: String): SQLMetric
- def createV2CustomMetric(sc: SparkContext, customMetric: CustomMetric): SQLMetric
Create a metric to report data source v2 custom metric.
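A sketch of the data source v2 side, assuming the standard CustomMetric interface from the DSv2 connector API; the metric name and description here are hypothetical:

```scala
import org.apache.spark.SparkContext
import org.apache.spark.sql.connector.metric.CustomMetric
import org.apache.spark.sql.execution.metric.{SQLMetric, SQLMetrics}

// A hypothetical DSv2 custom metric: each task reports a long value, and
// aggregateTaskMetrics defines how the values are combined for display.
class BytesReadMetric extends CustomMetric {
  override def name(): String = "bytesRead"
  override def description(): String = "number of bytes read"
  override def aggregateTaskMetrics(taskMetrics: Array[Long]): String =
    taskMetrics.sum.toString
}

// The backing SQLMetric for the scan node would then be created as:
def backingMetric(sc: SparkContext): SQLMetric =
  SQLMetrics.createV2CustomMetric(sc, new BytesReadMetric)
```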
- final def eq(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- def equals(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef → Any
- def finalize(): Unit
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.Throwable])
- final def getClass(): Class[_ <: AnyRef]
- Definition Classes
- AnyRef → Any
- Annotations
- @native()
- def hashCode(): Int
- Definition Classes
- AnyRef → Any
- Annotations
- @native()
- final def isInstanceOf[T0]: Boolean
- Definition Classes
- Any
- def metricNeedsMax(metricsType: String): Boolean
- final def ne(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- final def notify(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
- final def notifyAll(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
- def postDriverMetricUpdates(sc: SparkContext, executionId: String, metrics: Seq[SQLMetric]): Unit
Updates metrics based on the driver-side value. This is useful for certain metrics that are only updated on the driver, e.g. subquery execution time or number of files.
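A sketch of the usual driver-side update pattern, assuming `sc` and a previously created `filesMetric` exist; the execution ID is read from the local property that SQL execution sets:

```scala
import org.apache.spark.SparkContext
import org.apache.spark.sql.execution.SQLExecution
import org.apache.spark.sql.execution.metric.{SQLMetric, SQLMetrics}

// Hypothetical helper: set a metric that is only known on the driver
// (e.g. number of files scanned) and push it to the SQL UI.
def reportDriverMetric(sc: SparkContext, filesMetric: SQLMetric, numFiles: Long): Unit = {
  filesMetric.set(numFiles)
  // The current SQL execution ID is carried as a thread-local property.
  val executionId = sc.getLocalProperty(SQLExecution.EXECUTION_ID_KEY)
  SQLMetrics.postDriverMetricUpdates(sc, executionId, Seq(filesMetric))
}
```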
- def postDriverMetricsUpdatedByValue(sc: SparkContext, executionId: String, accumUpdates: Seq[(Long, Long)]): Unit
- def stringValue(metricsType: String, values: Array[Long], maxMetrics: Array[Long]): String
A function that defines how the final accumulator results are aggregated across all tasks, and represents the result as a string for a SQL physical operator.
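A sketch of calling this directly, assuming the metric type string matches one of the internal type constants (e.g. "size") and that no per-stage max information is available:

```scala
import org.apache.spark.sql.execution.metric.SQLMetrics

// Aggregate three task-level size values into the display string shown in
// the SQL UI; an empty maxMetrics array means no (stage, task) max info.
val display = SQLMetrics.stringValue("size", Array(1024L, 2048L, 4096L), Array.empty[Long])
// `display` combines the total with (min, med, max), formatted as byte sizes.
```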
- final def synchronized[T0](arg0: => T0): T0
- Definition Classes
- AnyRef
- def toString(): String
- Definition Classes
- AnyRef → Any
- final def wait(): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
- final def wait(arg0: Long, arg1: Int): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
- final def wait(arg0: Long): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException]) @native()