org.apache.spark.sql.catalyst.util

package util

Linear Supertypes
AnyRef, Any

Type Members

  1. class AbstractScalaRowIterator[T] extends Iterator[T]

    Shim to allow us to implement scala.Iterator in Java.

  2. class ArrayBasedMapData extends MapData

  3. abstract class ArrayData extends SpecializedGetters with Serializable

  4. class CaseInsensitiveMap[T] extends Map[String, T] with Serializable

    Builds a map in which keys are case insensitive (see the usage sketch after this list).

  5. class GenericArrayData extends ArrayData

  6. abstract class MapData extends Serializable

    The internal data representation for the map type in Spark SQL (see the sketch after this list).

  7. class QuantileSummaries extends Serializable

    Helper class to compute an approximate quantile summary (see the sketch after this list).

  8. class StringKeyHashMap[T] extends AnyRef
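
Example: CaseInsensitiveMap

A minimal usage sketch. It assumes the companion object's apply(Map[String, T]) factory; because the class extends Map[String, T], the usual Map operations are available, but key lookup ignores case.

  import org.apache.spark.sql.catalyst.util.CaseInsensitiveMap

  // Wrap an ordinary Map; the companion-object factory is assumed here.
  val options = CaseInsensitiveMap(Map("Path" -> "/tmp/data", "Header" -> "true"))

  assert(options.get("path") == Some("/tmp/data"))  // key case is ignored on lookup
  assert(options.get("HEADER") == Some("true"))
  assert(options.contains("pAtH"))

  // Since CaseInsensitiveMap extends Map[String, T], ordinary Map operations work too.
  val withMode = options + ("mode" -> "FAILFAST")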
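
Example: ArrayData and MapData

A sketch of how the internal map representation fits together, assuming the (keyArray, valueArray) constructor of ArrayBasedMapData and the Array[Any] constructor of GenericArrayData. Internally, strings are stored as UTF8String and a map is a pair of equally sized arrays.

  import org.apache.spark.sql.catalyst.util.{ArrayBasedMapData, GenericArrayData}
  import org.apache.spark.unsafe.types.UTF8String

  // Keys and values live in two parallel ArrayData instances.
  val keys   = new GenericArrayData(Array[Any](UTF8String.fromString("a"), UTF8String.fromString("b")))
  val values = new GenericArrayData(Array[Any](1, 2))

  val map = new ArrayBasedMapData(keys, values)

  assert(map.numElements() == 2)
  assert(map.keyArray().getUTF8String(0).toString == "a")
  assert(map.valueArray().getInt(1) == 2)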
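
Example: QuantileSummaries

A sketch of the insert/compress/query cycle. The (compressThreshold, relativeError) constructor and the return type of query vary between Spark versions, so treat the exact signatures as assumptions.

  import org.apache.spark.sql.catalyst.util.QuantileSummaries

  // Start from an empty summary with a compression threshold and a target relative error.
  val empty = new QuantileSummaries(1000, 0.01)

  // insert returns an updated summary, so fold the observations through it.
  val summary = (1 to 100000).foldLeft(empty)((s, x) => s.insert(x.toDouble))

  // Compress before querying; query(0.5) yields an approximate median.
  val compressed = summary.compress()
  println(s"approximate median: ${compressed.query(0.5)}")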

Value Members

  1. object ArrayBasedMapData extends Serializable

  2. object ArrayData extends Serializable

  3. object CaseInsensitiveMap extends Serializable

  4. object CompressionCodecs

  5. object DateTimeUtils

    Helper functions for converting between internal and external date and time representations (see the sketch after this list).

  6. object NumberConverter

  7. object ParseModes

  8. object QuantileSummaries extends Serializable

  9. object StringKeyHashMap

    Builds a map with String keys, supporting either case-sensitive or case-insensitive key lookup (see the sketch after this list).

  10. object StringUtils

  11. object TypeUtils

    Helper functions to check for valid data types (see the sketch after this list).

  12. def benchmark[A](f: ⇒ A): A

  13. def fileToString(file: File, encoding: String = "UTF-8"): String

  14. def quietly[A](f: ⇒ A): A

    Silences output to stderr and stdout for the duration of f (see the usage sketch after this list).

  15. def quoteIdentifier(name: String): String

  16. def resourceToBytes(resource: String, classLoader: ClassLoader = Utils.getSparkClassLoader): Array[Byte]

  17. def resourceToString(resource: String, encoding: String = "UTF-8", classLoader: ClassLoader = Utils.getSparkClassLoader): String

  18. def sideBySide(left: Seq[String], right: Seq[String]): Seq[String]

  19. def sideBySide(left: String, right: String): Seq[String]

  20. def stackTraceToString(t: Throwable): String

  21. def stringOrNull(a: AnyRef): String

  22. def stringToFile(file: File, str: String): File

  23. def toPrettySQL(e: Expression): String

  24. def usePrettyExpression(e: Expression): Expression
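
Example: DateTimeUtils

A sketch of converting between external java.sql types and Catalyst's internal representation (days since the epoch for dates, microseconds since the epoch for timestamps). The converter names are taken from typical Spark sources and may differ in your version.

  import java.sql.{Date, Timestamp}
  import org.apache.spark.sql.catalyst.util.DateTimeUtils

  // Dates are stored internally as an Int number of days since 1970-01-01.
  val days: Int = DateTimeUtils.fromJavaDate(Date.valueOf("2017-01-15"))
  val backToDate: Date = DateTimeUtils.toJavaDate(days)

  // Timestamps are stored internally as a Long number of microseconds since the epoch.
  val micros: Long = DateTimeUtils.fromJavaTimestamp(Timestamp.valueOf("2017-01-15 10:30:00"))
  val backToTimestamp: Timestamp = DateTimeUtils.toJavaTimestamp(micros)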
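
Example: StringKeyHashMap

A sketch assuming the companion factory that takes a case-sensitivity flag and the usual put/get accessors.

  import org.apache.spark.sql.catalyst.util.StringKeyHashMap

  // false = case-insensitive keys (the meaning of the flag is assumed here).
  val functions = StringKeyHashMap[Int](false)

  functions.put("Upper", 1)
  functions.put("lower", 2)

  assert(functions.get("UPPER") == Some(1))  // lookup ignores key case
  assert(functions.get("Lower") == Some(2))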
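
Example: TypeUtils

A sketch assuming the checkForNumericExpr helper and the TypeCheckResult it returns; both names are taken from typical Spark sources.

  import org.apache.spark.sql.catalyst.util.TypeUtils
  import org.apache.spark.sql.types.{IntegerType, StringType}

  // The result reports whether the type is acceptable for the named operation.
  val ok  = TypeUtils.checkForNumericExpr(IntegerType, "example sum")
  val bad = TypeUtils.checkForNumericExpr(StringType, "example sum")

  assert(ok.isSuccess)    // IntegerType is numeric
  assert(!bad.isSuccess)  // StringType is not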
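
Example: quietly

A usage sketch for the package-level quietly helper: it suppresses stdout/stderr while the block runs, restores them afterwards, and returns the block's result.

  import org.apache.spark.sql.catalyst.util._

  val result: Int = quietly {
    println("this line is swallowed")
    System.err.println("so is this one")
    42
  }
  assert(result == 42)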
