Package org.apache.spark.sql.catalyst.util

package util

Linear Supertypes
AnyRef, Any

Type Members

  1. class ArrayBasedMapData extends MapData

  2. abstract class ArrayData extends SpecializedGetters with Serializable

  3. class GenericArrayData extends ArrayData

  4. abstract class MapData extends Serializable

    This is an internal data representation for map type in Spark SQL. It should not implement equals and hashCode because the type cannot be used as join keys, grouping keys, or in equality tests. See SPARK-9415 and PR#13847 for the discussions. A construction sketch follows this list.

  5. class StringKeyHashMap[T] extends AnyRef

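The sketch below is a minimal, hedged example of how these internal types fit together, assuming the ArrayBasedMapData(keyArray, valueArray) and GenericArrayData(Array[Any]) constructors; these are internal APIs and may change between Spark versions. A MapData is backed by two parallel ArrayData instances, one for keys and one for values.

    import org.apache.spark.sql.catalyst.util.{ArrayBasedMapData, GenericArrayData, MapData}

    // Keys and values live in two parallel arrays, matched up by position.
    val keys   = new GenericArrayData(Array[Any](1, 2, 3))
    val values = new GenericArrayData(Array[Any](10L, 20L, 30L))
    val map: MapData = new ArrayBasedMapData(keys, values)

    assert(map.numElements() == 3)
    assert(map.keyArray().getInt(0) == 1)      // reads go through SpecializedGetters
    assert(map.valueArray().getLong(2) == 30L)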

Value Members

  1. object ArrayBasedMapData extends Serializable

  2. object DateTimeUtils

    Helper functions for converting between internal and external date and time representations. Dates are exposed externally as java.sql.Date and are represented internally as the number of days since the Unix epoch (1970-01-01). Timestamps are exposed externally as java.sql.Timestamp and are stored internally as longs, which are capable of storing timestamps with 100 nanosecond precision. A round-trip sketch follows this list.

  3. object NumberConverter

  4. object StringKeyHashMap

    Builds a map keyed by String, with support for either case-sensitive or case-insensitive key lookup. A usage sketch follows this list.

  5. object StringUtils

  6. object TypeUtils

    Helper functions to check for valid data types.

  7. def benchmark[A](f: ⇒ A): A

  8. def fileToString(file: File, encoding: String = "UTF-8"): String

  9. def quietly[A](f: ⇒ A): A

    Silences output to stderr and stdout for the duration of f. A usage sketch follows this list.

  10. def quoteIdentifier(name: String): String

  11. def resourceToBytes(resource: String, classLoader: ClassLoader = Utils.getSparkClassLoader): Array[Byte]

  12. def resourceToString(resource: String, encoding: String = "UTF-8", classLoader: ClassLoader = Utils.getSparkClassLoader): String

  13. def sideBySide(left: Seq[String], right: Seq[String]): Seq[String]

  14. def sideBySide(left: String, right: String): Seq[String]

  15. def stackTraceToString(t: Throwable): String

  16. def stringOrNull(a: AnyRef): String

  17. def stringToFile(file: File, str: String): File

  18. def toPrettySQL(e: Expression): String

  19. def usePrettyExpression(e: Expression): Expression

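A round-trip sketch for DateTimeUtils, assuming the fromJavaDate/toJavaDate and fromJavaTimestamp/toJavaTimestamp helpers; these are internal APIs and may differ between Spark versions.

    import java.sql.{Date, Timestamp}
    import org.apache.spark.sql.catalyst.util.DateTimeUtils

    // Dates are carried internally as an Int count of days since 1970-01-01.
    val days = DateTimeUtils.fromJavaDate(Date.valueOf("1970-01-11"))
    assert(days == 10)
    assert(DateTimeUtils.toJavaDate(days) == Date.valueOf("1970-01-11"))

    // Timestamps are carried internally as a Long and converted back to
    // java.sql.Timestamp when handed to external code.
    val internal = DateTimeUtils.fromJavaTimestamp(Timestamp.valueOf("1970-01-01 00:00:01"))
    assert(DateTimeUtils.toJavaTimestamp(internal) == Timestamp.valueOf("1970-01-01 00:00:01"))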
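
A usage sketch for StringKeyHashMap, assuming the companion object's apply(caseSensitive) factory and the put/get methods on the class; this is an internal API.

    import org.apache.spark.sql.catalyst.util.StringKeyHashMap

    val registry = StringKeyHashMap[Int](false) // false => keys are matched case-insensitively
    registry.put("Foo", 1)
    assert(registry.get("foo") == Some(1))
    assert(registry.get("FOO") == Some(1))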
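
A usage sketch for quietly, assumed here to discard anything written to System.out and System.err while the block runs and to return the block's value unchanged.

    import org.apache.spark.sql.catalyst.util.quietly

    val answer = quietly {
      System.out.println("suppressed")     // discarded for the duration of the block
      System.err.println("also suppressed")
      21 * 2
    }
    assert(answer == 42)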
