org.apache.spark.sql.catalyst.catalog

CatalogTable

case class CatalogTable(identifier: TableIdentifier, tableType: CatalogTableType, storage: CatalogStorageFormat, schema: Seq[CatalogColumn], partitionColumnNames: Seq[String] = Seq.empty, sortColumnNames: Seq[String] = Seq.empty, bucketColumnNames: Seq[String] = Seq.empty, numBuckets: Int = -1, owner: String = "", createTime: Long = ..., lastAccessTime: Long = -1, properties: Map[String, String] = ..., viewOriginalText: Option[String] = None, viewText: Option[String] = None, comment: Option[String] = None, unsupportedFeatures: Seq[String] = Seq.empty) extends Product with Serializable

A table defined in the catalog.

Note that Hive's metastore also tracks skewed columns. We should consider adding that in the future once we have a better understanding of how we want to handle skewed columns.

unsupportedFeatures

A list of string descriptions of features that are used by the underlying table but not yet supported by Spark SQL.
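To illustrate how a catalog table entry is assembled, the sketch below uses simplified stand-in case classes that only mirror the shape of the signature above; the real catalyst types carry more fields (input/output formats, serde properties, bucketing, and so on), so treat every name here as illustrative:

```scala
// Simplified stand-ins for the real catalyst types, for illustration only.
case class TableIdentifier(table: String, database: Option[String] = None)
case class CatalogStorageFormat(locationUri: Option[String] = None,
                                serde: Option[String] = None)
case class CatalogColumn(name: String, dataType: String)
sealed trait CatalogTableType
case object ManagedTable extends CatalogTableType

case class CatalogTable(
    identifier: TableIdentifier,
    tableType: CatalogTableType,
    storage: CatalogStorageFormat,
    schema: Seq[CatalogColumn],
    partitionColumnNames: Seq[String] = Seq.empty,
    unsupportedFeatures: Seq[String] = Seq.empty)

// A managed table "analytics.events" partitioned by day.
val table = CatalogTable(
  identifier = TableIdentifier("events", Some("analytics")),
  tableType = ManagedTable,
  storage = CatalogStorageFormat(locationUri = Some("/warehouse/events")),
  schema = Seq(CatalogColumn("id", "bigint"), CatalogColumn("day", "string")),
  partitionColumnNames = Seq("day"))
```

Because the class is a plain `case class`, defaults cover the optional metadata and callers name only the fields they care about.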

Linear Supertypes
Serializable, Serializable, Product, Equals, AnyRef, Any

Instance Constructors

  1. new CatalogTable(identifier: TableIdentifier, tableType: CatalogTableType, storage: CatalogStorageFormat, schema: Seq[CatalogColumn], partitionColumnNames: Seq[String] = Seq.empty, sortColumnNames: Seq[String] = Seq.empty, bucketColumnNames: Seq[String] = Seq.empty, numBuckets: Int = -1, owner: String = "", createTime: Long = ..., lastAccessTime: Long = -1, properties: Map[String, String] = ..., viewOriginalText: Option[String] = None, viewText: Option[String] = None, comment: Option[String] = None, unsupportedFeatures: Seq[String] = Seq.empty)

    unsupportedFeatures

    A list of string descriptions of features that are used by the underlying table but not yet supported by Spark SQL.

Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  7. val bucketColumnNames: Seq[String]

  8. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  9. val comment: Option[String]

  10. val createTime: Long

  11. def database: String

    Return the database this table was specified to belong to, assuming it exists.

  12. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  13. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  14. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  15. val identifier: TableIdentifier

  16. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  17. val lastAccessTime: Long

  18. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  19. final def notify(): Unit

    Definition Classes
    AnyRef
  20. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  21. val numBuckets: Int

  22. val owner: String

  23. val partitionColumnNames: Seq[String]

  24. def partitionColumns: Seq[CatalogColumn]

    Columns this table is partitioned by.

  25. val properties: Map[String, String]

  26. def qualifiedName: String

    Return the fully qualified name of this table, assuming the database was specified.
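The derived members `database`, `qualifiedName`, and `partitionColumns` are thin views over the constructor fields. A hedged sketch of the plausible logic, using reduced stand-in types (the names, fields, and error message are illustrative, not the real implementation):

```scala
case class Column(name: String, dataType: String)

case class Tbl(table: String,
               db: Option[String],
               schema: Seq[Column],
               partitionColumnNames: Seq[String]) {
  // Sketch: fails when no database was specified for the table.
  def database: String =
    db.getOrElse(sys.error(s"table $table did not specify a database"))
  // Sketch: "db.table", assuming the database was specified.
  def qualifiedName: String = s"$database.$table"
  // Sketch: resolve the partition column names against the full schema.
  def partitionColumns: Seq[Column] =
    schema.filter(c => partitionColumnNames.contains(c.name))
}

val tbl = Tbl("events", Some("analytics"),
  Seq(Column("id", "bigint"), Column("day", "string")), Seq("day"))
```

Note that in this sketch `qualifiedName` goes through `database`, so both fail together when the database is absent, matching the "assuming the database was specified" caveat above.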

  27. val schema: Seq[CatalogColumn]

  28. val sortColumnNames: Seq[String]

  29. val storage: CatalogStorageFormat

  30. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  31. val tableType: CatalogTableType

  32. def toString(): String

    Definition Classes
    CatalogTable → AnyRef → Any
  33. val unsupportedFeatures: Seq[String]

    A list of string descriptions of features that are used by the underlying table but not yet supported by Spark SQL.

  34. val viewOriginalText: Option[String]

  35. val viewText: Option[String]

  36. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  37. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  38. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  39. def withNewStorage(locationUri: Option[String] = storage.locationUri, inputFormat: Option[String] = storage.inputFormat, outputFormat: Option[String] = storage.outputFormat, compressed: Boolean = false, serde: Option[String] = storage.serde, serdeProperties: Map[String, String] = storage.serdeProperties): CatalogTable

    Syntactic sugar to update a field in storage.
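This sugar amounts to a `copy` of the nested storage value: every parameter defaults to the current storage field, so callers override only what changes. A minimal sketch with a reduced two-field storage format (an assumption; the real `CatalogStorageFormat` has more fields):

```scala
// Reduced stand-ins: the real CatalogStorageFormat carries more fields.
case class StorageFormat(locationUri: Option[String] = None,
                         serde: Option[String] = None)

case class Table(name: String, storage: StorageFormat) {
  // Sketch: each parameter defaults to the current storage value,
  // so callers override only the fields they want to change.
  def withNewStorage(locationUri: Option[String] = storage.locationUri,
                     serde: Option[String] = storage.serde): Table =
    copy(storage = StorageFormat(locationUri, serde))
}

val t = Table("events", StorageFormat(Some("/warehouse/old"), Some("my-serde")))
val moved = t.withNewStorage(locationUri = Some("/warehouse/new"))
```

The original value is untouched: `t.storage` still points at the old location, while `moved` carries the new one with the serde preserved.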
