Class CatalogTable

Package org.apache.spark.sql.catalyst.catalog

case class CatalogTable(
    identifier: TableIdentifier,
    tableType: CatalogTableType,
    storage: CatalogStorageFormat,
    schema: Seq[CatalogColumn],
    partitionColumnNames: Seq[String] = Seq.empty,
    sortColumnNames: Seq[String] = Seq.empty,
    bucketColumnNames: Seq[String] = Seq.empty,
    numBuckets: Int = 1,
    owner: String = "",
    createTime: Long = System.currentTimeMillis,
    lastAccessTime: Long = -1,
    properties: Map[String, String] = Map.empty,
    viewOriginalText: Option[String] = None,
    viewText: Option[String] = None,
    comment: Option[String] = None,
    unsupportedFeatures: Seq[String] = Seq.empty)
  extends Product with Serializable

A table defined in the catalog.

Note that Hive's metastore also tracks skewed columns. We should consider adding that in the future once we have a better understanding of how we want to handle skewed columns.

unsupportedFeatures: a list of string descriptions of features that are used by the underlying table but are not yet supported by Spark SQL.
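
For illustration, here is a minimal construction sketch. It assumes the Spark 2.0-era Catalyst API documented on this page, plus CatalogColumn(name, dataType) and the CatalogTableType.MANAGED constant from the same package; the database, table, and column names are made up:

    import org.apache.spark.sql.catalyst.TableIdentifier
    import org.apache.spark.sql.catalyst.catalog.{CatalogColumn, CatalogStorageFormat, CatalogTable, CatalogTableType}

    // A managed table "db.events" with one data column ("id") and one
    // partition column ("ds"); all other fields keep their defaults.
    val table = CatalogTable(
      identifier = TableIdentifier("events", Some("db")),
      tableType = CatalogTableType.MANAGED,
      storage = CatalogStorageFormat(
        locationUri = None,
        inputFormat = None,
        outputFormat = None,
        serde = None,
        compressed = false,
        serdeProperties = Map.empty),
      schema = Seq(
        CatalogColumn("id", "bigint"),
        CatalogColumn("ds", "string")),
      partitionColumnNames = Seq("ds"))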

Linear Supertypes
Serializable, Serializable, Product, Equals, AnyRef, Any
Ordering
  1. Alphabetic
  2. By Inheritance
Inherited
  1. CatalogTable
  2. Serializable
  3. Serializable
  4. Product
  5. Equals
  6. AnyRef
  7. Any
  1. Hide All
  2. Show All
Visibility
  1. Public
  2. All

Instance Constructors

  1. new CatalogTable(
         identifier: TableIdentifier,
         tableType: CatalogTableType,
         storage: CatalogStorageFormat,
         schema: Seq[CatalogColumn],
         partitionColumnNames: Seq[String] = Seq.empty,
         sortColumnNames: Seq[String] = Seq.empty,
         bucketColumnNames: Seq[String] = Seq.empty,
         numBuckets: Int = 1,
         owner: String = "",
         createTime: Long = System.currentTimeMillis,
         lastAccessTime: Long = -1,
         properties: Map[String, String] = Map.empty,
         viewOriginalText: Option[String] = None,
         viewText: Option[String] = None,
         comment: Option[String] = None,
         unsupportedFeatures: Seq[String] = Seq.empty)

     unsupportedFeatures: a list of string descriptions of features that are used by the underlying table but are not yet supported by Spark SQL.

Value Members

  1. final def !=(arg0: Any): Boolean
     Definition Classes: AnyRef → Any
  2. final def ##(): Int
     Definition Classes: AnyRef → Any
  3. final def ==(arg0: Any): Boolean
     Definition Classes: AnyRef → Any
  4. final def asInstanceOf[T0]: T0
     Definition Classes: Any
  5. val bucketColumnNames: Seq[String]
  6. def clone(): AnyRef
     Attributes: protected[java.lang]
     Definition Classes: AnyRef
     Annotations: @throws( ... )
  7. val comment: Option[String]
  8. val createTime: Long
  9. def database: String

    Permalink

    Return the database this table was specified to belong to, assuming it exists.

  10. final def eq(arg0: AnyRef): Boolean
      Definition Classes: AnyRef
  11. def finalize(): Unit
      Attributes: protected[java.lang]
      Definition Classes: AnyRef
      Annotations: @throws( classOf[java.lang.Throwable] )
  12. final def getClass(): Class[_]
      Definition Classes: AnyRef → Any
  13. val identifier: TableIdentifier
  14. final def isInstanceOf[T0]: Boolean
      Definition Classes: Any
  15. val lastAccessTime: Long
  16. final def ne(arg0: AnyRef): Boolean
      Definition Classes: AnyRef
  17. final def notify(): Unit
      Definition Classes: AnyRef
  18. final def notifyAll(): Unit
      Definition Classes: AnyRef
  19. val numBuckets: Int
  20. val owner: String
  21. val partitionColumnNames: Seq[String]
  22. def partitionColumns: Seq[CatalogColumn]
      Columns this table is partitioned by. (See the usage example after this list.)
  23. val properties: Map[String, String]
  24. def qualifiedName: String
      Return the fully qualified name of this table, assuming the database was specified. (See the usage example after this list.)
  25. val schema: Seq[CatalogColumn]
  26. val sortColumnNames: Seq[String]
  27. val storage: CatalogStorageFormat
  28. final def synchronized[T0](arg0: ⇒ T0): T0
      Definition Classes: AnyRef
  29. val tableType: CatalogTableType
  30. def toString(): String
      Definition Classes: CatalogTable → AnyRef → Any
  31. val unsupportedFeatures: Seq[String]
      A list of string descriptions of features that are used by the underlying table but are not yet supported by Spark SQL.
  32. val viewOriginalText: Option[String]
  33. val viewText: Option[String]
  34. final def wait(): Unit
      Definition Classes: AnyRef
      Annotations: @throws( ... )
  35. final def wait(arg0: Long, arg1: Int): Unit
      Definition Classes: AnyRef
      Annotations: @throws( ... )
  36. final def wait(arg0: Long): Unit
      Definition Classes: AnyRef
      Annotations: @throws( ... )
  37. def withNewStorage(
          locationUri: Option[String] = storage.locationUri,
          inputFormat: Option[String] = storage.inputFormat,
          outputFormat: Option[String] = storage.outputFormat,
          compressed: Boolean = false,
          serde: Option[String] = storage.serde,
          serdeProperties: Map[String, String] = storage.serdeProperties): CatalogTable

      Syntactic sugar to update one or more fields in storage. (See the example after this list.)
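
A usage sketch for the derived members database, qualifiedName, and partitionColumns, continuing with the hypothetical "db.events" table from the construction example near the top of this page. Note that database (and therefore qualifiedName) assumes identifier.database is set and fails otherwise:

    table.database          // "db"
    table.qualifiedName     // "db.events"
    table.partitionColumns  // the CatalogColumn entries of schema that are
                            // named in partitionColumnNames; here, just "ds"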
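
And a sketch of withNewStorage, which returns a copy of the table with the given storage fields replaced; the HDFS path is made up. As the signature above shows, compressed defaults to false here rather than to storage.compressed, so pass it explicitly if it must be preserved:

    // Copy the table, changing only the storage location.
    val relocated = table.withNewStorage(
      locationUri = Some("hdfs://namenode/warehouse/db.db/events"))

    relocated.storage.locationUri  // Some("hdfs://namenode/warehouse/db.db/events")
    relocated.identifier           // unchanged: TableIdentifier("events", Some("db"))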
