Package org.apache.spark.sql.catalyst.catalog

package catalog

Type Members

  1. case class BucketSpec(numBuckets: Int, bucketColumnNames: Seq[String], sortColumnNames: Seq[String]) extends Product with Serializable

    A container for bucketing information. Bucketing is a technology for decomposing data sets into more manageable parts, and the number of buckets is fixed so it does not fluctuate with data.

    numBuckets

    number of buckets.

    bucketColumnNames

    the names of the columns used to generate the bucket id.

    sortColumnNames

    the names of the columns used to sort data in each bucket.
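
    For illustration, a spec that hashes rows into 8 buckets by a user_id column and sorts each bucket by an event_time column (both column names are hypothetical) could be built as follows:

      import org.apache.spark.sql.catalyst.catalog.BucketSpec

      // Hypothetical spec: 8 buckets keyed by user_id, rows sorted by event_time within each bucket.
      val bucketSpec = BucketSpec(
        numBuckets = 8,
        bucketColumnNames = Seq("user_id"),
        sortColumnNames = Seq("event_time"))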

  2. case class CatalogDatabase(name: String, description: String, locationUri: String, properties: Map[String, String]) extends Product with Serializable

    A database defined in the catalog.

  3. case class CatalogFunction(identifier: FunctionIdentifier, className: String, resources: Seq[FunctionResource]) extends Product with Serializable

    A function defined in the catalog.

    identifier

    name of the function

    className

    fully qualified class name, e.g. "org.apache.spark.util.MyFunc"

    resources

    resource types and URIs used by the function
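
    As a sketch, a function entry backed by a jar might look like the following; the identifier, class name, and jar path are illustrative, and JarResource is assumed to be one of the FunctionResourceType values defined in this package:

      import org.apache.spark.sql.catalyst.FunctionIdentifier
      import org.apache.spark.sql.catalyst.catalog.{CatalogFunction, FunctionResource, JarResource}

      // Hypothetical UDF in database "mydb", implemented by a class shipped in a jar.
      val func = CatalogFunction(
        identifier = FunctionIdentifier("my_func", Some("mydb")),
        className = "org.example.udf.MyFunc",
        resources = Seq(FunctionResource(JarResource, "hdfs:///libs/my-udfs.jar")))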

  4. trait CatalogRelation extends AnyRef

    An interface that is implemented by logical plans to return the underlying catalog table. If we can consolidate SimpleCatalogRelation and MetastoreRelation in the future, we should probably remove this interface.

  5. case class CatalogStorageFormat(locationUri: Option[String], inputFormat: Option[String], outputFormat: Option[String], serde: Option[String], compressed: Boolean, properties: Map[String, String]) extends Product with Serializable

    Storage format, used to describe how a partition or a table is stored.
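
    For example, a plain text, Hive-style storage description might be built roughly like this; the format and serde class names are the standard Hadoop/Hive ones and are shown only for illustration:

      import org.apache.spark.sql.catalyst.catalog.CatalogStorageFormat

      // Illustrative storage description for an uncompressed, text-backed table or partition.
      val storage = CatalogStorageFormat(
        locationUri = Some("hdfs:///warehouse/mydb.db/events"),
        inputFormat = Some("org.apache.hadoop.mapred.TextInputFormat"),
        outputFormat = Some("org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat"),
        serde = Some("org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe"),
        compressed = false,
        properties = Map.empty)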

  6. case class CatalogTable(identifier: TableIdentifier, tableType: CatalogTableType, storage: CatalogStorageFormat, schema: StructType, provider: Option[String] = None, partitionColumnNames: Seq[String] = Seq.empty, bucketSpec: Option[BucketSpec] = None, owner: String = "", createTime: Long = System.currentTimeMillis, lastAccessTime: Long = 1, properties: Map[String, String] = Map.empty, stats: Option[Statistics] = None, viewOriginalText: Option[String] = None, viewText: Option[String] = None, comment: Option[String] = None, unsupportedFeatures: Seq[String] = Seq.empty, tracksPartitionsInCatalog: Boolean = false) extends Product with Serializable

    A table defined in the catalog.

    Note that Hive's metastore also tracks skewed columns. We should consider adding that in the future once we have a better understanding of how we want to handle skewed columns.

    provider

    the name of the data source provider for this table, e.g. parquet, json, etc. It can be None if this table is a view, and should be "hive" for Hive serde tables.

    unsupportedFeatures

    a list of string descriptions of features that are used by the underlying table but are not yet supported by Spark SQL.

    tracksPartitionsInCatalog

    whether this table's partition metadata is stored in the catalog. If false, it is inferred automatically based on file structure.
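
    Putting the pieces together, a minimal managed Parquet table definition might look like the sketch below; CatalogTableType.MANAGED is assumed to be one of the predefined table types, and the database, table, and column names are hypothetical:

      import org.apache.spark.sql.catalyst.TableIdentifier
      import org.apache.spark.sql.catalyst.catalog.{CatalogStorageFormat, CatalogTable, CatalogTableType}
      import org.apache.spark.sql.types.{LongType, StringType, StructType}

      // Data source tables typically only need a location; the other storage fields stay unset.
      val storage = CatalogStorageFormat(
        locationUri = Some("hdfs:///warehouse/mydb.db/events"),
        inputFormat = None, outputFormat = None, serde = None,
        compressed = false, properties = Map.empty)

      val table = CatalogTable(
        identifier = TableIdentifier("events", Some("mydb")),
        tableType = CatalogTableType.MANAGED,
        storage = storage,
        schema = new StructType().add("id", LongType).add("country", StringType),
        provider = Some("parquet"),
        partitionColumnNames = Seq("country"))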

  7. case class CatalogTablePartition(spec: TablePartitionSpec, storage: CatalogStorageFormat, parameters: Map[String, String] = Map.empty) extends Product with Serializable

    A partition (Hive style) defined in the catalog.

    spec

    partition spec values indexed by column name

    storage

    storage format of the partition

    parameters

    parameters for the partition, for example statistics.
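
    For instance, a single partition of a table partitioned by a ds column could be described as follows; TablePartitionSpec is a map from partition column name to value, and the names and paths shown are illustrative:

      import org.apache.spark.sql.catalyst.catalog.{CatalogStorageFormat, CatalogTablePartition}

      // Partition ds=2017-01-01, stored at its own location with an otherwise unspecified format.
      val partition = CatalogTablePartition(
        spec = Map("ds" -> "2017-01-01"),
        storage = CatalogStorageFormat(
          locationUri = Some("hdfs:///warehouse/mydb.db/events/ds=2017-01-01"),
          inputFormat = None, outputFormat = None, serde = None,
          compressed = false, properties = Map.empty))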

  8. case class CatalogTableType extends Product with Serializable

  9. abstract class ExternalCatalog extends AnyRef

    Interface for the system catalog (of functions, partitions, tables, and databases).

    This is only used for non-temporary items, and implementations must be thread-safe as they can be accessed in multiple threads. This is an external catalog because it is expected to interact with external systems.

    Implementations should throw NoSuchDatabaseException when databases don't exist.
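
    As a rough usage sketch, the InMemoryCatalog listed below can be exercised through this interface; the method names used here (createDatabase, databaseExists, listDatabases) are assumed to match the current API:

      import org.apache.spark.sql.catalyst.catalog.{CatalogDatabase, ExternalCatalog, InMemoryCatalog}

      // A throwaway catalog backed by in-memory maps; suitable for tests and exploration only.
      val catalog: ExternalCatalog = new InMemoryCatalog()

      catalog.createDatabase(
        CatalogDatabase(
          name = "mydb",
          description = "example database",
          locationUri = "file:/tmp/mydb",
          properties = Map.empty),
        ignoreIfExists = false)

      assert(catalog.databaseExists("mydb"))
      println(catalog.listDatabases())  // expected to include "mydb"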

  10. case class FunctionResource(resourceType: FunctionResourceType, uri: String) extends Product with Serializable

  11. trait FunctionResourceLoader extends AnyRef

    A simple trait representing a class that can be used to load resources used by a function. Because only a SQLContext can load resources, we create this trait to avoid explicitly passing SQLContext around.
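
    A minimal sketch of an implementation, assuming the trait exposes a single abstract loadResource method taking a FunctionResource:

      import scala.collection.mutable.ArrayBuffer
      import org.apache.spark.sql.catalyst.catalog.{FunctionResource, FunctionResourceLoader}

      // Toy loader that only records which resources were requested; a real implementation
      // would add jars and files to the session's class loader and distributed cache.
      class RecordingFunctionResourceLoader extends FunctionResourceLoader {
        val loaded = ArrayBuffer.empty[FunctionResource]

        override def loadResource(resource: FunctionResource): Unit = {
          loaded += resource
        }
      }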

  12. abstract class FunctionResourceType extends AnyRef

    Represents the type of a resource needed by a function.

  13. class GlobalTempViewManager extends AnyRef

    A thread-safe manager for global temporary views, providing atomic operations to manage them, e.g. create, update, remove, etc.

    Note that the view name is always case-sensitive here; callers are responsible for formatting the view name with respect to the case-sensitivity config.
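
    Since the manager treats names as case-sensitive, callers typically normalize them first; a hypothetical caller-side helper (not part of this API) might look like:

      import java.util.Locale

      // Mirror the case-sensitivity note above: lowercase names unless analysis is case-sensitive.
      def formatViewName(name: String, caseSensitiveAnalysis: Boolean): String =
        if (caseSensitiveAnalysis) name else name.toLowerCase(Locale.ROOT)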

  14. class InMemoryCatalog extends ExternalCatalog

    An in-memory (ephemeral) implementation of the system catalog.

    This is a dummy implementation that does not require setting up external systems. It is intended for testing or exploration purposes only and should not be used in production.

    All public methods should be synchronized for thread-safety.

  15. class SessionCatalog extends Logging

    An internal catalog that is used by a Spark Session. This internal catalog serves as a proxy to the underlying metastore (e.g. Hive Metastore) and it also manages temporary tables and functions of the Spark Session that it belongs to.

    This class must be thread-safe.
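
    A small sketch of driving it in a test setting; the single-argument auxiliary constructor (taking only an ExternalCatalog) and the database methods shown are assumptions about the current API:

      import org.apache.spark.sql.catalyst.catalog.{CatalogDatabase, InMemoryCatalog, SessionCatalog}

      // Back the session catalog with the in-memory external catalog (test use only).
      val sessionCatalog = new SessionCatalog(new InMemoryCatalog())

      sessionCatalog.createDatabase(
        CatalogDatabase("mydb", "example database", "file:/tmp/mydb", Map.empty),
        ignoreIfExists = false)
      sessionCatalog.setCurrentDatabase("mydb")
      println(sessionCatalog.getCurrentDatabase)  // mydb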

  16. case class SimpleCatalogRelation(databaseName: String, metadata: CatalogTable) extends LeafNode with CatalogRelation with Product with Serializable

    A LogicalPlan that wraps CatalogTable.

    Note that in the future we should consolidate this and HiveCatalogRelation.
