A column in a table.
A database defined in the catalog.
A function defined in the catalog.
An interface that is implemented by logical plans to return the underlying catalog table. If SimpleCatalogRelation and MetastoreRelation can be consolidated in the future, this interface should probably be removed.
Storage format, used to describe how a partition or a table is stored.
A table defined in the catalog.
Note that Hive's metastore also tracks skewed columns. We should consider adding support for them in the future once we have a better understanding of how we want to handle skewed columns.
A list of string descriptions of features that are used by the underlying table but are not yet supported by Spark SQL.
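To make the shape of such a table entry concrete, here is a minimal, self-contained Scala sketch. All names here (`StorageFormat`, `CatalogTableSketch`, and their fields) are illustrative assumptions, not Spark's actual definitions:

```scala
// Illustrative only: field names and types are assumptions, not Spark's API.
case class StorageFormat(
    locationUri: Option[String],
    inputFormat: Option[String],
    outputFormat: Option[String],
    serde: Option[String])

case class CatalogTableSketch(
    database: String,
    name: String,
    schema: Seq[(String, String)],          // column name -> data type
    storage: StorageFormat,
    unsupportedFeatures: Seq[String] = Nil) // features used by the table but not supported by Spark SQL

val table = CatalogTableSketch(
  database = "default",
  name = "events",
  schema = Seq("id" -> "bigint", "ts" -> "timestamp"),
  storage = StorageFormat(Some("/warehouse/events"), None, None, None))
```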
A partition (Hive style) defined in the catalog.
partition spec values indexed by column name
storage format of the partition
parameters for the partition, e.g. statistics
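A hedged sketch of a partition entry carrying these three pieces of information; the names below are hypothetical, not Spark's actual API:

```scala
// Illustrative only; names are assumptions, not Spark's actual API.
case class PartitionSketch(
    spec: Map[String, String],       // partition spec values indexed by column name
    locationUri: Option[String],     // where the partition's data is stored
    parameters: Map[String, String]) // e.g. statistics

val part = PartitionSketch(
  spec = Map("year" -> "2016", "month" -> "04"),
  locationUri = Some("/warehouse/events/year=2016/month=04"),
  parameters = Map("numRows" -> "12345"))
```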
Interface for the system catalog (of columns, partitions, tables, and databases).
This is only used for non-temporary items, and implementations must be thread-safe because they can be accessed from multiple threads. This is an external catalog because it is expected to interact with external systems.
Implementations should throw NoSuchDatabaseException when databases don't exist.
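The error-handling contract above can be sketched as follows. This is a hypothetical minimal shape of such an interface, not Spark's actual trait; `MapCatalog` is a toy implementation added for illustration:

```scala
class NoSuchDatabaseException(db: String)
  extends Exception(s"Database '$db' does not exist")

// Hypothetical minimal interface (not Spark's actual ExternalCatalog trait).
trait ExternalCatalogSketch {
  protected def databaseExists(db: String): Boolean
  // Implementations should call this and fail fast when the database is absent.
  protected def requireDbExists(db: String): Unit =
    if (!databaseExists(db)) throw new NoSuchDatabaseException(db)
  def listTables(db: String): Seq[String]
}

// A toy implementation backed by an immutable map.
class MapCatalog(data: Map[String, Seq[String]]) extends ExternalCatalogSketch {
  protected def databaseExists(db: String): Boolean = data.contains(db)
  def listTables(db: String): Seq[String] = {
    requireDbExists(db)
    data(db)
  }
}
```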
A simple trait representing a class that can be used to load resources used by a function. Because only a SQLContext can load resources, we create this trait to avoid explicitly passing SQLContext around.
A trait that represents the type of a resource needed by a function.
An in-memory (ephemeral) implementation of the system catalog.
This is a dummy implementation that does not require setting up external systems. It is intended for testing or exploration purposes only and should not be used in production.
All public methods should be synchronized for thread-safety.
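The "map-backed, synchronized public methods" pattern described above can be sketched like this. It is a toy stand-in under assumed names, not the real InMemoryCatalog:

```scala
import scala.collection.mutable

// Toy in-memory catalog; names are illustrative, not Spark's actual class.
// All public methods are synchronized for thread-safety.
class InMemoryCatalogSketch {
  // database name -> set of table names
  private val tables = mutable.Map[String, mutable.Set[String]]()

  def createDatabase(db: String): Unit = synchronized {
    tables.getOrElseUpdate(db, mutable.Set.empty)
  }

  def createTable(db: String, table: String): Unit = synchronized {
    require(tables.contains(db), s"Database '$db' does not exist")
    tables(db) += table
  }

  def listTables(db: String): Seq[String] = synchronized {
    tables.getOrElse(db, mutable.Set.empty[String]).toSeq.sorted
  }
}
```

Because state lives only in process memory, everything is lost when the JVM exits, which is why this kind of catalog suits tests and exploration rather than production.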
An internal catalog that is used by a Spark Session. This internal catalog serves as a proxy to the underlying metastore (e.g. the Hive Metastore) and also manages the temporary tables and functions of the Spark Session to which it belongs.
This class must be thread-safe.
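The proxy behavior can be illustrated with a toy sketch in which temporary tables shadow tables of the same name in the external catalog. The class and its members are assumptions for illustration, not Spark's SessionCatalog API:

```scala
// Toy session catalog: resolves temporary tables before the external catalog.
// `external` stands in for the underlying metastore (table name -> description).
class SessionCatalogSketch(external: Map[String, String]) {
  private var tempTables = Map[String, String]() // temp table name -> plan description

  def createTempTable(name: String, plan: String): Unit = synchronized {
    tempTables += (name -> plan)
  }

  // A temporary table shadows a metastore table of the same name.
  def lookup(name: String): Option[String] = synchronized {
    tempTables.get(name).orElse(external.get(name))
  }
}
```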
A LogicalPlan that wraps CatalogTable.
Note that in the future we should consolidate this and HiveCatalogRelation.
A function defined in the catalog.
name of the function
fully qualified class name, e.g. "org.apache.spark.util.MyFunc"
resource types and URIs used by the function
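Putting the three fields together, a hedged sketch of such a function entry might look like this; the type and field names are illustrative, not Spark's actual definitions:

```scala
// Illustrative only; names are assumptions, not Spark's actual API.
sealed trait FunctionResourceType
case object JarResource extends FunctionResourceType
case object FileResource extends FunctionResourceType

case class FunctionResource(resourceType: FunctionResourceType, uri: String)

case class CatalogFunctionSketch(
    name: String,                     // name of the function
    className: String,                // fully qualified class name
    resources: Seq[FunctionResource]) // resource types and URIs used by the function

val fn = CatalogFunctionSketch(
  name = "myFunc",
  className = "org.apache.spark.util.MyFunc",
  resources = Seq(FunctionResource(JarResource, "hdfs:///udfs/myfunc.jar")))
```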