org.apache.spark.sql.hive

SnappyStoreHiveCatalog

class SnappyStoreHiveCatalog extends Catalog with Logging

A Catalog that uses Hive for persistence and adds Snappy extensions such as stream/topK tables, returning LogicalPlans to materialize these entities.
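
For orientation, a minimal sketch of driving the catalog directly, assuming an already-constructed SnappyContext named snc (the table name is illustrative):

    import org.apache.spark.sql.SnappyContext
    import org.apache.spark.sql.hive.SnappyStoreHiveCatalog

    // Sketch only: `snc` is assumed to be an existing SnappyContext.
    val catalog = new SnappyStoreHiveCatalog(snc)

    // Resolve a table name, check the meta-store, and obtain the
    // LogicalPlan that materializes the table.
    val name = catalog.newQualifiedTableName("APP.CUSTOMERS")
    if (catalog.tableExists(name)) {
      val plan = catalog.lookupRelation(name)
    }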

Linear Supertypes
Logging, Catalog, AnyRef, Any

Instance Constructors

  1. new SnappyStoreHiveCatalog(context: SnappyContext)

Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  7. var client: ClientInterface

    Hive client that is used to retrieve metadata from the Hive MetaStore. The version of the Hive client that is used here must match the meta-store that is configured in the hive-site.xml file.

    Attributes
    protected[org.apache.spark.sql]
  8. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  9. def compatibleSchema(schema1: StructType, schema2: StructType): Boolean

  10. val conf: SQLConf

    Definition Classes
    SnappyStoreHiveCatalog → Catalog
  11. def configure(): Map[String, String]

    Overridden by child classes that need to set configuration before client init (but after hive-site.xml).

    Attributes
    protected
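
    A minimal sketch of such an override, assuming a subclass with the same constructor (the Hive property and value are illustrative):

        class MyStoreCatalog(context: SnappyContext)
            extends SnappyStoreHiveCatalog(context) {
          // Add one extra Hive client property on top of the defaults.
          override protected def configure(): Map[String, String] =
            super.configure() + ("hive.metastore.client.socket.timeout" -> "300")
        }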
  12. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  13. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  14. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  15. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  16. def getDataSourceRelations[T](tableTypes: Seq[Type], baseTable: Option[String] = None): Seq[T]

  17. def getDataSourceTables(tableTypes: Seq[Type], baseTable: Option[String] = None): Seq[QualifiedTableName]

  18. def getTableName(tableIdent: TableIdentifier): String

    Attributes
    protected
    Definition Classes
    Catalog
  19. def getTableType(relation: BaseRelation): Type

  20. def getTables(dbIdent: Option[String]): Seq[(String, Boolean)]

    Definition Classes
    SnappyStoreHiveCatalog → Catalog
  21. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  22. def hiveMetastoreBarrierPrefixes(): Seq[String]

    A comma-separated list of class prefixes that should explicitly be reloaded for each version of Hive that Spark SQL is communicating with, for example Hive UDFs that are declared in a prefix that would typically be shared (i.e. org.apache.spark.*).

    Attributes
    protected[org.apache.spark.sql]
  23. def hiveMetastoreJars(): String

    The location of the jars that should be used to instantiate the Hive meta-store client. This property can be one of three options:

    a classpath in the standard format for both hive and hadoop.

    builtin - attempt to discover the jars that were used to load Spark SQL and use those. This option is only valid when using the execution version of Hive.

    maven - download the correct version of hive on demand from maven.

    Attributes
    protected[org.apache.spark.sql]
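
    A sketch of setting this together with the meta-store version through the standard Spark SQL properties that back these fields (values are illustrative):

        import org.apache.spark.SparkConf

        val conf = new SparkConf()
          // Hive client version to instantiate (see hiveMetastoreVersion below).
          .set("spark.sql.hive.metastore.version", "1.2.1")
          // Where to find the client jars: a classpath, "builtin", or "maven".
          .set("spark.sql.hive.metastore.jars", "maven")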
  24. def hiveMetastoreSharedPrefixes(): Seq[String]

    A comma-separated list of class prefixes that should be loaded using the ClassLoader that is shared between Spark SQL and a specific version of Hive. An example of classes that should be shared is JDBC drivers that are needed to talk to the meta-store. Other classes that need to be shared are those that interact with classes that are already shared, for example a custom appender used by log4j.

    Attributes
    protected[org.apache.spark.sql]
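
    For example, sharing a JDBC driver with the meta-store client might look like the following sketch (the driver prefix is an assumption about the deployment):

        import org.apache.spark.SparkConf

        val conf = new SparkConf()
          // Classes the Hive client loads from the shared ClassLoader,
          // e.g. the JDBC driver needed to reach the meta-store database.
          .set("spark.sql.hive.metastore.sharedPrefixes", "com.mysql.jdbc")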
  25. val hiveMetastoreVersion: String

    The version of the Hive client that will be used to communicate with the meta-store for the catalog.

    Attributes
    protected[org.apache.spark.sql]
  26. def invalidateTable(tableIdent: QualifiedTableName): Unit

  27. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  28. def isTraceEnabled(): Boolean

    Attributes
    protected
    Definition Classes
    Logging
  29. def log: Logger

    Attributes
    protected
    Definition Classes
    Logging
  30. def logDebug(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  31. def logDebug(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  32. def logError(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  33. def logError(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  34. def logInfo(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  35. def logInfo(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  36. def logName: String

    Attributes
    protected
    Definition Classes
    Logging
  37. def logTrace(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  38. def logTrace(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  39. def logWarning(msg: ⇒ String, throwable: Throwable): Unit

    Attributes
    protected
    Definition Classes
    Logging
  40. def logWarning(msg: ⇒ String): Unit

    Attributes
    protected
    Definition Classes
    Logging
  41. def lookupRelation(tableIdent: TableIdentifier, alias: Option[String]): LogicalPlan

    Definition Classes
    SnappyStoreHiveCatalog → Catalog
  42. final def lookupRelation(tableIdent: QualifiedTableName): LogicalPlan

  43. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  44. def newQualifiedTableName(tableIdent: String): QualifiedTableName

  45. def newQualifiedTableName(tableIdent: TableIdentifier): QualifiedTableName

  46. def normalizeSchema(schema: StructType): StructType

  47. final def notify(): Unit

    Definition Classes
    AnyRef
  48. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  49. def processTableIdentifier(tableIdentifier: String): String

  50. def refreshTable(tableIdent: TableIdentifier): Unit

    Definition Classes
    SnappyStoreHiveCatalog → Catalog
  51. def registerDataSourceTable(tableIdent: QualifiedTableName, userSpecifiedSchema: Option[StructType], partitionColumns: Array[String], provider: String, options: Map[String, String], relation: BaseRelation): Unit

    Creates a data source table (a table created with a USING clause) in Hive's meta-store.
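
    The SQL route into this method is the USING form of CREATE TABLE; a sketch, assuming a SnappyContext snc (table name, provider and option are illustrative):

        snc.sql("""CREATE TABLE customers (id INT, name STRING)
                   USING column OPTIONS (BUCKETS '8')""")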

  52. def registerTable(tableName: QualifiedTableName, plan: LogicalPlan): Unit

  53. def registerTable(tableIdentifier: TableIdentifier, plan: LogicalPlan): Unit

    Definition Classes
    SnappyStoreHiveCatalog → Catalog
  54. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  55. def tableExists(tableName: QualifiedTableName): Boolean

  56. def tableExists(tableIdentifier: String): Boolean

  57. def tableExists(tableIdentifier: TableIdentifier): Boolean

    Definition Classes
    SnappyStoreHiveCatalog → Catalog
  58. val tempTables: HashMap[QualifiedTableName, LogicalPlan]

  59. def toString(): String

    Definition Classes
    AnyRef → Any
  60. def unregisterAllTables(): Unit

    Definition Classes
    SnappyStoreHiveCatalog → Catalog
  61. def unregisterDataSourceTable(tableIdent: QualifiedTableName, relation: Option[BaseRelation]): Unit

    Drops a data source table from Hive's meta-store.
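
    The corresponding SQL is a plain DROP TABLE; a sketch, again assuming a SnappyContext snc:

        snc.sql("DROP TABLE IF EXISTS customers")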

  62. def unregisterTable(tableIdent: QualifiedTableName): Unit

  63. def unregisterTable(tableIdentifier: TableIdentifier): Unit

    Definition Classes
    SnappyStoreHiveCatalog → Catalog
  64. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  65. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  66. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
