
org.apache.hadoop.hbase.spark

HBaseRelation

case class HBaseRelation(parameters: Map[String, String], userSpecifiedSchema: Option[StructType])(sqlContext: SQLContext) extends BaseRelation with PrunedFilteredScan with InsertableRelation with Logging with Product with Serializable

Implementation of Spark's BaseRelation that builds up the scan logic, performs scan pruning and filter push-down, and handles value conversions

sqlContext

SparkSQL context
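
Example: a minimal read sketch. It assumes the connector is addressed by its package name as the data source format and that the table mapping is passed through the "catalog" option, as in the stock hbase-spark connector; option keys and the catalog JSON layout may differ in this build.

    import org.apache.spark.sql.{DataFrame, SQLContext}

    // Hypothetical catalog mapping HBase columns to SQL columns.
    val catalog = """{
      |"table":{"namespace":"default", "name":"person"},
      |"rowkey":"key",
      |"columns":{
      |  "id":{"cf":"rowkey", "col":"key", "type":"string"},
      |  "name":{"cf":"info", "col":"name", "type":"string"}
      |}
      |}""".stripMargin

    def readPersons(sqlContext: SQLContext): DataFrame =
      sqlContext.read
        .options(Map("catalog" -> catalog))
        .format("org.apache.hadoop.hbase.spark")
        .load()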

Annotations
@Private()
Linear Supertypes
Serializable, Serializable, Product, Equals, Logging, InsertableRelation, PrunedFilteredScan, BaseRelation, AnyRef, Any

Instance Constructors

  1. new HBaseRelation(parameters: Map[String, String], userSpecifiedSchema: Option[StructType])(sqlContext: SQLContext)

    sqlContext

    SparkSQL context

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  5. val batchNum: Int
  6. val blockCacheEnable: Boolean
  7. def buildPushDownPredicatesResource(filters: Array[Filter]): (RowKeyFilter, DynamicLogicExpression, Array[Array[Byte]])
  8. def buildRow(fields: Seq[Field], result: Result): Row
  9. def buildScan(requiredColumns: Array[String], filters: Array[Filter]): RDD[Row]

    Here we build the functionality to populate the resulting RDD[Row]. This is where we do the following: filter push-down, Scan or GetList pruning, and execution of the scan(s) and/or GetList to generate the result (see the pruning and push-down sketch in the Examples below).

    requiredColumns

    The columns requested by the query

    filters

    The filters applied by the query

    returns

    RDD with all the results from HBase needed for Spark SQL to execute the query on

    Definition Classes
    HBaseRelation → PrunedFilteredScan
  10. val bulkGetSize: Int
  11. val cacheSize: Int
  12. val catalog: HBaseTableCatalog
  13. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  14. val clusteringCfColumnsMap: Map[String, Seq[String]]
  15. val configResources: String
  16. def createNamespaceIfNotExist(connection: Admin, namespace: String): Boolean
  17. def createTable(): Unit
  18. val darwinConf: Option[Config]
  19. val encoder: BytesEncoder
  20. val encoderClsName: String
  21. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  22. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  23. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  24. def getIndexedProjections(requiredColumns: Array[String]): Seq[(Field, Int)]
  25. def hbaseConf: Configuration
  26. val hbaseContext: HBaseContext
  27. def initializeLogIfNecessary(isInterpreter: Boolean, silent: Boolean): Boolean
    Attributes
    protected
    Definition Classes
    Logging
  28. def initializeLogIfNecessary(isInterpreter: Boolean): Unit
    Attributes
    protected
    Definition Classes
    Logging
  29. def insert(data: DataFrame, overwrite: Boolean): Unit

    Definition Classes
    HBaseRelation → InsertableRelation
  30. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  31. def isTraceEnabled(): Boolean
    Attributes
    protected
    Definition Classes
    Logging
  32. def log: Logger
    Attributes
    protected
    Definition Classes
    Logging
  33. def logDebug(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  34. def logDebug(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  35. def logError(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  36. def logError(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  37. def logInfo(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  38. def logInfo(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  39. def logName: String
    Attributes
    protected
    Definition Classes
    Logging
  40. def logTrace(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  41. def logTrace(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  42. def logWarning(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  43. def logWarning(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  44. val maxTimestamp: Option[Long]
  45. val maxVersions: Option[Int]
  46. val minTimestamp: Option[Long]
  47. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  48. def needConversion: Boolean
    Definition Classes
    BaseRelation
  49. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  50. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  51. val parameters: Map[String, String]
  52. def parseRowKey(row: Array[Byte], keyFields: Seq[Field]): Map[Field, Any]

    Takes an HBase Row object and parses all of the fields from it. This is independent of which fields were requested from the key; because we have all the data, it is less complex to parse everything. (A composite-key sketch appears in the Examples below.)

    row

    the retrieved row from hbase.

    keyFields

    all of the fields in the row key, ORDERED by their order in the row key.

  53. val schema: StructType

    Generates a Spark SQL schema object so Spark SQL knows what is being provided by this BaseRelation

    returns

    schema generated from the SCHEMA_COLUMNS_MAPPING_KEY value

    Definition Classes
    HBaseRelation → BaseRelation
  54. def sizeInBytes: Long
    Definition Classes
    BaseRelation
  55. val sqlContext: SQLContext
    Definition Classes
    HBaseRelation → BaseRelation
  56. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  57. def tableName: String
  58. val timestamp: Option[Long]
  59. def transverseFilterTree(parentRowKeyFilter: RowKeyFilter, valueArray: MutableList[Array[Byte]], filter: Filter): DynamicLogicExpression

    For some codecs, the ordering of a Java primitive type may be inconsistent with the ordering of its byte-array encoding. We may have to split a predicate on such a type into multiple predicates; the encoder takes care of this and returns the concrete ranges.

    For example, with the naive codec some Java primitive types have to be split into multiple predicates whose union makes the original predicate evaluate correctly: "COLUMN < 2" is transformed into "0 <= COLUMN < 2 OR Integer.MIN_VALUE <= COLUMN <= -1". (A byte-ordering sketch appears in the Examples below.)

  60. def unhandledFilters(filters: Array[Filter]): Array[Filter]
    Definition Classes
    BaseRelation
  61. val useHBaseContext: Boolean
  62. val usePushDownColumnFilter: Boolean
  63. val useSchemaAvroManager: Boolean
  64. val userSpecifiedSchema: Option[StructType]
  65. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  66. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  67. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  68. val wrappedConf: SerializableConfiguration
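
Examples

The sketches below are referenced from the member documentation above. They are illustrative only and use hypothetical data and helper names; nothing in them is part of this class's API.

Pruning and push-down (buildScan): Spark passes only the projected columns and the push-down-capable filters to buildScan, which this relation turns into pruned Scans/Gets plus an HBase-side filter.

    import org.apache.spark.sql.DataFrame
    import org.apache.spark.sql.functions.col

    // "persons" is a hypothetical DataFrame backed by this relation (see the read example above).
    def queryExample(persons: DataFrame): DataFrame =
      persons
        .select("name")                // -> requiredColumns = Array("name")
        .where(col("id") === "row42")  // -> filters = Array(EqualTo("id", "row42"))

    // Spark translates the projection and predicate into
    //   buildScan(Array("name"), Array(EqualTo("id", "row42")))
    // and the relation prunes the scan/get to the matching row key(s),
    // pushing the remaining predicates down to HBase where supported.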
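
Composite row keys (parseRowKey): a hypothetical stand-alone illustration of walking fixed-length key fields in row-key order; SimpleField is a stand-in, not the connector's Field type, and variable-length fields are ignored.

    import java.util.Arrays

    // Hypothetical fixed-length key field.
    case class SimpleField(name: String, length: Int)

    def parseCompositeKey(row: Array[Byte], keyFields: Seq[SimpleField]): Map[SimpleField, Array[Byte]] = {
      var offset = 0
      keyFields.map { f =>
        val slice = Arrays.copyOfRange(row, offset, offset + f.length)
        offset += f.length
        f -> slice
      }.toMap
    }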
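
Byte ordering (transverseFilterTree): why a naive big-endian encoding of signed integers is not order-preserving under HBase's unsigned lexicographic byte comparison, which is what forces a predicate such as "COLUMN < 2" to be split into two ranges.

    // -1 encodes as 0xFFFFFFFF and therefore sorts *after* 1 (0x00000001)
    // when the bytes are compared as unsigned values.
    def toBytes(i: Int): Array[Byte] =
      Array((i >> 24).toByte, (i >> 16).toByte, (i >> 8).toByte, i.toByte)

    def compareUnsigned(a: Array[Byte], b: Array[Byte]): Int =
      a.zip(b).map { case (x, y) => (x & 0xFF) - (y & 0xFF) }
        .find(_ != 0)
        .getOrElse(a.length - b.length)

    assert(compareUnsigned(toBytes(-1), toBytes(1)) > 0) // negatives land above positives
    // Hence "COLUMN < 2" must become
    // "0 <= COLUMN < 2 OR Integer.MIN_VALUE <= COLUMN <= -1" under such a codec.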

Inherited from Serializable

Inherited from Serializable

Inherited from Product

Inherited from Equals

Inherited from Logging

Inherited from InsertableRelation

Inherited from PrunedFilteredScan

Inherited from BaseRelation

Inherited from AnyRef

Inherited from Any
