Package org.apache.hadoop.hbase.spark

package spark

Type Members

  1. class AndLogicExpression extends DynamicLogicExpression

    Annotations
    @Private()
  2. class BulkLoadPartitioner extends Partitioner

    A Partitioner implementation that will separate records into different HBase regions based on region splits.

    Annotations
    @Public()
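
    A sketch of wiring this partitioner up by hand, assuming a hypothetical table 'myTable' and an RDD keyed by KeyFamilyQualifier (the bulk load in HBaseContext normally does this internally):

      import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
      import org.apache.hadoop.hbase.client.ConnectionFactory
      import org.apache.hadoop.hbase.spark.{BulkLoadPartitioner, KeyFamilyQualifier}
      import org.apache.spark.rdd.RDD

      def partitionByRegion(keyedRdd: RDD[(KeyFamilyQualifier, Array[Byte])])
          : RDD[(KeyFamilyQualifier, Array[Byte])] = {
        val conn = ConnectionFactory.createConnection(HBaseConfiguration.create())
        try {
          // One partition per region; sorting within each partition puts the
          // cells in the order that HFile writing expects.
          val startKeys =
            conn.getRegionLocator(TableName.valueOf("myTable")).getStartKeys
          keyedRdd.repartitionAndSortWithinPartitions(
            new BulkLoadPartitioner(startKeys))
        } finally {
          conn.close()
        }
      }
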
  3. class ByteArrayComparable extends Comparable[ByteArrayComparable]

    Annotations
    @Public()
  4. class ByteArrayWrapper extends Comparable[ByteArrayWrapper] with Serializable

    A wrapper over a byte array so that it can be used as a key in a hash map.

    Annotations
    @Public()
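
    An illustrative sketch (map contents invented): raw Array[Byte] keys compare by reference, so the wrapper's value-based equals/hashCode is what makes lookups work:

      import org.apache.hadoop.hbase.spark.ByteArrayWrapper
      import org.apache.hadoop.hbase.util.Bytes

      val cache = new java.util.HashMap[ByteArrayWrapper, String]()
      cache.put(new ByteArrayWrapper(Bytes.toBytes("row1")), "cached value")

      // A second array with the same contents still finds the entry,
      // which a plain Array[Byte] key would not.
      val hit = cache.get(new ByteArrayWrapper(Bytes.toBytes("row1")))  // "cached value"
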
  5. class ColumnFamilyQualifierMapKeyWrapper extends Serializable

    A wrapper class that allows both columnFamily and qualifier to act as the key of a hashMap. It also allows the value to be found in the hashMap without cloning the HBase value from the HBase Cell object.

    Annotations
    @Public()
  6. class ColumnFilter extends Serializable

    Contains information related to the filters for a given column. This can contain many ranges or points.

    Annotations
    @Private()
  7. class ColumnFilterCollection extends AnyRef

    A collection of ColumnFilters indexed by column names.

    Also contains merge commands that will consolidate the filters per column name.

    Annotations
    @Private()
  8. trait CompareTrait extends AnyRef

    Annotations
    @Private()
  9. class DefaultSource extends RelationProvider with DataSourceRegister with CreatableRelationProvider with StreamSinkProvider with Logging

    DefaultSource for integration with Spark's dataframe datasources. This class will produce a relationProvider based on the input given to it from Spark.

    This class needs to stay in the current package 'org.apache.hadoop.hbase.spark' for Spark to match the hbase data source name.

    In all, this DefaultSource supports the following datasource functionality:
    - Scan range pruning through filter push-down logic based on rowKeys
    - Filter push-down logic on HBase Cells
    - Qualifier filtering based on columns used in the SparkSQL statement
    - Type conversions of basic SQL types; all conversions go through the HBase Bytes object commands

    Annotations
    @Private()
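
    A minimal read-path sketch, assuming a hypothetical table 'table1' with column family 'cf1', and that an HBaseContext has already been created so the connector can pick up the HBase configuration; the catalog JSON maps SQL columns to the row key and cells:

      import org.apache.hadoop.hbase.spark.datasources.HBaseTableCatalog
      import org.apache.spark.sql.SparkSession

      val spark = SparkSession.builder().appName("hbase-read").getOrCreate()

      // Hypothetical schema: row key exposed as col0, one boolean cell as col1.
      val catalog =
        s"""{
           |  "table":   {"namespace": "default", "name": "table1"},
           |  "rowkey":  "key",
           |  "columns": {
           |    "col0": {"cf": "rowkey", "col": "key",  "type": "string"},
           |    "col1": {"cf": "cf1",    "col": "col1", "type": "boolean"}
           |  }
           |}""".stripMargin

      val df = spark.read
        .options(Map(HBaseTableCatalog.tableCatalog -> catalog))
        .format("org.apache.hadoop.hbase.spark")  // resolves to this DefaultSource
        .load()

      // A row-key predicate like this is pruned into HBase scan ranges.
      df.filter(df("col0") === "row1").show()
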
  10. trait DynamicLogicExpression extends AnyRef

    Dynamic logic for SQL push down. There is an instance for the most common operations and a pass-through for other operations not covered here.

    Logic can be nested with And or Or operators.

    A logic tree can be written out as a string and reconstructed from that string.

    Annotations
    @Private()
  11. class EqualLogicExpression extends DynamicLogicExpression

    Annotations
    @Private()
  12. class ExecutionRuleForUnitTesting extends AnyRef

    Annotations
    @Private()
  13. class FamiliesQualifiersValues extends Serializable

    This object is a clean way to store and sort all cells that will be bulk loaded into a single row.

    Annotations
    @Public()
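
    A short sketch of filling one of these for a single row, assuming the += (family, qualifier, value) method and made-up cell contents:

      import org.apache.hadoop.hbase.spark.FamiliesQualifiersValues
      import org.apache.hadoop.hbase.util.Bytes

      // All cells destined for one row, kept sorted per family and qualifier.
      val familyQualifiersValues = new FamiliesQualifiersValues

      familyQualifiersValues += (Bytes.toBytes("cf1"), Bytes.toBytes("a"), Bytes.toBytes("foo1"))
      familyQualifiersValues += (Bytes.toBytes("cf1"), Bytes.toBytes("b"), Bytes.toBytes("foo2"))
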
  14. class FamilyHFileWriteOptions extends Serializable

    This object will hold optional data for how a given column family's writer will work.

    Annotations
    @Public()
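
    A sketch of per-family writer settings for a bulk load; the compression, bloom type, block size, and encoding values below are arbitrary examples:

      import java.util
      import org.apache.hadoop.hbase.spark.FamilyHFileWriteOptions
      import org.apache.hadoop.hbase.util.Bytes

      val familyOptions = new util.HashMap[Array[Byte], FamilyHFileWriteOptions]

      // Arguments: compression codec, bloom filter type, block size, data block encoding.
      familyOptions.put(
        Bytes.toBytes("cf1"),
        new FamilyHFileWriteOptions("GZ", "ROW", 65536, "PREFIX"))

      // familyOptions can then be passed to the bulk load so HFiles for
      // 'cf1' are written with these settings.
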
  15. class GreaterThanLogicExpression extends DynamicLogicExpression with CompareTrait

    Annotations
    @Private()
  16. class GreaterThanOrEqualLogicExpression extends DynamicLogicExpression with CompareTrait

    Annotations
    @Private()
  17. case class HBaseConnectionCacheStat(numTotalRequests: Long, numActualConnectionsCreated: Long, numActiveConnections: Long) extends Product with Serializable

    To log the state of 'HBaseConnectionCache'.

    numTotalRequests
      number of total connection requests to the cache
    numActualConnectionsCreated
      number of actual HBase connections the cache has ever created
    numActiveConnections
      number of currently alive HBase connections the cache is holding

  18. class HBaseConnectionKey extends Logging

    Denotes a unique key to an HBase Connection instance. Please refer to 'org.apache.hadoop.hbase.client.HConnectionKey'.

    In essence, this class captures the properties in Configuration that may be used in the process of establishing a connection.

  19. class HBaseContext extends Serializable with Logging

    HBaseContext is a façade for HBase operations like bulk put, get, increment, delete, and scan.

    HBaseContext will take on the responsibility of disseminating the configuration information to the worker nodes and of managing the life cycle of Connections.

    Annotations
    @Public()
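
    A minimal bulk put sketch, assuming a reachable HBase cluster and an existing table 'tableName' with column family 'cf1' (names invented):

      import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
      import org.apache.hadoop.hbase.client.Put
      import org.apache.hadoop.hbase.spark.HBaseContext
      import org.apache.hadoop.hbase.util.Bytes
      import org.apache.spark.SparkContext

      val sc = new SparkContext("local", "hbase-bulk-put")
      val hbaseContext = new HBaseContext(sc, HBaseConfiguration.create())

      val rdd = sc.parallelize(Seq(
        (Bytes.toBytes("row1"), Bytes.toBytes("value1")),
        (Bytes.toBytes("row2"), Bytes.toBytes("value2"))))

      // The context ships the configuration to the executors and manages
      // the Connection that each partition uses.
      hbaseContext.bulkPut[(Array[Byte], Array[Byte])](
        rdd,
        TableName.valueOf("tableName"),
        record => new Put(record._1)
          .addColumn(Bytes.toBytes("cf1"), Bytes.toBytes("col1"), record._2))
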
  20. case class HBaseRelation(parameters: Map[String, String], userSpecifiedSchema: Option[StructType])(sqlContext: SQLContext) extends BaseRelation with PrunedFilteredScan with InsertableRelation with Logging with Product with Serializable

    Implementation of Spark BaseRelation that will build up our scan logic, do the scan pruning, filter push down, and value conversions.

    sqlContext
      SparkSQL context

    Annotations
    @Private()
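
    HBaseRelation is not constructed directly; it is what the DefaultSource above produces, so its pruning and push down surface through ordinary SparkSQL. A sketch reusing the hypothetical DataFrame df from the DefaultSource example:

      df.createOrReplaceTempView("hbaseTable")

      // The row-key equality is pushed down as a narrow scan range and the
      // cf1 predicate as a cell-level filter, rather than filtering in Spark.
      spark.sql(
        "SELECT col0, col1 FROM hbaseTable WHERE col0 = 'row1' AND col1 = true").show()
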
  21. class HBaseSink extends Sink with Logging

    Created by Agile Lab s.r.l. on 04/11/2017.

  22. class IsNullLogicExpression extends DynamicLogicExpression

    Annotations
    @Private()
  23. class JavaHBaseContext extends Serializable

    This is the Java wrapper over HBaseContext, which is written in Scala. This class will be used by developers who want to work with Spark or Spark Streaming in Java.

    Annotations
    @Public()
  24. class KeyFamilyQualifier extends Comparable[KeyFamilyQualifier] with Serializable

    This is the key to be used for sorting and shuffling.

    We will only partition on the rowKey, but we will sort on all three.

    Annotations
    @Public()
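
    A bulk load sketch showing the key in use, reusing sc and hbaseContext from the HBaseContext example above; table name and staging path are hypothetical:

      import org.apache.hadoop.hbase.TableName
      import org.apache.hadoop.hbase.spark.KeyFamilyQualifier
      import org.apache.hadoop.hbase.util.Bytes

      // Invented records of (rowKey, family, qualifier, value).
      val cells = sc.parallelize(Seq(
        ("row1", "cf1", "a", "foo1"),
        ("row2", "cf1", "b", "foo2")))

      hbaseContext.bulkLoad[(String, String, String, String)](
        cells,
        TableName.valueOf("tableName"),
        t => {
          // The key carries rowKey + family + qualifier so the shuffle can
          // sort cells as HFiles require; partitioning uses only the rowKey.
          val key = new KeyFamilyQualifier(
            Bytes.toBytes(t._1), Bytes.toBytes(t._2), Bytes.toBytes(t._3))
          Iterator((key, Bytes.toBytes(t._4)))
        },
        "/tmp/hfile-staging")
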
  25. class LessThanLogicExpression extends DynamicLogicExpression with CompareTrait

    Annotations
    @Private()
  26. class LessThanOrEqualLogicExpression extends DynamicLogicExpression with CompareTrait

    Annotations
    @Private()
  27. class NewHBaseRDD[K, V] extends NewHadoopRDD[K, V]

    Annotations
    @Public()
  28. class OrLogicExpression extends DynamicLogicExpression

    Annotations
    @Private()
  29. class PassThroughLogicExpression extends DynamicLogicExpression

    Annotations
    @Private()
  30. case class PutConverterFactory(catalog: HBaseTableCatalog, schema: StructType) extends Product with Serializable

  31. class RowKeyFilter extends Serializable

    Contains information related to the filters for the row key. This can contain many ranges or points.

    Annotations
    @Private()
  32. class ScanRange extends Serializable

    Construct to contain a single scan range's information. Also provides functions to merge with other scan ranges through AND or OR operators.

    Annotations
    @Private()

Value Members

  1. object DefaultSourceStaticUtils

    Status object to store static functions but also to hold last executed information that can be used for unit testing.

    Annotations
    @Private()
  2. object DynamicFieldStructure

  3. object DynamicLogicExpressionBuilder

    Annotations
    @Private()
  4. object HBaseDStreamFunctions

    HBaseDStreamFunctions contains a set of implicit functions that can be applied to a Spark DStream so that we can easily interact with HBase.

    Annotations
    @Public()
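
    A streaming sketch of the implicit syntax, with an invented socket source and the hbaseContext from the HBaseContext example; each line becomes a Put:

      import org.apache.hadoop.hbase.TableName
      import org.apache.hadoop.hbase.client.Put
      import org.apache.hadoop.hbase.spark.HBaseDStreamFunctions._
      import org.apache.hadoop.hbase.util.Bytes
      import org.apache.spark.streaming.{Seconds, StreamingContext}

      val ssc = new StreamingContext(sc, Seconds(1))
      val lines = ssc.socketTextStream("localhost", 9999)  // hypothetical source

      lines.hbaseBulkPut(
        hbaseContext,
        TableName.valueOf("tableName"),
        (line: String) => new Put(Bytes.toBytes(line))
          .addColumn(Bytes.toBytes("cf1"), Bytes.toBytes("raw"), Bytes.toBytes(line)))

      ssc.start()
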
  5. object HBaseRDDFunctions

    HBaseRDDFunctions contains a set of implicit functions that can be applied to a Spark RDD so that we can easily interact with HBase.

    Annotations
    @Public()
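
    The same bulk put as in the HBaseContext example, written with the implicit RDD syntax this object provides (rdd and hbaseContext as defined there):

      import org.apache.hadoop.hbase.TableName
      import org.apache.hadoop.hbase.client.Put
      import org.apache.hadoop.hbase.spark.HBaseRDDFunctions._
      import org.apache.hadoop.hbase.util.Bytes

      rdd.hbaseBulkPut(
        hbaseContext,
        TableName.valueOf("tableName"),
        record => new Put(record._1)
          .addColumn(Bytes.toBytes("cf1"), Bytes.toBytes("col1"), record._2))
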
  6. object LatestHBaseContextCache

  7. object PutConverterFactory extends Serializable

  8. package datasources

  9. package example

