package spark
Type Members
-
class
AndLogicExpression extends DynamicLogicExpression
- Annotations
- @Private()
-
class
BulkLoadPartitioner extends Partitioner
A Partitioner implementation that will separate records to different HBase Regions based on region splits.
- Annotations
- @Public()
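A minimal sketch of how the partitioner can be driven from a table's region boundaries, assuming a hypothetical table "myTable" and a small hand-built RDD keyed by KeyFamilyQualifier; repartitionAndSortWithinPartitions lines each Spark partition up with one HBase region, which is what HFile generation expects:

    import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
    import org.apache.hadoop.hbase.client.ConnectionFactory
    import org.apache.hadoop.hbase.spark.{BulkLoadPartitioner, KeyFamilyQualifier}
    import org.apache.hadoop.hbase.util.Bytes
    import org.apache.spark.{SparkConf, SparkContext}

    val sc = new SparkContext(new SparkConf().setAppName("bulk-load-partitioning"))
    val conf = HBaseConfiguration.create()

    // Look up the region boundaries of the (hypothetical) target table.
    val connection = ConnectionFactory.createConnection(conf)
    val startKeys: Array[Array[Byte]] =
      connection.getRegionLocator(TableName.valueOf("myTable")).getStartKeys

    // Repartition so each Spark partition lines up with one HBase region,
    // sorted within partitions as HFile generation requires.
    val cells = sc.parallelize(Seq(
      (new KeyFamilyQualifier(Bytes.toBytes("row-1"), Bytes.toBytes("cf"), Bytes.toBytes("q")),
        Bytes.toBytes("value-1"))))
    val partitioned = cells.repartitionAndSortWithinPartitions(new BulkLoadPartitioner(startKeys))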
-
class
ByteArrayComparable extends Comparable[ByteArrayComparable]
- Annotations
- @Public()
-
class
ByteArrayWrapper extends Comparable[ByteArrayWrapper] with Serializable
This is a wrapper over a byte array so it can be used as a key in a hash map.
- Annotations
- @Public()
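A minimal sketch of why the wrapper exists: raw Array[Byte] keys compare by reference on the JVM, while ByteArrayWrapper supplies value-based equals/hashCode, so equal byte contents resolve to the same map entry:

    import org.apache.hadoop.hbase.spark.ByteArrayWrapper
    import org.apache.hadoop.hbase.util.Bytes

    import scala.collection.mutable

    // Raw byte arrays compare by reference, so they make poor map keys;
    // the wrapper provides value-based equals/hashCode.
    val counts = mutable.HashMap[ByteArrayWrapper, Long]()
    val key = new ByteArrayWrapper(Bytes.toBytes("row-1"))
    counts(key) = counts.getOrElse(key, 0L) + 1L

    // A second wrapper around equal bytes finds the same entry.
    assert(counts(new ByteArrayWrapper(Bytes.toBytes("row-1"))) == 1L)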
-
class
ColumnFamilyQualifierMapKeyWrapper extends Serializable
A wrapper class that will allow both columnFamily and qualifier to be the key of a hashMap. Also allows for finding the value in a hashMap without cloning the HBase value from the HBase Cell object.
- Annotations
- @Public()
-
class
ColumnFilter extends Serializable
Contains information related to the filters for a given column. This can contain many ranges or points.
- Annotations
- @Private()
-
class
ColumnFilterCollection extends AnyRef
A collection of ColumnFilters indexed by column names.
Also contains merge commands that will consolidate the filters per column name.
- Annotations
- @Private()
-
trait
CompareTrait extends AnyRef
- Annotations
- @Private()
-
class
DefaultSource extends RelationProvider with DataSourceRegister with CreatableRelationProvider with StreamSinkProvider with Logging
DefaultSource for integration with Spark's dataframe datasources. This class will produce a relationProvider based on the input given to it from Spark.
This class needs to stay in the current package 'org.apache.hadoop.hbase.spark' for Spark to match the hbase data source name.
In all, this DefaultSource supports the following datasource functionality:
- Scan-range pruning through filter push-down logic based on rowKeys
- Filter push-down logic on HBase Cells
- Qualifier filtering based on columns used in the SparkSQL statement
- Type conversions of basic SQL types; all conversions will be through the HBase Bytes object commands
- Annotations
- @Private()
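A hedged sketch of reading through this data source; the catalog JSON below describes a hypothetical table "table1" and is passed under the HBaseTableCatalog.tableCatalog option key:

    import org.apache.spark.sql.SparkSession
    import org.apache.hadoop.hbase.spark.datasources.HBaseTableCatalog

    // Hypothetical catalog mapping a SparkSQL schema onto an HBase table.
    val catalog =
      """{
        |  "table": {"namespace": "default", "name": "table1"},
        |  "rowkey": "key",
        |  "columns": {
        |    "col0": {"cf": "rowkey", "col": "key", "type": "string"},
        |    "col1": {"cf": "cf1", "col": "col1", "type": "int"}
        |  }
        |}""".stripMargin

    val spark = SparkSession.builder().appName("hbase-read").getOrCreate()
    val df = spark.read
      .options(Map(HBaseTableCatalog.tableCatalog -> catalog))
      .format("org.apache.hadoop.hbase.spark")
      .load()

    // Equality filters on the rowKey column can be pushed down as scan-range pruning.
    df.filter(df("col0") === "row1").show()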
-
trait
DynamicLogicExpression extends AnyRef
Dynamic logic for SQL push down. There is an instance for most common operations and a pass-through for other operations not covered here.
Logic can be nested with And or Or operators.
A logic tree can be written out as a string and reconstructed from that string.
- Annotations
- @Private()
-
class
EqualLogicExpression extends DynamicLogicExpression
- Annotations
- @Private()
-
class
ExecutionRuleForUnitTesting extends AnyRef
- Annotations
- @Private()
-
class
FamiliesQualifiersValues extends Serializable
This object is a clean way to store and sort all cells that will be bulk loaded into a single row.
- Annotations
- @Public()
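A minimal sketch, assuming the += helper used by the thin-record bulk load path, of assembling every cell for one row keyed by a ByteArrayWrapper:

    import org.apache.hadoop.hbase.spark.{ByteArrayWrapper, FamiliesQualifiersValues}
    import org.apache.hadoop.hbase.util.Bytes

    // Collect every cell that belongs to a single row.
    val familyQualifiersValues = new FamiliesQualifiersValues
    familyQualifiersValues += (Bytes.toBytes("cf1"), Bytes.toBytes("q1"), Bytes.toBytes("v1"))
    familyQualifiersValues += (Bytes.toBytes("cf1"), Bytes.toBytes("q2"), Bytes.toBytes("v2"))

    // (rowKey, cells) is the shape consumed by thin-record bulk loading.
    val thinRow = (new ByteArrayWrapper(Bytes.toBytes("row-1")), familyQualifiersValues)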
-
class
FamilyHFileWriteOptions extends Serializable
This object will hold optional data for how a given column family's writer will work.
- Annotations
- @Public()
-
class
GreaterThanLogicExpression extends DynamicLogicExpression with CompareTrait
- Annotations
- @Private()
-
class
GreaterThanOrEqualLogicExpression extends DynamicLogicExpression with CompareTrait
- Annotations
- @Private()
-
case class
HBaseConnectionCacheStat(numTotalRequests: Long, numActualConnectionsCreated: Long, numActiveConnections: Long) extends Product with Serializable
To log the state of 'HBaseConnectionCache'.
- numTotalRequests
number of total connection requests to the cache
- numActualConnectionsCreated
number of actual HBase connections the cache ever created
- numActiveConnections
number of current alive HBase connections the cache is holding
-
class
HBaseConnectionKey extends Logging
Denotes a unique key to an HBase Connection instance. Please refer to 'org.apache.hadoop.hbase.client.HConnectionKey'.
In essence, this class captures the properties in Configuration that may be used in the process of establishing a connection.
-
class
HBaseContext extends Serializable with Logging
HBaseContext is a façade for HBase operations like bulk put, get, increment, delete, and scan.
HBaseContext takes on the responsibility of disseminating the configuration information to the workers as well as managing the life cycle of Connections.
- Annotations
- @Public()
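A minimal bulkPut sketch; the table "t1", column family "cf1", and qualifier "q1" are hypothetical:

    import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
    import org.apache.hadoop.hbase.client.Put
    import org.apache.hadoop.hbase.spark.HBaseContext
    import org.apache.hadoop.hbase.util.Bytes
    import org.apache.spark.{SparkConf, SparkContext}

    val sc = new SparkContext(new SparkConf().setAppName("hbase-bulk-put"))
    val config = HBaseConfiguration.create()

    // The context ships the configuration to the executors once and manages
    // connection life cycles across all subsequent operations.
    val hbaseContext = new HBaseContext(sc, config)

    val rdd = sc.parallelize(Seq(("row-1", "v1"), ("row-2", "v2")))

    // Stream the RDD into HBase as Puts; "t1" is a hypothetical table.
    hbaseContext.bulkPut[(String, String)](
      rdd,
      TableName.valueOf("t1"),
      { case (rowKey, value) =>
        new Put(Bytes.toBytes(rowKey))
          .addColumn(Bytes.toBytes("cf1"), Bytes.toBytes("q1"), Bytes.toBytes(value))
      })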
-
case class
HBaseRelation(parameters: Map[String, String], userSpecifiedSchema: Option[StructType])(sqlContext: SQLContext) extends BaseRelation with PrunedFilteredScan with InsertableRelation with Logging with Product with Serializable
Implementation of Spark BaseRelation that will build up our scan logic, do the scan pruning, filter push down, and value conversions.
- sqlContext
SparkSQL context
- Annotations
- @Private()
-
class
HBaseSink extends Sink with Logging
Created by Agile Lab s.r.l. on 04/11/2017.
-
class
IsNullLogicExpression extends DynamicLogicExpression
- Annotations
- @Private()
-
class
JavaHBaseContext extends Serializable
This is the Java wrapper over HBaseContext, which is written in Scala. This class will be used by developers who want to work with Spark or Spark Streaming in Java.
- Annotations
- @Public()
-
class
KeyFamilyQualifier extends Comparable[KeyFamilyQualifier] with Serializable
This is the key to be used for sorting and shuffling.
We will only partition on the rowKey, but we will sort on all three.
- Annotations
- @Public()
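A small illustration of the ordering: two keys sharing a rowKey and family compare by qualifier, so a bulk load can keep all cells of a row sorted together while partitioning only on the rowKey:

    import org.apache.hadoop.hbase.spark.KeyFamilyQualifier
    import org.apache.hadoop.hbase.util.Bytes

    // Composite key of rowKey, family, and qualifier.
    val a = new KeyFamilyQualifier(Bytes.toBytes("row-1"), Bytes.toBytes("cf"), Bytes.toBytes("q1"))
    val b = new KeyFamilyQualifier(Bytes.toBytes("row-1"), Bytes.toBytes("cf"), Bytes.toBytes("q2"))

    // Same rowKey and family, so the qualifier decides the order.
    assert(a.compareTo(b) < 0)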
-
class
LessThanLogicExpression extends DynamicLogicExpression with CompareTrait
- Annotations
- @Private()
-
class
LessThanOrEqualLogicExpression extends DynamicLogicExpression with CompareTrait
- Annotations
- @Private()
-
class
NewHBaseRDD[K, V] extends NewHadoopRDD[K, V]
- Annotations
- @Public()
-
class
OrLogicExpression extends DynamicLogicExpression
- Annotations
- @Private()
-
class
PassThroughLogicExpression extends DynamicLogicExpression
- Annotations
- @Private()
-
case class
PutConverterFactory(catalog: HBaseTableCatalog, schema: StructType) extends Product with Serializable
-
class
RowKeyFilter extends Serializable
Contains information related to the filters for the row key. This can contain many ranges or points.
- Annotations
- @Private()
-
class
ScanRange extends Serializable
Construct to contain a single scan range's information. Also provides functions to merge with other scan ranges through AND or OR operators.
- Annotations
- @Private()
Value Members
-
object
DefaultSourceStaticUtils
Status object to store static functions but also to hold last executed information that can be used for unit testing.
- Annotations
- @Private()
-
object
DynamicFieldStructure
-
object
DynamicLogicExpressionBuilder
- Annotations
- @Private()
-
object
HBaseRDDFunctions
HBaseRDDFunctions contains a set of implicit functions that can be applied to a Spark RDD so that we can easily interact with HBase.
- Annotations
- @Public()
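A minimal sketch of the implicit enrichment, reusing the hbaseContext and rdd from the HBaseContext example above; with the import in scope, HBase verbs such as hbaseBulkPut hang directly off the RDD:

    import org.apache.hadoop.hbase.TableName
    import org.apache.hadoop.hbase.client.Put
    import org.apache.hadoop.hbase.spark.HBaseRDDFunctions._
    import org.apache.hadoop.hbase.util.Bytes

    // Equivalent to hbaseContext.bulkPut, but phrased as an RDD method.
    rdd.hbaseBulkPut(
      hbaseContext,
      TableName.valueOf("t1"),
      (pair: (String, String)) =>
        new Put(Bytes.toBytes(pair._1))
          .addColumn(Bytes.toBytes("cf1"), Bytes.toBytes("q1"), Bytes.toBytes(pair._2)))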
-
object
LatestHBaseContextCache
-
object
PutConverterFactory extends Serializable