package datasources
Type Members
- abstract class AvroException extends Exception
  Annotations: @Private()
- case class Bound(b: Array[Byte], inc: Boolean) extends Product with Serializable
  Bound represents a boundary for the scan.
  - b: the byte array of the bound
  - inc: whether the bound is inclusive
  Annotations: @Private()
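As a usage illustration, a pair of Bounds can describe a half-open row interval (a minimal sketch; Bytes is the standard HBase utility class, and Range is the case class defined later in this package):

```scala
import org.apache.hadoop.hbase.util.Bytes

// Scan rows in [row-000, row-999): lower bound inclusive, upper exclusive.
val lower = Bound(Bytes.toBytes("row-000"), inc = true)
val upper = Bound(Bytes.toBytes("row-999"), inc = false)
val range = Range(Some(lower), Some(upper))
```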
- case class BoundRange(low: Array[Byte], upper: Array[Byte]) extends Product with Serializable
  The range for a data type whose size is known. Whether the bound is inclusive or exclusive is undefined; it is up to the caller to decide. (See the sketch after BoundRanges below.)
  Annotations: @LimitedPrivate() @Evolving()
- case class BoundRanges(less: Array[BoundRange], greater: Array[BoundRange], value: Array[Byte]) extends Product with Serializable
  Identifies the ranges for a Java primitive type. The caller needs to decide on its own whether each bound is inclusive or exclusive.
  Annotations: @LimitedPrivate() @Evolving()
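As an illustration of both classes, a predicate such as x < 10 over a big-endian-encoded Int maps to two byte-array regions, because negative values compare greater than positive ones in unsigned byte order (a hypothetical sketch; the concrete range decomposition is up to the encoder):

```scala
import org.apache.hadoop.hbase.util.Bytes

val value = Bytes.toBytes(10)
// Regions whose encoded ints satisfy x < 10: non-negative ints below 10,
// plus all negative ints (their sign bit makes them compare greater in
// unsigned byte order).
val less = Array(
  BoundRange(Bytes.toBytes(0), Bytes.toBytes(10)),
  BoundRange(Bytes.toBytes(Int.MinValue), Bytes.toBytes(-1)))
// Regions whose encoded ints satisfy x >= 10.
val greater = Array(BoundRange(Bytes.toBytes(10), Bytes.toBytes(Int.MaxValue)))
val ranges = BoundRanges(less, greater, value)
```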
- trait BytesEncoder extends AnyRef
  The trait that supports a plugin architecture for different encoders/decoders. encode is used to serialize a data type to a byte array, and filter is used to filter out unnecessary records.
  Annotations: @LimitedPrivate() @Evolving()
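A custom plugin might be skeletoned as follows. Note the method signatures are assumptions inferred from the description above, not the verbatim trait definition:

```scala
import org.apache.spark.sql.types.DataType

// Hypothetical skeleton; the real trait's method signatures may differ.
class CustomEncoder extends BytesEncoder {
  // Serialize a value of the given Spark SQL type to a byte array.
  def encode(dt: DataType, value: Any): Array[Byte] = ???

  // Compare an encoded cell against an encoded filter value so that
  // unnecessary records can be filtered out.
  def filter(input: Array[Byte], offset1: Int, length1: Int,
             filterBytes: Array[Byte], offset2: Int, length2: Int,
             ops: JavaBytesEncoder.Value): Boolean = ???

  // Decompose a typed value into the byte ranges it bounds.
  def ranges(in: Any): Option[BoundRanges] = ???
}
```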
- class DoubleSerDes extends SerDes
- case class GetResource(tbr: TableResource, rs: Array[Result]) extends Resource with Product with Serializable
  Annotations: @Private()
- class HBaseTableScanRDD extends RDD[Result] with Logging
  Annotations: @Private()
- type HBaseType = Array[Byte]
- class NaiveEncoder extends BytesEncoder with Logging
  The naive, non-order-preserving encoder/decoder. Because the ordering of Java primitive types is inconsistent with the ordering of their byte-array representations, the data type has to be passed in so that the filter can work correctly; this is done by wrapping the type into the first byte of the serialized array.
  Annotations: @Private()
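The first-byte wrapping can be pictured with a standalone sketch (the tag value here is made up for illustration; the real encoder defines its own codes):

```scala
import org.apache.hadoop.hbase.util.Bytes

object TypeTagDemo {
  // Hypothetical tag value identifying Int payloads.
  val IntCode: Byte = 1

  // Prepend the type tag so the filter can recover the type later.
  def encodeInt(v: Int): Array[Byte] = IntCode +: Bytes.toBytes(v)

  // A filter reads the first byte to learn how to compare the payload.
  def typeOf(encoded: Array[Byte]): Byte = encoded(0)
}
```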
- case class RDDResources(set: HashSet[Resource]) extends Product with Serializable
- case class Range(lower: Option[Bound], upper: Option[Bound]) extends Product with Serializable
  Annotations: @Private()
- trait ReferencedResource extends AnyRef
  Annotations: @Private()
- case class RegionResource(relation: HBaseRelation) extends ReferencedResource with Product with Serializable
  Annotations: @Private()
- trait Resource extends AnyRef
  Annotations: @Private()
- case class ScanResource(tbr: TableResource, rs: ResultScanner) extends Resource with Product with Serializable
  Annotations: @Private()
- case class SchemaConversionException(msg: String) extends AvroException with Product with Serializable
  Annotations: @Private()
- trait SerDes extends AnyRef
- class SerializableConfiguration extends Serializable
  Annotations: @Private()
- case class SerializedFilter(b: Option[Array[Byte]]) extends Product with Serializable
- case class TableResource(relation: HBaseRelation) extends ReferencedResource with Product with Serializable
  Annotations: @Private()
Value Members
- val ByteMax: Byte
- val ByteMin: Byte
- def bytesMax: Null
- def bytesMin: Array[Byte]
- val ord: Ordering[HBaseType]
- implicit val order: Ordering[HBaseType]
- object AvroSerdes
  Annotations: @Private()
- object HBaseResources
  Annotations: @Private()
- object HBaseSparkConf
  The HBase configuration. Users can either set these properties in SparkConf, which takes effect globally, or configure them per table, which overrides the values set in SparkConf. If not set, the default values take effect.
  Annotations: @Public()
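For example, the same property can be set at either scope (a minimal sketch; the property key and data-source name below are illustrative assumptions):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

// Global scope: applies to every table read through the connector.
val conf = new SparkConf().set("hbase.spark.query.cachedrows", "1000")
val spark = SparkSession.builder().config(conf).getOrCreate()

// Per-table scope: overrides the SparkConf value for this read only.
// (Table-mapping options such as the catalog are omitted here.)
val df = spark.read
  .format("org.apache.hadoop.hbase.spark")
  .option("hbase.spark.query.cachedrows", "100")
  .load()
```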
- object HBaseTableScanRDD extends Serializable
- object JavaBytesEncoder extends Enumeration with Logging
- object Points
  Annotations: @Private()
- object Range extends Serializable
  Annotations: @Private()
- object Ranges
  Annotations: @Private()
- object SchemaConverters
  At the top level, the converters provide three high-level interfaces:
  1. toSqlType: takes an Avro schema and returns a SQL schema.
  2. createConverterToSQL: returns a function that converts Avro types to their corresponding Spark SQL representations.
  3. convertTypeToAvro: constructs a converter function for a given Spark SQL data type; this is used when writing Avro records out to disk.
  Annotations: @Private()
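For instance, toSqlType can be exercised like this (a sketch assuming it accepts an org.apache.avro.Schema):

```scala
import org.apache.avro.Schema

val avroSchema = new Schema.Parser().parse(
  """{"type": "record", "name": "User", "fields": [
    |  {"name": "name", "type": "string"},
    |  {"name": "age", "type": "int"}
    |]}""".stripMargin)

// Returns the Spark SQL counterpart of the Avro record schema.
val sqlType = SchemaConverters.toSqlType(avroSchema)
```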
- object SerializedFilter extends Serializable