The Bound represents the boundary for the scan.
The byte array of the bound.
Whether the bound is inclusive or not.
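A minimal sketch of such a bound, assuming hypothetical names (`Bound`, `withinUpper`): a byte array plus an inclusive/exclusive flag, compared in the unsigned lexicographic order HBase uses for row keys.

```scala
// Hypothetical sketch: a scan bound is the byte array plus an
// inclusive/exclusive flag. Names are illustrative, not the actual API.
case class Bound(b: Array[Byte], inc: Boolean)

object BoundDemo {
  // Lexicographic comparison of two byte arrays, treating bytes as
  // unsigned, as HBase does for row keys.
  def compare(x: Array[Byte], y: Array[Byte]): Int = {
    val n = math.min(x.length, y.length)
    var i = 0
    while (i < n) {
      val c = (x(i) & 0xff) - (y(i) & 0xff)
      if (c != 0) return c
      i += 1
    }
    x.length - y.length
  }

  // A row is inside the upper bound if it sorts strictly before it,
  // or equals it when the bound is inclusive.
  def withinUpper(row: Array[Byte], upper: Bound): Boolean = {
    val c = compare(row, upper.b)
    if (upper.inc) c <= 0 else c < 0
  }
}
```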
The ranges for the data types whose size is known. Whether the bound is inclusive or exclusive is undefined, and it is up to the caller to decide.
The class identifies the ranges for a Java primitive type. The caller needs to decide on its own whether the bound is inclusive or exclusive.
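A dependency-free sketch of why a fixed-size primitive may need more than one byte-array range, under assumed names (`BoundRange`, `intRanges`): a signed Int serialized big-endian has its negative values sorting after its non-negative values in unsigned byte order, so its value space splits into two ranges.

```scala
import java.nio.ByteBuffer

// Hypothetical sketch: a range is a low and a high byte-array bound
// whose inclusiveness is left to the caller.
case class BoundRange(low: Array[Byte], up: Array[Byte])

object PrimitiveRanges {
  // For a signed Int serialized big-endian, negative values sort after
  // non-negative ones in unsigned byte order, so the value space splits
  // into two byte-array ranges.
  def intRanges: Seq[BoundRange] = {
    def bytes(i: Int): Array[Byte] = ByteBuffer.allocate(4).putInt(i).array()
    Seq(
      BoundRange(bytes(0), bytes(Int.MaxValue)),  // non-negative values
      BoundRange(bytes(Int.MinValue), bytes(-1))  // negative values
    )
  }
}
```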
The trait to support a plugin architecture for different encoders/decoders. encode is used for serializing the data type to a byte array, and filter is used to filter out the unnecessary records.
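The shape of such a trait might look like the following sketch, with one naive implementation plugged in; the trait name, signatures, and the string-based example are assumptions for illustration, not the connector's actual API.

```scala
// Hypothetical plugin trait: encode serializes a value of a given data
// type to a byte array; filter decides whether a serialized record
// should be kept for a given comparison.
trait SerDes {
  def encode(dataType: String, value: Any): Array[Byte]
  def filter(input: Array[Byte], offset: Int, length: Int,
             filterBytes: Array[Byte]): Boolean
}

// A trivial string-only implementation, purely for illustration.
object StringSerDes extends SerDes {
  def encode(dataType: String, value: Any): Array[Byte] =
    value.toString.getBytes("UTF-8")

  // Keep the record only if it exactly matches the filter bytes.
  def filter(input: Array[Byte], offset: Int, length: Int,
             filterBytes: Array[Byte]): Boolean =
    input.slice(offset, offset + length).sameElements(filterBytes)
}
```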
This is the naive, non-order-preserving encoder/decoder. Because the order of Java primitive types is inconsistent with the order of their byte arrays, the data type has to be passed in so that the filter can work correctly; this is done by wrapping the type into the first byte of the serialized array.
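The tagging idea can be sketched as follows, with assumed names (`NaiveSerDes`, `IntTag`): the encoder prepends one type-tag byte, and the filter side reads that tag to decode the value and compare it as a typed value rather than as raw bytes.

```scala
import java.nio.ByteBuffer

// Hypothetical sketch of a type-tagged serialization: the first byte
// of the array identifies the data type, so a filter can decode and
// compare typed values instead of raw bytes.
object NaiveSerDes {
  val IntTag: Byte = 1

  def encodeInt(v: Int): Array[Byte] =
    ByteBuffer.allocate(5).put(IntTag).putInt(v).array()

  // Decode using the tag in the first byte.
  def decode(bytes: Array[Byte]): Any = bytes(0) match {
    case IntTag => ByteBuffer.wrap(bytes, 1, 4).getInt
    case t      => sys.error(s"unknown type tag $t")
  }
}
```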
This is the HBase configuration. Users can either set it in SparkConf, which takes effect globally, or configure it per table, which overrides the value set in SparkConf. If neither is set, the default value takes effect.
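The precedence described above (per-table value over global SparkConf value over default) can be sketched with plain maps; the resolver name and the sample key are illustrative, not the connector's actual code.

```scala
// Minimal sketch of the lookup precedence: per-table setting wins,
// then the global SparkConf setting, then the default.
object HBaseConfResolver {
  def resolve(key: String,
              perTable: Map[String, String],
              sparkConf: Map[String, String],
              default: String): String =
    perTable.getOrElse(key, sparkConf.getOrElse(key, default))
}
```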
* On the top level, the converters provide three high-level interfaces.
*  1. toSqlType: This function takes an Avro schema and returns a SQL schema.
*  2. createConverterToSQL: Returns a function that is used to convert Avro types to their corresponding Spark SQL representations.
*  3. convertTypeToAvro: This function constructs a converter function for a given Spark SQL data type. This is used when writing Avro records out to disk.
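A dependency-free sketch of the first interface, toSqlType, restricted to primitive Avro types and using string names in place of the real Schema and DataType classes; the type correspondence shown is the standard Avro-to-Spark-SQL mapping, while the real converter additionally recurses through records, arrays, maps, and unions.

```scala
// Sketch of toSqlType for primitive Avro types only, with string names
// standing in for the real Avro Schema and Spark SQL DataType classes.
object AvroToSql {
  def toSqlType(avroType: String): String = avroType match {
    case "string"  => "StringType"
    case "int"     => "IntegerType"
    case "long"    => "LongType"
    case "float"   => "FloatType"
    case "double"  => "DoubleType"
    case "boolean" => "BooleanType"
    case "bytes"   => "BinaryType"
    case other     => sys.error(s"unsupported Avro type: $other")
  }
}
```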