All Classes and Interfaces

This class allows MapReduce jobs to write output in the Accumulo data file format.
Care should be taken to write only data sorted by Key, as this is an important requirement of Accumulo data files.
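
A minimal sketch of wiring a job for this kind of file output, assuming the class described here is org.apache.accumulo.hadoop.mapreduce.AccumuloFileOutputFormat and its fluent configure() builder; the helper name and output path below are illustrative assumptions, not taken from this page:

    import org.apache.accumulo.core.data.Key;
    import org.apache.accumulo.core.data.Value;
    import org.apache.accumulo.hadoop.mapreduce.AccumuloFileOutputFormat;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.mapreduce.Job;

    public class FileOutputSetup {
      // Hypothetical helper: points a job at Accumulo's file output format.
      static void configureFileOutput(Job job, Path outputDir) throws Exception {
        // The job's final output types are Accumulo Key/Value pairs, and
        // reducers must emit them in sorted order (sorted by Key).
        job.setOutputKeyClass(Key.class);
        job.setOutputValueClass(Value.class);
        job.setOutputFormatClass(AccumuloFileOutputFormat.class);

        // Assumed fluent builder: required output path, then store into the job.
        AccumuloFileOutputFormat.configure()
            .outputPath(outputDir)
            .store(job);
      }
    }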
 
This class allows MapReduce jobs to use Accumulo as the source of data.
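
A minimal sketch of configuring a job to read from an Accumulo table, assuming the class described here is org.apache.accumulo.hadoop.mapreduce.AccumuloInputFormat and that client properties are built with org.apache.accumulo.core.client.Accumulo; the instance, table, and credential values are placeholders:

    import java.util.Properties;
    import org.apache.accumulo.core.client.Accumulo;
    import org.apache.accumulo.core.security.Authorizations;
    import org.apache.accumulo.hadoop.mapreduce.AccumuloInputFormat;
    import org.apache.hadoop.mapreduce.Job;

    public class InputSetup {
      static void configureInput(Job job) throws Exception {
        // Placeholder connection details for the Accumulo client.
        Properties clientProps = Accumulo.newClientProperties()
            .to("myInstance", "zkhost:2181")
            .as("mruser", "mrpassword")
            .build();

        job.setInputFormatClass(AccumuloInputFormat.class);

        // Assumed fluent builder: client properties and table are required,
        // scan authorizations are optional, then store into the job.
        AccumuloInputFormat.configure()
            .clientProperties(clientProps)
            .table("mytable")
            .auths(Authorizations.EMPTY)
            .store(job);
      }
    }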
 
This class allows MapReduce jobs to use Accumulo as the sink for data.
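
A minimal sketch of configuring a job to write Mutations back to Accumulo, assuming the class described here is org.apache.accumulo.hadoop.mapreduce.AccumuloOutputFormat; the table name and builder option names are assumptions:

    import java.util.Properties;
    import org.apache.accumulo.core.data.Mutation;
    import org.apache.accumulo.hadoop.mapreduce.AccumuloOutputFormat;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;

    public class OutputSetup {
      static void configureOutput(Job job, Properties clientProps) throws Exception {
        // Tasks emit (table name, Mutation) pairs; the output format hands
        // them to an Accumulo writer.
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(Mutation.class);
        job.setOutputFormatClass(AccumuloOutputFormat.class);

        // Assumed fluent builder: client properties are required; the default
        // table is used when a task does not name a table explicitly.
        AccumuloOutputFormat.configure()
            .clientProperties(clientProps)
            .defaultTable("mytable")
            .store(job);
      }
    }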
 
An implementation of RecordReader that converts Accumulo Key/Value pairs to the user's K/V types.
A base class to be used to create RecordWriter instances that write to Accumulo.
A base class to be used to create RecordWriter instances that write to Accumulo.
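
To illustrate the Key/Value types these record readers and writers move between Accumulo and user code, here is a hypothetical copy-style Mapper; the class name and destination table are assumptions, not part of this page:

    import java.io.IOException;
    import org.apache.accumulo.core.data.Key;
    import org.apache.accumulo.core.data.Mutation;
    import org.apache.accumulo.core.data.Value;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;

    // Reads Accumulo Key/Value pairs (as delivered by the record reader) and
    // emits (table name, Mutation) pairs (as consumed by the record writer).
    public class CopyMapper extends Mapper<Key,Value,Text,Mutation> {
      private static final Text DEST_TABLE = new Text("copy_of_mytable"); // hypothetical

      @Override
      protected void map(Key key, Value value, Context context)
          throws IOException, InterruptedException {
        Mutation m = new Mutation(key.getRow());
        m.put(key.getColumnFamily(), key.getColumnQualifier(), value);
        context.write(DEST_TABLE, m);
      }
    }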
 
This class allows MapReduce jobs to use Accumulo as the source of data.
The BatchInputSplit class.
The BatchInputSplit class.
 
 
Configuration keys for general configuration options.
 
Configuration keys for AccumuloConfiguration.
Builder for all the information needed for the MapReduce job.
Options for the builder.
Required params for the builder.
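
A sketch of the staged builder flow these interfaces imply (required params first, then optional output settings, then store), assuming it is reached through AccumuloFileOutputFormat.configure() as in Accumulo's fluent MapReduce API; the interface names and path are assumptions:

    import org.apache.accumulo.hadoop.mapreduce.AccumuloFileOutputFormat;
    import org.apache.accumulo.hadoop.mapreduce.FileOutputFormatBuilder;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.mapreduce.Job;

    public class FileOutputBuilderFlow {
      static void configure(Job job) throws Exception {
        // Stage 1: required params (the output path).
        FileOutputFormatBuilder.PathParams<Job> required = AccumuloFileOutputFormat.configure();

        // Stage 2: optional output settings become available only after the
        // required path is supplied; file settings such as compression and
        // block sizes belong to this options stage.
        FileOutputFormatBuilder.OutputOptions<Job> options =
            required.outputPath(new Path("/tmp/bulk-output")); // hypothetical path

        // Stage 3: write everything into the job configuration.
        options.store(job);
      }
    }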
 
 
Configuration keys for various features.
Configuration keys for Scanner.
Builder for all the information needed for the MapReduce job.
Required params for the builder.
Optional values to set using the fluent API.
Required params for the builder.
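
A sketch of the staged input builder (required client params, required table params, then optional fluent values), assuming the AccumuloInputFormat.configure() entry point and option methods such as ranges() and fetchColumns() from Accumulo's fluent MapReduce API; all concrete values are placeholders:

    import java.util.Collections;
    import java.util.Properties;
    import org.apache.accumulo.core.client.IteratorSetting;
    import org.apache.accumulo.core.data.Range;
    import org.apache.accumulo.core.security.Authorizations;
    import org.apache.accumulo.hadoop.mapreduce.AccumuloInputFormat;
    import org.apache.hadoop.mapreduce.Job;

    public class InputBuilderFlow {
      static void configure(Job job, Properties clientProps) throws Exception {
        AccumuloInputFormat.configure()
            // Required client params.
            .clientProperties(clientProps)
            // Required table params.
            .table("mytable")
            // Optional values set through the fluent API.
            .auths(new Authorizations("public"))
            .ranges(Collections.singleton(new Range("a", "m")))
            .fetchColumns(Collections.singleton(new IteratorSetting.Column("cf", "cq")))
            .store(job);
      }
    }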
 
This class holds a batch scan configuration for a table.
Hadoop partitioner that uses ranges based on row keys, and optionally sub-bins based on hashing.
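
A sketch of plugging in such a partitioner, assuming it is Accumulo's KeyRangePartitioner with static setSplitFile/setNumSubBins helpers; the class name, helper methods, and split-file path are assumptions:

    import org.apache.accumulo.hadoop.mapreduce.partition.KeyRangePartitioner;
    import org.apache.hadoop.mapreduce.Job;

    public class PartitionByKey {
      static void configurePartitioner(Job job) throws Exception {
        // Route each Accumulo Key to a reducer based on which row-range bin
        // its row falls into; the cut points come from a file of split rows.
        job.setPartitionerClass(KeyRangePartitioner.class);
        KeyRangePartitioner.setSplitFile(job, "/tmp/splits.txt"); // hypothetical split file
        // Optionally hash each range into several sub-bins to spread hot ranges.
        KeyRangePartitioner.setNumSubBins(job, 4);
      }
    }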
 
Configuration keys for various features.
Configuration keys for BatchWriter.
Builder for all the information needed for the MapReduce job.
Required params for the client.
Builder options.
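
A sketch of the output builder flow (required client params, then the remaining builder options, then store), again assuming the AccumuloOutputFormat.configure() entry point; interface and option names such as defaultTable and createTables are assumptions:

    import java.util.Properties;
    import org.apache.accumulo.hadoop.mapreduce.AccumuloOutputFormat;
    import org.apache.accumulo.hadoop.mapreduce.OutputFormatBuilder;
    import org.apache.hadoop.mapreduce.Job;

    public class OutputBuilderFlow {
      static void configure(Job job, Properties clientProps) throws Exception {
        // Required client params come first ...
        OutputFormatBuilder.ClientParams<Job> required = AccumuloOutputFormat.configure();
        // ... then the remaining builder options become available.
        OutputFormatBuilder.OutputOptions<Job> options = required.clientProperties(clientProps);
        options.defaultTable("mytable")   // hypothetical table name
            .createTables(true)
            .store(job);
      }
    }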
 
The RangeInputSplit class.
The RangeInputSplit class.
Hadoop partitioner that uses ranges, and optionally sub-bins based on hashing.
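
And a matching sketch for the plain range partitioner, which bins row IDs supplied as Text keys rather than full Keys; as with the key-based partitioner sketch above, the class name RangePartitioner and its helpers are assumed from Accumulo's partition package:

    import org.apache.accumulo.hadoop.mapreduce.partition.RangePartitioner;
    import org.apache.hadoop.mapreduce.Job;

    public class PartitionByRow {
      static void configurePartitioner(Job job) throws Exception {
        // Same idea as the key-based partitioner above, but the map output
        // key is the row ID as a Text value.
        job.setPartitionerClass(RangePartitioner.class);
        RangePartitioner.setSplitFile(job, "/tmp/splits.txt"); // hypothetical split file
        RangePartitioner.setNumSubBins(job, 4);
      }
    }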