org.apache.hadoop.hbase.io.encoding
Class EncodedDataBlock

java.lang.Object
  extended by org.apache.hadoop.hbase.io.encoding.EncodedDataBlock

@InterfaceAudience.Private
public class EncodedDataBlock
extends Object

Encapsulates a data block compressed using a particular encoding algorithm. Useful for testing and benchmarking.


Constructor Summary
EncodedDataBlock(DataBlockEncoder dataBlockEncoder, DataBlockEncoding encoding, byte[] rawKVs, HFileContext meta)
          Create a buffer which will be encoded using dataBlockEncoder.
 
Method Summary
 byte[] encodeData()
          Do the encoding, but do not cache the encoded data.
static int getCompressedSize(Compression.Algorithm algo, org.apache.hadoop.io.compress.Compressor compressor, byte[] inputBuffer, int offset, int length)
          Find the size of the compressed data, assuming the buffer will be compressed with the given algorithm.
 int getEncodedCompressedSize(Compression.Algorithm comprAlgo, org.apache.hadoop.io.compress.Compressor compressor)
          Estimate the size after the second stage of compression (e.g. LZO).
 Iterator<Cell> getIterator(int headerSize)
          Provides sequential access to the Cells stored in the encoded block.
 int getSize()
          Find the size of the minimal buffer that could store the compressed data.
 String toString()
           
 
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, wait, wait, wait
 

Constructor Detail

EncodedDataBlock

public EncodedDataBlock(DataBlockEncoder dataBlockEncoder,
                        DataBlockEncoding encoding,
                        byte[] rawKVs,
                        HFileContext meta)
Create a buffer which will be encoded using dataBlockEncoder.

Parameters:
dataBlockEncoder - encoder implementing the algorithm used for compression
encoding - encoding type used
rawKVs - raw KeyValue bytes to be encoded
meta - HFile context (HFileContext) describing the block layout
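
A minimal construction sketch, not part of the original Javadoc: it assumes rawKVs holds KeyValues serialized back-to-back in the standard KeyValue wire format, with MVCC timestamps and tags excluded to match the HFileContext built below; the row, family, and qualifier names are illustrative.

import java.io.ByteArrayOutputStream;

import org.apache.hadoop.hbase.KeyValue;
import org.apache.hadoop.hbase.io.encoding.DataBlockEncoding;
import org.apache.hadoop.hbase.io.encoding.EncodedDataBlock;
import org.apache.hadoop.hbase.io.hfile.HFileContext;
import org.apache.hadoop.hbase.io.hfile.HFileContextBuilder;
import org.apache.hadoop.hbase.util.Bytes;

public class EncodedDataBlockExample {
  public static void main(String[] args) throws Exception {
    // Serialize a few KeyValues back-to-back into a single flat buffer.
    ByteArrayOutputStream raw = new ByteArrayOutputStream();
    for (int i = 0; i < 3; i++) {
      KeyValue kv = new KeyValue(Bytes.toBytes("row-" + i), Bytes.toBytes("cf"),
          Bytes.toBytes("q"), Bytes.toBytes("value-" + i));
      raw.write(kv.getBuffer(), kv.getOffset(), kv.getLength());
    }
    byte[] rawKVs = raw.toByteArray();

    // Describe the block layout: no MVCC timestamps, no tags.
    HFileContext meta = new HFileContextBuilder()
        .withIncludesMvcc(false)
        .withIncludesTags(false)
        .build();

    // Wrap the raw buffer so it can be encoded with FAST_DIFF.
    DataBlockEncoding encoding = DataBlockEncoding.FAST_DIFF;
    EncodedDataBlock block =
        new EncodedDataBlock(encoding.getEncoder(), encoding, rawKVs, meta);

    System.out.println(block); // short textual description of the block
  }
}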
Method Detail

getIterator

public Iterator<Cell> getIterator(int headerSize)
Provides sequential access to the Cells stored in the encoded block.

Parameters:
headerSize - header size of the block.
Returns:
Forward, sequential iterator over the Cells in the block.
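
A small sketch of iterating the block, assuming it was built as in the constructor example above; passing HConstants.HFILEBLOCK_HEADER_SIZE as the header size is an assumption for illustration.

import java.util.Iterator;

import org.apache.hadoop.hbase.Cell;
import org.apache.hadoop.hbase.CellUtil;
import org.apache.hadoop.hbase.HConstants;
import org.apache.hadoop.hbase.io.encoding.EncodedDataBlock;
import org.apache.hadoop.hbase.util.Bytes;

public class DumpEncodedBlock {
  // Walks the block once, printing each cell's row key.
  static void dumpRows(EncodedDataBlock block) {
    Iterator<Cell> it = block.getIterator(HConstants.HFILEBLOCK_HEADER_SIZE);
    while (it.hasNext()) {
      Cell cell = it.next();
      System.out.println(Bytes.toString(CellUtil.cloneRow(cell)));
    }
  }
}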

getSize

public int getSize()
Find the size of the minimal buffer that could store the compressed data.

Returns:
Size in bytes of compressed data.
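
Continuing the constructor sketch above, an illustrative way to gauge the space saved by the encoding:

// getSize() reports the size of the encoded ("compressed") form.
int encodedSize = block.getSize();
System.out.printf("raw=%d bytes, encoded=%d bytes (%.1f%% saved)%n",
    rawKVs.length, encodedSize,
    100.0 * (rawKVs.length - encodedSize) / rawKVs.length);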

getCompressedSize

public static int getCompressedSize(Compression.Algorithm algo,
                                    org.apache.hadoop.io.compress.Compressor compressor,
                                    byte[] inputBuffer,
                                    int offset,
                                    int length)
                             throws IOException
Find the size of the compressed data, assuming the buffer will be compressed with the given algorithm.

Parameters:
algo - compression algorithm
compressor - compressor already requested from codec
inputBuffer - Array to be compressed.
offset - Offset to beginning of the data.
length - Length to be compressed.
Returns:
Size of compressed data in bytes.
Throws:
IOException
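
A sketch of measuring the compressed size of a raw buffer with GZ (chosen here only because it needs no native codec); the compressor is requested from, and returned to, the algorithm's pool.

import java.io.IOException;

import org.apache.hadoop.hbase.io.compress.Compression;
import org.apache.hadoop.hbase.io.encoding.EncodedDataBlock;
import org.apache.hadoop.io.compress.Compressor;

public class RawCompressedSize {
  // Size of rawKVs after GZ compression alone (no data block encoding).
  static int gzSize(byte[] rawKVs) throws IOException {
    Compression.Algorithm algo = Compression.Algorithm.GZ;
    Compressor compressor = algo.getCompressor();
    try {
      return EncodedDataBlock.getCompressedSize(algo, compressor, rawKVs, 0, rawKVs.length);
    } finally {
      algo.returnCompressor(compressor);
    }
  }
}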

getEncodedCompressedSize

public int getEncodedCompressedSize(Compression.Algorithm comprAlgo,
                                    org.apache.hadoop.io.compress.Compressor compressor)
                             throws IOException
Estimate the size after the second stage of compression (e.g. LZO).

Parameters:
comprAlgo - compression algorithm to be used for compression
compressor - compressor corresponding to the given compression algorithm
Returns:
Size after second stage of compression.
Throws:
IOException
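
A matching sketch for the two-stage estimate (data block encoding followed by GZ compression), again assuming a block built as in the constructor example above.

import java.io.IOException;

import org.apache.hadoop.hbase.io.compress.Compression;
import org.apache.hadoop.hbase.io.encoding.EncodedDataBlock;
import org.apache.hadoop.io.compress.Compressor;

public class EncodedCompressedSize {
  // Estimated size after encoding plus GZ compression.
  static int encodedGzSize(EncodedDataBlock block) throws IOException {
    Compression.Algorithm algo = Compression.Algorithm.GZ;
    Compressor compressor = algo.getCompressor();
    try {
      return block.getEncodedCompressedSize(algo, compressor);
    } finally {
      algo.returnCompressor(compressor);
    }
  }
}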

encodeData

public byte[] encodeData()
Do the encoding, but do not cache the encoded data.

Returns:
encoded data block with header and checksum
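
Continuing the constructor sketch above, a minimal call:

// Per the Javadoc above, the returned bytes include the block header and
// checksum; nothing is cached inside the EncodedDataBlock instance.
byte[] encoded = block.encodeData();
System.out.println("encoded block: " + encoded.length + " bytes");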

toString

public String toString()
Overrides:
toString in class Object


Copyright © 2015 The Apache Software Foundation. All Rights Reserved.