package sparkfits


Type Members

  1. class FitsRecordReader extends RecordReader[LongWritable, Seq[Row]]

    Class to handle the relationship between executors and HDFS when reading a FITS file: File -> InputSplit -> RecordReader (this class) -> Mapper (executors). It extends the abstract class RecordReader from Hadoop. The idea is to describe the splitting of the FITS file into blocks and records in HDFS. First the file is split into physical blocks in HDFS, whose size is given by the Hadoop configuration (typically 128 MB). Then, inside a block, the data is sent to executors record by record (logical split), each record being smaller than 128 MB. The purpose of this class is to describe this second step, that is, the splitting of blocks into records.

    The data is first read in chunks of binary data, then converted to the correct type element by element, and finally grouped into rows.
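The read cycle described above (raw chunk -> typed elements -> rows) follows the standard Hadoop RecordReader contract. A minimal sketch of that contract is shown below; the field names and the record-size logic are illustrative assumptions, not the actual FitsRecordReader implementation.

```scala
import org.apache.hadoop.io.LongWritable
import org.apache.hadoop.mapreduce.{InputSplit, RecordReader, TaskAttemptContext}
import org.apache.spark.sql.Row

// Illustrative skeleton of the RecordReader contract that FitsRecordReader
// fulfils. The actual class reads FITS binary table data; here the bodies
// are placeholders showing where each step happens.
class SketchFitsRecordReader extends RecordReader[LongWritable, Seq[Row]] {
  private val key = new LongWritable(0L)
  private var value: Seq[Row] = Seq.empty
  private var currentPos: Long = 0L
  private var splitEnd: Long = 0L

  override def initialize(split: InputSplit, context: TaskAttemptContext): Unit = {
    // Open the file, parse the HDU header, and position the stream at the
    // start of this logical split (illustrative).
  }

  override def nextKeyValue(): Boolean = {
    if (currentPos >= splitEnd) return false
    // 1. Read a chunk of raw bytes from the split.
    // 2. Convert each element to its Scala type using the header description.
    // 3. Group the converted elements into Rows (stored in `value`).
    key.set(currentPos)
    currentPos = splitEnd // illustrative: pretend one record exhausts the split
    true
  }

  override def getCurrentKey: LongWritable = key
  override def getCurrentValue: Seq[Row] = value
  override def getProgress: Float =
    if (splitEnd == 0L) 1.0f else currentPos.toFloat / splitEnd
  override def close(): Unit = ()
}
```

Returning a `Seq[Row]` per key (rather than a single `Row`) lets one logical record carry a whole batch of table rows to the executor at once.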

Value Members

  1. object FitsLib

    This is the beginning of a FITS library in Scala. You will find a large number of methods to manipulate Binary Table HDUs. There is no support for image HDUs at the moment.

  2. object FitsSchema

    Object to handle the conversion from an HDU header to a DataFrame schema.

  3. object ReadFits
  4. package fits
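From the end-user side, FitsLib's Binary Table support is typically reached through the Spark Data Source API. A hedged usage sketch follows; the format name and the "hdu" option follow the spark-fits documentation, but verify them against your version, and the file path is hypothetical.

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("fits-example")
  .getOrCreate()

// Read one Binary Table HDU of a FITS file into a DataFrame.
val df = spark.read
  .format("fits")                 // short name for the spark-fits data source
  .option("hdu", 1)               // index of the HDU to read (0 is the primary HDU)
  .load("path/to/catalog.fits")   // hypothetical path

df.printSchema()
df.show(5)
```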
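The FitsSchema conversion described above (HDU header -> DataFrame schema) amounts to mapping the header's per-column type codes (TFORMn keywords in the FITS standard) to Spark SQL types. The sketch below is illustrative and covers only a few common codes; the actual FitsSchema object may organize this differently.

```scala
import org.apache.spark.sql.types._

// Map a FITS binary table TFORM code (last character is the data type)
// to a Spark SQL DataType. Codes follow the FITS standard; coverage here
// is deliberately partial.
def tformToSparkType(tform: String): DataType = tform.trim.last match {
  case 'L' => BooleanType // logical
  case 'I' => ShortType   // 16-bit integer
  case 'J' => IntegerType // 32-bit integer
  case 'K' => LongType    // 64-bit integer
  case 'E' => FloatType   // single-precision float
  case 'D' => DoubleType  // double-precision float
  case 'A' => StringType  // character string
  case other => throw new IllegalArgumentException(s"Unsupported TFORM code: $other")
}

// Build a DataFrame schema from (column name, TFORM) pairs read off the header.
def headerToSchema(cols: Seq[(String, String)]): StructType =
  StructType(cols.map { case (name, tform) =>
    StructField(name, tformToSparkType(tform), nullable = true)
  })
```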
