Class

com.sparkfits.FitsLib

Fits

Related Doc: package FitsLib

class Fits extends AnyRef

Main class to handle an HDU of a FITS file.

Linear Supertypes
AnyRef, Any

Instance Constructors

  1. new Fits(hdfsPath: Path, conf: Configuration, hduIndex: Int)

    hdfsPath

    : (Path) Hadoop path containing information about the file to read.

    conf

    : (Configuration) Hadoop configuration containing information about the run.

    hduIndex

    : (Int) Index of the HDU to read (zero-based).
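
    For illustration, a minimal sketch of opening one HDU of a FITS file stored on HDFS. The file URI and HDU index below are hypothetical; Path and Configuration are the standard Hadoop classes.

    import org.apache.hadoop.conf.Configuration
    import org.apache.hadoop.fs.Path
    import com.sparkfits.FitsLib.Fits

    val conf = new Configuration()
    val path = new Path("hdfs://some/path/to/file.fits")
    // Index of the HDU to handle: 0 is the primary HDU, 1 the first extension.
    val hduIndex = 1
    val fits = new Fits(path, conf, hduIndex)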

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  5. val blockBoundaries: FitsBlockBoundaries

  6. val blockHeader: Array[String]

  7. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  8. val data: FSDataInputStream

  9. val empty_hdu: Boolean

  10. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  11. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  12. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  13. val fs: FileSystem

  14. def getBlockBoundaries: FitsBlockBoundaries

    Compute the indices of the first and last bytes of the HDU: hdu_start=header_start, data_start, data_stop, hdu_stop

    returns

    (FitsBlockBoundaries), Instance of FitsBlockBoundaries initialised with the boundaries of the FITS HDU (header+data).
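
    For instance, a minimal sketch of using the boundaries to position the cursor (assuming a Fits instance named fits as in the constructor sketch above; dataStart is the field used in the getRow example below):

    val boundaries = fits.getBlockBoundaries
    // Move the cursor to the first byte of the data block of this HDU.
    fits.setCursor(boundaries.dataStart)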

  15. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  16. def getDataLen(keyValues: Map[String, String]): Long

    Compute the size of a data block.

    keyValues

    : (Map[String, String]) Values from the header (see parseHeader)

    returns

    (Long) Size of the corresponding data block
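
    A minimal sketch with a hypothetical keyword/value map standing in for the output of parseHeader (the keyword values below are made up):

    // Hypothetical keyword -> value pairs, as parsed from the header of a binary table.
    val keyValues: Map[String, String] = Map("BITPIX" -> "8", "NAXIS1" -> "100", "NAXIS2" -> "10000")
    // Size in bytes of the corresponding data block.
    val dataLen: Long = fits.getDataLen(keyValues)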

  17. def getHduType: String

    Check the type of HDU. Available types: BINTABLE, IMAGE, or EMPTY. If the type is not registered, returns NOT UNDERSTOOD. Note: this does not currently work if an image is stored in a primary HDU... TBD.

    returns

    (String) The kind of HDU data.
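
    As an illustration, a sketch of branching on the returned string (the exact strings and their case are assumed from the description above, hence the toUpperCase):

    fits.getHduType.toUpperCase match {
      case "BINTABLE" => println("Binary table HDU")
      case "IMAGE"    => println("Image HDU")
      case "EMPTY"    => println("Empty HDU")
      case other      => println(s"HDU type not understood: $other")
    }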

  18. def getHeaderComments(header: Array[String]): Map[String, String]

    Get the comments of the header. We assume the comments are written after a slash (/).

    header

    : (Array[String]) The header of the HDU.

    returns

    (Map[String, String]), a map of keyword/comment.
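
    A minimal sketch, assuming blockHeader (the value member above) holds the header lines of the current HDU:

    // Map of keyword -> comment for the header lines that carry a comment.
    val comments: Map[String, String] = fits.getHeaderComments(fits.blockHeader)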

  19. def getNHDU: Int

    Return the number of HDUs in the FITS file.

    returns

    (Int) The number of HDUs.
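
    For instance, checking that the requested HDU index exists in the file (hduIndex as in the constructor sketch above):

    val nHdus: Int = fits.getNHDU
    require(hduIndex < nHdus, s"HDU index $hduIndex out of range: the file only contains $nHdus HDUs")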

  20. def getRow(buf: Array[Byte]): List[_]

    Convert a binary row into a typed row. You need to have the cursor at the beginning of a row. Example:

    // Set the cursor at the beginning of the data block
    setCursor(blockBoundaries.dataStart)
    // Initialise your binary row
    val buffer = new Array[Byte](size_of_one_row_in_bytes)
    // Read the first binary row into buffer
    data.read(buffer, 0, size_of_one_row_in_bytes)
    // Convert buffer
    val myrow = getRow(buffer)

    buf

    : (Array[Byte]) Row of bytes read from the data block.

    returns

    (List[_]) The row as list of elements (float, int, string, etc.) with types as given by the header.

  21. def handleBintable: BintableHDU

    Give access to methods concerning BinTable HDU.

    returns

    (BintableHDU)

  22. def handleImage: ImageHDU

    Give access to methods concerning Image HDU.

    returns

    (ImageHDU)
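
    As a sketch, the handler is typically chosen according to the HDU type (hduType is the value member documented below; the string test is an assumption):

    if (fits.hduType.toUpperCase.contains("BINTABLE")) {
      // Access BinTable-specific methods.
      val bintable = fits.handleBintable
    } else if (fits.hduType.toUpperCase.contains("IMAGE")) {
      // Access Image-specific methods.
      val image = fits.handleImage
    }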

  23. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  24. val hdu: HDU

  25. val hduType: String

  26. val isHDUBelowMax: Boolean

  27. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  28. val key: String

  29. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  30. final def notify(): Unit

    Definition Classes
    AnyRef
  31. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  32. val numberOfHdus: Int

  33. def readFullHeaderBlocks: Array[String]

    Read all header blocks of an HDU. The cursor needs to be at the start of the header.

    returns

    (Array[String]) The header as an array of Strings, each String being one line of the header.
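
    For instance, a minimal sketch of reading the full header of the current HDU:

    // Make sure the cursor sits at the first byte of the header.
    fits.resetCursorAtHeader
    // One String per 80-character header line.
    val header: Array[String] = fits.readFullHeaderBlocks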

  34. def readHeaderBlock: Array[String]

    Read one header block of an HDU. The cursor needs to be at the start of the block. We assume that each header row has a standard size of 80 bytes and that a header block has a total size of 2880 bytes.

    returns

    (Array[String]) The header block as an array of Strings, each String being one line of the header.

  35. def registerHeader: Unit

    Register the header in the Hadoop configuration. By doing this, we broadcast the header to the executors. The header is sent as a long String, and can be read properly afterwards using retrieveHeader. Make sure you use the same separators.
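
    A minimal sketch of the round trip, assuming the same Configuration object is visible on both sides:

    // Store the header of this HDU in the Hadoop configuration (broadcast to the executors).
    fits.registerHeader
    // Later on, read it back from the configuration.
    val header: Array[String] = fits.retrieveHeader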

  36. def resetCursorAtData: Unit

    Place the cursor at the beginning of the data of the block

  37. def resetCursorAtHeader: Unit

    Place the cursor at the beginning of the header of the block

  38. def retrieveHeader: Array[String]

    Retrieve the header from the Hadoop configuration. Make sure you use the same separators as in registerHeader.

    returns

    the header as Array[String]. See readFullHeaderBlocks.

  39. def setCursor(position: Long): Unit

    Set the cursor at the position (byte index, Long).

    position

    : (Long) The byte index to seek in the file.
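
    For instance, a sketch of seeking directly to a given row of the data block (the row size and index are hypothetical values derived from the header; dataStart as in the getRow example above):

    val rowSizeInBytes = 120L // hypothetical size of one row, in bytes
    val rowIndex = 10L        // hypothetical index of the row to read
    fits.setCursor(fits.blockBoundaries.dataStart + rowIndex * rowSizeInBytes)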

  40. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  41. def toString(): String

    Definition Classes
    AnyRef → Any
  42. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  43. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  44. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )

Inherited from AnyRef

Inherited from Any
