Check that the schemas of different FITS files to be added are the same.
Check that the schemas of different FITS files to be added are the same. Throw an AssertionError if not.
: (List[String]) List of files as a list of String.
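A minimal sketch of such a check, assuming a hypothetical helper getSchema that returns the Spark schema inferred from one file's HDU header (not the library's actual internals):

  import org.apache.spark.sql.types.StructType

  // Hypothetical helper: infer the Spark schema from the header of one file's HDU.
  def getSchema(file: String): StructType = ???

  // Assert that every file in the list shares the schema of the first one.
  def checkSchemas(listOfFiles: List[String]): Unit = {
    val reference = getSchema(listOfFiles.head)
    listOfFiles.tail.foreach { file =>
      assert(getSchema(file) == reference,
        s"FITS file $file does not have the same schema as ${listOfFiles.head}")
    }
  }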
Recursively load all FITS files inside a directory.
Recursively load all FITS files inside a directory.
: (RemoteIterator[LocatedFileStatus]) Iterator from a Hadoop Path containing information about files.
: (List[String]) List of accepted extensions. Currently only .fits is available. Default is List("*.fits").
List of files as a list of String.
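A minimal sketch of such a recursive listing with the Hadoop API, filtering on the file extension; names and the extension filter are illustrative, not the library's actual internals:

  import org.apache.hadoop.conf.Configuration
  import org.apache.hadoop.fs.Path
  import scala.collection.mutable.ListBuffer

  // List all files under `dir` recursively, keeping only the accepted extensions.
  // The documented default "*.fits" is a glob; here a plain suffix is used instead.
  def listFitsFiles(dir: String, extensions: List[String] = List(".fits")): List[String] = {
    val path = new Path(dir)
    val fs = path.getFileSystem(new Configuration())
    val it = fs.listFiles(path, true) // RemoteIterator[LocatedFileStatus], recursive
    val files = ListBuffer[String]()
    while (it.hasNext) {
      val name = it.next().getPath.toString
      if (extensions.exists(ext => name.endsWith(ext))) files += name
    }
    files.toList
  }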
Load the HDU data from several FITS files into a single DataFrame.
Load the HDU data from several FITS files into a single DataFrame. The structure of the HDUs must be the same, that is, they must contain the same number of columns with the same names and element types. The schema of the DataFrame is directly inferred from the header of the FITS HDU.
: (List[String]) List of filenames with the same structure.
(DataFrame) Always one single DataFrame made from the HDU of one FITS file, or from the same kind of HDU from several FITS files.
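A minimal sketch of the union step, assuming a hypothetical single-file loader loadOneHDU (the real method name may differ):

  import org.apache.spark.sql.DataFrame

  // Hypothetical single-file loader returning the DataFrame for one HDU.
  def loadOneHDU(file: String): DataFrame = ???

  // Load each file and union the resulting DataFrames into one.
  def loadSeveralHDU(files: List[String]): DataFrame =
    files.map(loadOneHDU).reduce(_ union _)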
Create a DataFrame from the data of one HDU.
Create a DataFrame from the data of one HDU. The input can be either the path to one FITS file (path + filename), or the path to a directory containing FITS files. In the latter case, the code will load all FITS files listed inside this directory and make the union of the HDU data. Needless to say, the FITS files must have the same structure, otherwise the union will be impossible. The format of the input must be a String with Hadoop format.
The schema of the DataFrame is directly inferred from the header of the FITS HDU.
: (String) Filename of the FITS file to be read, or a directory containing FITS files with the same HDU structure.
(DataFrame) Always one single DataFrame made from the HDU of one FITS file, or from the same kind of HDU from several FITS files.
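For example, with the readfits syntax documented later in this section (the path is hypothetical, and a SparkSession named spark with the implicit in scope is assumed), pointing load at a directory would union all the FITS files it contains:

  // Load HDU #1 from every FITS file under the directory (hypothetical path).
  val df = spark.readfits
    .option("HDU", 1)
    .load("hdfs://path/to/dir_with_fits_files")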
Load the BinaryTableHDU data contained in one HDU as a DataFrame.
Load the BinaryTableHDU data contained in one HDU as a DataFrame. The schema of the DataFrame is directly inferred from the header of the FITS HDU.
: (String) Path + filename of the FITS file to be read.
: DataFrame made from one single HDU.
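A quick usage sketch, assuming the readfits syntax, a SparkSession named spark, and a hypothetical file path; the printed schema comes straight from the HDU header:

  val df = spark.readfits
    .option("HDU", 1)
    .load("hdfs://path/to/catalog.fits")

  // Column names and types are inferred from the FITS header.
  df.printSchema()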
Adds an input option for reading the underlying data source.
Adds an input option for reading the underlying data source. (key, Double)
: (String) Name of the option
: (Double) Value of the option.
Adds an input option for reading the underlying data source.
Adds an input option for reading the underlying data source. (key, Long)
: (String) Name of the option
: (Long) Value of the option.
Adds an input option for reading the underlying data source.
Adds an input option for reading the underlying data source. (key, boolean)
: (String) Name of the option
: (Boolean) Value of the option.
Adds an input option for reading the underlying data source.
Adds an input option for reading the underlying data source.
In general you can set the following option(s):
- option("HDU", <Int>)
- option("datatype", <String>)
- option("printHDUHeader", <Boolean>)
Note that values passed as Boolean, Long, or Double will first be converted to String and then decoded later on.
: (String) Name of the option
: (String) Value of the option.
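A hedged sketch of chaining the options listed above (file path hypothetical, SparkSession named spark and the readfits implicit assumed in scope); the non-String overloads simply stringify the value before storing it:

  val df = spark.readfits
    .option("HDU", 1)               // Int/Long value, converted to String internally
    .option("printHDUHeader", true) // Boolean value, converted to String internally
    .load("hdfs://path/to/catalog.fits")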
Replace the current syntax in Spark 2.X, spark.read.format("fits") --> spark.readfits. This is a hack to avoid touching the DataFrameReader class, whose constructor is private...
Replace the current syntax in Spark 2.X, spark.read.format("fits") --> spark.readfits. This is a hack to avoid touching the DataFrameReader class, whose constructor is private... If you have a better idea, bug me!
FitsContext
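One way such a hack can be wired, as a sketch only (the actual FitsContext implementation and its members may differ): an implicit class on SparkSession exposes a readfits member that returns a FitsContext, so DataFrameReader is never touched.

  import org.apache.spark.sql.{DataFrame, SparkSession}

  // Minimal stand-in for the FitsContext documented above
  // (the real class holds options, an optional user schema, etc.).
  class FitsContext(spark: SparkSession) {
    def option(key: String, value: String): FitsContext = this // record the option, return this for chaining
    def load(path: String): DataFrame = ???
  }

  object FitsImplicits {
    // The hack: an implicit class adds a `readfits` member to SparkSession,
    // avoiding any change to DataFrameReader, whose constructor is private.
    implicit class FitsSparkSession(spark: SparkSession) {
      def readfits: FitsContext = new FitsContext(spark)
    }
  }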
Adds a schema to our data.
Adds a schema to our data. It will overwrite the inferred schema from the HDU header. Useful if the header is corrupted.
: (StructType) The schema for the data (StructType(List(StructField))).
Return the FitsContext (to chain operations).
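A hedged usage sketch, assuming the method is exposed as schema(...) and with invented column names: the user-supplied schema overrides whatever the HDU header says, and the call can be chained.

  import org.apache.spark.sql.types._

  // Hypothetical schema forcing two float columns, replacing the header-inferred one.
  val mySchema = StructType(
    List(
      StructField("RA", FloatType, nullable = true),
      StructField("Dec", FloatType, nullable = true)
    )
  )

  val df = spark.readfits
    .option("HDU", 1)
    .schema(mySchema)
    .load("hdfs://path/to/catalog.fits")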
Adds a method, fitsFile, to SparkSession that allows reading FITS data. Note that for the moment, we provide support only for FITS tables. We will add FITS images later on. The interpreter session below shows how to use basic functionalities:
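The original session is not reproduced in this extract; a representative sketch using the readfits syntax documented above (file path and options are hypothetical) might look like:

  scala> val fn = "hdfs://path/to/catalog.fits"   // hypothetical test file

  scala> val df = spark.readfits
       |   .option("HDU", 1)
       |   .option("printHDUHeader", true)
       |   .load(fn)

  scala> df.printSchema()   // schema inferred from the FITS header

  scala> df.show(5)         // first rows of the binary table HDU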