Construct an RDD[Point3D] from FITS data.
val fn = "src/test/resources/astro_obs.fits" val rdd = new Point3DRDDFromFITS(spark, fn, 1, "Z_COSMO,RA,Dec", true)
: (SparkSession) The spark session
: (String) File name where the data is stored
: (Int) HDU to load.
: (String) Comma-separated names of (x, y, z) columns. Example: "Z_COSMO,RA,Dec".
: (Boolean) If true, it assumes that the coordinates of the Point3D are (r, theta, phi). Otherwise, it assumes cartesian coordinates (x, y, z). Default is false.
(RDD[Point3D])
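All snippets in this section assume an existing SparkSession named spark. As a minimal sketch (standard Spark API, not part of the loader itself), such a session can be created like this:

import org.apache.spark.sql.SparkSession

// Standard Spark boilerplate: create (or reuse) a session called `spark`,
// which is then passed as the first argument of the loaders in this section.
// The local master is only for quick testing; drop it when submitting to a cluster.
val spark = SparkSession.builder()
  .appName("Point3D loading example")
  .master("local[*]")
  .getOrCreate()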
Construct an RDD[Point3D] from CSV, JSON or TXT data.
// CSV
val fn = "src/test/resources/astro_obs.csv"
val rdd = new Point3DRDD(spark, fn, "Z_COSMO,RA,Dec", true)

// JSON
val fn = "src/test/resources/astro_obs.json"
val rdd = new Point3DRDD(spark, fn, "Z_COSMO,RA,Dec", true)

// TXT
val fn = "src/test/resources/astro_obs.txt"
val rdd = new Point3DRDD(spark, fn, "Z_COSMO,RA,Dec", true)
: (SparkSession) The spark session
: (String) File name where the data is stored. Extension must be explicitly written (.csv, .json, or .txt)
: (String) Comma-separated names of (x, y, z) columns. Example: "Z_COSMO,RA,Dec".
: (Boolean) If true, it assumes that the coordinates of the Point3D are (r, theta, phi). Otherwise, it assumes cartesian coordinates (x, y, z).
(RDD[Point3D])
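According to the return type above, the result can be used as a plain RDD[Point3D], so the usual Spark RDD actions apply. A minimal sketch, assuming rdd is the object returned by the CSV example above (if the loader instead wraps the RDD, the same calls apply to the wrapped RDD member):

// Basic sanity checks on the loaded catalogue.
val nPoints = rdd.count()      // number of Point3D entries
val sample = rdd.take(2)       // a couple of Point3D instances
println(s"Loaded $nPoints points; first entries: ${sample.mkString(", ")}")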
Construct an RDD[ShellEnvelope] from FITS data.
val fn = "src/test/resources/cartesian_spheres.fits" val sphereRDD = new SphereRDD(spark, fn, 1, "x,y,z,radius", false)
: (SparkSession) The spark session
: (String) File name where the data is stored
: (Int) HDU to load.
: (String) Comma-separated names of (x, y, z, r) columns to read. Example: "Z_COSMO,RA,Dec,Radius".
: (Boolean) If true, it assumes that the coordinates of the center of the ShellEnvelope are (r, theta, phi). Otherwise, it assumes cartesian coordinates (x, y, z). Default is false.
(RDD[ShellEnvelope])
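The same constructor also handles catalogues whose sphere centers are stored in spherical coordinates, by setting the last argument to true. The file name and column names below are hypothetical, for illustration only:

// Hypothetical FITS catalogue whose center coordinates are (r, theta, phi):
// the boolean flag tells the loader to interpret the first three columns
// as spherical rather than cartesian coordinates.
val fnSph = "src/test/resources/spherical_spheres.fits"  // hypothetical file
val sphereRDDSph = new SphereRDD(spark, fnSph, 1, "r,theta,phi,radius", true)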
Construct an RDD[ShellEnvelope] from CSV, JSON or TXT data.
// CSV
val fn = "src/test/resources/cartesian_spheres.csv"
val rdd = new SphereRDD(spark, fn, "x,y,z,radius", false)

// JSON
val fn = "src/test/resources/cartesian_spheres.json"
val rdd = new SphereRDD(spark, fn, "x,y,z,radius", false)

// TXT
val fn = "src/test/resources/cartesian_spheres.txt"
val rdd = new SphereRDD(spark, fn, "x,y,z,radius", false)
: (SparkSession) The spark session
: (String) File name where the data is stored. Extension must be explicitly written (.csv, .json, or .txt)
: (String) Comma-separated names of (x, y, z, r) columns to read. Example: "Z_COSMO,RA,Dec,Radius".
: (Boolean) If true, it assumes that the coordinates of the center of the ShellEnvelope are (r, theta, phi). Otherwise, it assumes cartesian coordinates (x, y, z). Default is false.
(RDD[ShellEnvelope])
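Since the reader is chosen from the file extension, switching between the supported text-based formats only means changing the file name. A small illustrative sketch reusing the constructor documented above:

// Load the same catalogue from each supported text-based format.
// Only the extension changes; the column list and the coordinate flag stay the same.
val sources = Seq(
  "src/test/resources/cartesian_spheres.csv",
  "src/test/resources/cartesian_spheres.json",
  "src/test/resources/cartesian_spheres.txt"
)
val sphereRDDs = sources.map(f => new SphereRDD(spark, f, "x,y,z,radius", false))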
This section gathers the routines to load data for a specific data format. Currently available: CSV, JSON, TXT, FITS.