io.smartdatalake.workflow.dataobject

RawFileDataObject

case class RawFileDataObject(id: DataObjectId, path: String, fileName: String = "*", partitions: Seq[String] = Seq(), saveMode: SaveMode = SaveMode.Overwrite, acl: Option[AclDef] = None, connectionId: Option[ConnectionId] = None, expectedPartitionsCondition: Option[String] = None, metadata: Option[DataObjectMetadata] = None)(implicit instanceRegistry: InstanceRegistry) extends HadoopFileDataObject with Product with Serializable

DataObject of type raw for files with unknown content. Provides details to an Action to access raw files.

fileName

Definition of fileName. This is concatenated with path and partition layout to search for files. Default is an asterisk to match everything.

saveMode

Overwrite or Append new data.

expectedPartitionsCondition

Optional definition of partitions expected to exist. Define a Spark SQL expression that is evaluated against a PartitionValues instance and returns true or false. Default is to expect all partitions to exist.
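
A minimal, hypothetical construction sketch in Scala; the import paths, the registry setup, and all concrete values are assumptions for illustration, not part of this API's documentation:

    import io.smartdatalake.config.InstanceRegistry
    import io.smartdatalake.config.SdlConfigObject.DataObjectId
    import io.smartdatalake.workflow.dataobject.RawFileDataObject

    // Assumed: a registry as it would otherwise be parsed from the SDL configuration.
    implicit val instanceRegistry: InstanceRegistry = new InstanceRegistry()

    val rawDo = RawFileDataObject(
      id = DataObjectId("raw-files"),
      path = "/data/raw",                // root path of the files handled by this DataObject
      fileName = "*.bin",                // concatenated with path and partition layout
      partitions = Seq("year", "month"), // partition columns
      // Assumed attribute name: the SQL expression is evaluated against the PartitionValues elements.
      expectedPartitionsCondition = Some("elements['year'] > '2020'")
    )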

Linear Supertypes
Serializable, Serializable, Product, Equals, HadoopFileDataObject, CanCreateOutputStream, CanCreateInputStream, FileRefDataObject, FileDataObject, CanHandlePartitions, DataObject, SmartDataLakeLogger, ParsableFromConfig[DataObject], SdlConfigObject, AnyRef, Any

Instance Constructors

  1. new RawFileDataObject(id: DataObjectId, path: String, fileName: String = "*", partitions: Seq[String] = Seq(), saveMode: SaveMode = SaveMode.Overwrite, acl: Option[AclDef] = None, connectionId: Option[ConnectionId] = None, expectedPartitionsCondition: Option[String] = None, metadata: Option[DataObjectMetadata] = None)(implicit instanceRegistry: InstanceRegistry)

    fileName

    Definition of fileName. This is concatenated with path and partition layout to search for files. Default is an asterisk to match everything.

    saveMode

    Overwrite or Append new data.

    expectedPartitionsCondition

    Optional definition of partitions expected to exist. Define a Spark SQL expression that is evaluated against a PartitionValues instance and returns true or false. Default is to expect all partitions to exist.

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. val acl: Option[AclDef]

    Return the ACL definition for the Hadoop path of this DataObject

    Definition Classes
    RawFileDataObject → HadoopFileDataObject
    See also

    org.apache.hadoop.fs.permission.AclEntry

  5. def applyAcls(implicit session: SparkSession): Unit

    Attributes
    protected[io.smartdatalake.workflow]
    Definition Classes
    HadoopFileDataObject
  6. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  7. def checkFilesExisting(implicit session: SparkSession): Boolean

    Check if the input files exist.

    Attributes
    protected
    Definition Classes
    HadoopFileDataObject
    Exceptions thrown

    IllegalArgumentException if failIfFilesMissing = true and no files found at path.

  8. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  9. val connection: Option[HadoopFileConnection]

    Attributes
    protected
    Definition Classes
    HadoopFileDataObject
  10. val connectionId: Option[ConnectionId]

    Return the connection id.

    Connection defines path prefix (scheme, authority, base path) and ACLs in a central location.

    Definition Classes
    RawFileDataObject → HadoopFileDataObject
  11. def createEmptyPartition(partitionValues: PartitionValues)(implicit session: SparkSession): Unit

    Create an empty partition.

    Definition Classes
    HadoopFileDataObject → CanHandlePartitions
  12. def createInputStream(path: String)(implicit session: SparkSession): InputStream

    Definition Classes
    HadoopFileDataObject → CanCreateInputStream
  13. final def createMissingPartitions(partitionValues: Seq[PartitionValues])(implicit session: SparkSession): Unit

    Create empty partitions for partition values not yet existing

    Definition Classes
    CanHandlePartitions
  14. def createOutputStream(path: String, overwrite: Boolean)(implicit session: SparkSession): OutputStream

    Definition Classes
    HadoopFileDataObject → CanCreateOutputStream
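
    Since raw files have no schema, Actions read and write their content through these stream methods. A hedged usage sketch, reusing the hypothetical rawDo instance from the constructor example above (paths and payload are illustrative):

        import java.io.{InputStream, OutputStream}
        import org.apache.spark.sql.SparkSession

        implicit val session: SparkSession = SparkSession.builder().master("local[*]").getOrCreate()

        // Write some bytes, overwriting an existing file of the same name.
        val out: OutputStream = rawDo.createOutputStream("/data/raw/year=2021/month=01/part-0000.bin", overwrite = true)
        try out.write("payload".getBytes("UTF-8")) finally out.close()

        // Read them back.
        val in: InputStream = rawDo.createInputStream("/data/raw/year=2021/month=01/part-0000.bin")
        try {
          val bytes = Iterator.continually(in.read()).takeWhile(_ != -1).map(_.toByte).toArray
          println(s"read ${bytes.length} bytes")
        } finally in.close()
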
  15. def deleteAll(implicit session: SparkSession): Unit

    Delete all data. This is used to implement SaveMode.Overwrite.

    Definition Classes
    HadoopFileDataObject → FileRefDataObject
  16. def deleteFileRefs(fileRefs: Seq[FileRef])(implicit session: SparkSession): Unit

    Delete given files. This is used to clean up files after they are processed.

    Definition Classes
    HadoopFileDataObject → FileRefDataObject
  17. def deletePartitions(partitionValues: Seq[PartitionValues])(implicit session: SparkSession): Unit

    Delete Hadoop Partitions.

    Note that this is only possible if every set of column names in partitionValues is a valid "init" of this DataObject's partitions.

    Every valid "init" can be produced by repeatedly removing the last element of a collection. Example:
    - a,b of a,b,c -> OK
    - a,c of a,b,c -> NOK

    Definition Classes
    HadoopFileDataObject → CanHandlePartitions
    See also

    scala.collection.TraversableLike.init
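
    A quick illustration of the "init" rule in plain Scala (standard library only; the column names are hypothetical):

        val partitionCols = Seq("a", "b", "c")
        // inits yields all prefixes obtained by repeatedly dropping the last element:
        // List(List(a, b, c), List(a, b), List(a), List())
        val validInits = partitionCols.inits.toList
        validInits.contains(Seq("a", "b")) // true  -> deleting by (a, b) is possible
        validInits.contains(Seq("a", "c")) // false -> deleting by (a, c) is not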

  18. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  19. val expectedPartitionsCondition: Option[String]

    Optional definition of partitions expected to exist. Define a Spark SQL expression that is evaluated against a PartitionValues instance and returns true or false. Default is to expect all partitions to exist.

    Definition Classes
    RawFileDataObject → CanHandlePartitions
  20. def extractPartitionValuesFromPath(filePath: String): PartitionValues

    Extract partition values from a given file path

    Attributes
    protected
    Definition Classes
    FileRefDataObject
  21. def factory: FromConfigFactory[DataObject]

    Returns the factory that can parse this type (that is, type CO).

    Typically, implementations of this method should return the companion object of the implementing class. The companion object in turn should implement FromConfigFactory.

    returns

    the factory (object) for this class.

    Definition Classes
    RawFileDataObject → ParsableFromConfig
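
    For illustration, a hedged sketch of how this convention plays out, assuming (as the text above suggests) that the companion object RawFileDataObject is the factory; rawDo is the hypothetical instance from the constructor example:

        val f: FromConfigFactory[DataObject] = rawDo.factory
        // By the convention above this is the companion object, which can in turn
        // parse a configuration section back into a RawFileDataObject instance.
        assert(f == RawFileDataObject)
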
  22. def failIfFilesMissing: Boolean

    Configure whether io.smartdatalake.workflow.action.Actions should fail if the input file(s) are missing on the file system.

    Default is false.

    Definition Classes
    HadoopFileDataObject
  23. val fileName: String

    Definition of fileName. This is concatenated with path and partition layout to search for files. Default is an asterisk to match everything.

    Definition Classes
    RawFileDataObject → FileRefDataObject
  24. def filesystem(implicit session: SparkSession): FileSystem

    Create a Hadoop FileSystem API handle for the provided SparkSession.

    Definition Classes
    HadoopFileDataObject
  25. final def filterExpectedPartitionValues(partitionValues: Seq[PartitionValues])(implicit session: SparkSession): Seq[PartitionValues]

    Filter list of partition values by expected partitions condition

    Definition Classes
    CanHandlePartitions
  26. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  27. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  28. def getConnection[T <: Connection](connectionId: ConnectionId)(implicit registry: InstanceRegistry, ct: ClassTag[T], tt: scala.reflect.api.JavaUniverse.TypeTag[T]): T

    Handle class cast exception when getting objects from instance registry

    Attributes
    protected
    Definition Classes
    DataObject
  29. def getConnectionReg[T <: Connection](connectionId: ConnectionId, registry: InstanceRegistry)(implicit ct: ClassTag[T], tt: scala.reflect.api.JavaUniverse.TypeTag[T]): T

    Attributes
    protected
    Definition Classes
    DataObject
  30. def getFileRefs(partitionValues: Seq[PartitionValues])(implicit session: SparkSession): Seq[FileRef]

    List files for given partition values

    partitionValues

    List of partition values to be filtered. If empty all files in root path of DataObject will be listed.

    returns

    List of FileRefs

    Definition Classes
    HadoopFileDataObject → FileRefDataObject
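
    A short usage sketch (hypothetical; rawDo and the implicit SparkSession are taken from the examples above, and PartitionValues is assumed to wrap a column-name-to-value map):

        // Empty sequence: list all files under the DataObject's root path.
        val allFiles: Seq[FileRef] = rawDo.getFileRefs(Seq.empty)

        // Restrict the listing to one partition; keys must match the partition columns.
        val jan2021: Seq[FileRef] = rawDo.getFileRefs(Seq(PartitionValues(Map("year" -> "2021", "month" -> "01"))))
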
  31. def getPartitionString(partitionValues: PartitionValues)(implicit session: SparkSession): Option[String]

    Get partition values formatted by partition layout.

    Definition Classes
    FileRefDataObject
  32. def getPath: String

    Method for subclasses to override the base path for this DataObject. This is for instance needed if pathPrefix is defined in a connection.

    Definition Classes
    HadoopFileDataObject → FileRefDataObject
  33. def getSearchPaths(partitionValues: Seq[PartitionValues])(implicit session: SparkSession): Seq[(PartitionValues, String)]

    Prepare paths to be searched.

    Attributes
    protected
    Definition Classes
    FileRefDataObject
  34. val id: DataObjectId

    A unique identifier for this instance.

    Definition Classes
    RawFileDataObject → DataObject → SdlConfigObject
  35. implicit val instanceRegistry: InstanceRegistry

    Return the InstanceRegistry parsed from the SDL configuration used for this run.

    returns

    the current InstanceRegistry.

    Definition Classes
    RawFileDataObject → HadoopFileDataObject
  36. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  37. def listPartitions(implicit session: SparkSession): Seq[PartitionValues]

    List partitions on data object's root path

    Definition Classes
    HadoopFileDataObject → CanHandlePartitions
  38. lazy val logger: Logger

    Attributes
    protected
    Definition Classes
    SmartDataLakeLogger
  39. val metadata: Option[DataObjectMetadata]

    Additional metadata for the DataObject

    Definition Classes
    RawFileDataObject → DataObject
  40. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  41. final def notify(): Unit

    Definition Classes
    AnyRef
  42. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  43. final def partitionLayout(): Option[String]

    Return a String specifying the partition layout.

    For Hadoop the default partition layout is colname1=<value1>/colname2=<value2>/.../

    Definition Classes
    HadoopFileDataObject → FileRefDataObject
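
    A plain-Scala illustration of that default layout (column names and values are hypothetical):

        val partitionCols = Seq("year", "month")
        val values = Map("year" -> "2021", "month" -> "01")
        val relative = partitionCols.map(c => s"$c=${values(c)}").mkString("", "/", "/")
        // relative == "year=2021/month=01/"
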
  44. val partitions: Seq[String]

    Definition of partition columns

    Definition Classes
    RawFileDataObject → CanHandlePartitions
  45. val path: String

    The root path of the files that are handled by this DataObject.

    Definition Classes
    RawFileDataObject → FileDataObject
  46. def postRead(partitionValues: Seq[PartitionValues])(implicit session: SparkSession, context: ActionPipelineContext): Unit

    Runs operations after reading from DataObject

    Definition Classes
    DataObject
  47. def postWrite(partitionValues: Seq[PartitionValues])(implicit session: SparkSession, context: ActionPipelineContext): Unit

    Runs operations after writing to DataObject

    Definition Classes
    HadoopFileDataObject → DataObject
  48. def preRead(partitionValues: Seq[PartitionValues])(implicit session: SparkSession, context: ActionPipelineContext): Unit

    Runs operations before reading from DataObject

    Definition Classes
    DataObject
  49. def preWrite(implicit session: SparkSession, context: ActionPipelineContext): Unit

    Runs operations before writing to DataObject. Note: As the transformed SubFeed doesn't yet exist in Action.preWrite, no partition values can be passed as parameters as in preRead.

    Definition Classes
    HadoopFileDataObject → DataObject
  50. def prepare(implicit session: SparkSession): Unit

    Prepare & test DataObject's prerequisites.

    This runs during the "prepare" operation of the DAG.

    Definition Classes
    FileDataObject → DataObject
  51. val saveMode: SaveMode

    Overwrite or Append new data.

    Definition Classes
    RawFileDataObject → FileRefDataObject
  52. val separator: Char

    Default separator for paths.

    Attributes
    protected
    Definition Classes
    FileDataObject
  53. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  54. def toStringShort: String

    Definition Classes
    DataObject
  55. def translateFileRefs(fileRefs: Seq[FileRef])(implicit session: SparkSession): Seq[FileRef]

    Given some FileRefs for another DataObject, translate the paths to the root path of this DataObject

    Definition Classes
    FileRefDataObject
  56. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  57. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  58. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )

Inherited from Serializable

Inherited from Serializable

Inherited from Product

Inherited from Equals

Inherited from HadoopFileDataObject

Inherited from CanCreateOutputStream

Inherited from CanCreateInputStream

Inherited from FileRefDataObject

Inherited from FileDataObject

Inherited from CanHandlePartitions

Inherited from DataObject

Inherited from SmartDataLakeLogger

Inherited from ParsableFromConfig[DataObject]

Inherited from SdlConfigObject

Inherited from AnyRef

Inherited from Any
