Class/Object

io.smartdatalake.workflow.dataobject

SFtpFileRefDataObject

case class SFtpFileRefDataObject(id: DataObjectId, path: String, connectionId: ConnectionId, partitions: Seq[String] = Seq(), partitionLayout: Option[String] = None, saveMode: SaveMode = SaveMode.Overwrite, expectedPartitionsCondition: Option[String] = None, metadata: Option[DataObjectMetadata] = None)(implicit instanceRegistry: InstanceRegistry) extends FileRefDataObject with CanCreateInputStream with CanCreateOutputStream with SmartDataLakeLogger with Product with Serializable

Connects to SFTP files. Requires the Java library "com.hierynomus % sshj % 0.21.1".

The following authentication mechanisms are supported:
  1. Public/private key: the private key must be saved in ~/.ssh and the public key must be registered on the server.
  2. User/password authentication: user and password are taken from two variables set as parameters. These variables can come from clear text (CLEAR), a file (FILE) or an environment variable (ENV).
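
A minimal instantiation sketch (illustrative only): the import paths, the no-argument InstanceRegistry constructor and the DataObjectId/ConnectionId wrappers used below are assumptions, as none of them are documented on this page, and registering the SFTP connection referenced by connectionId is omitted because its class is likewise not documented here.

  import io.smartdatalake.config.InstanceRegistry
  import io.smartdatalake.config.SdlConfigObject.{ConnectionId, DataObjectId}
  import io.smartdatalake.workflow.dataobject.SFtpFileRefDataObject

  // Registry for configured connections and data objects; the SFTP connection
  // referenced below must be registered here first (omitted in this sketch).
  implicit val instanceRegistry: InstanceRegistry = new InstanceRegistry()

  val sftpFiles = SFtpFileRefDataObject(
    id = DataObjectId("sftp-input"),
    path = "/data/incoming",                // root path on the SFTP server
    connectionId = ConnectionId("my-sftp"), // id of the registered SFTP connection
    partitions = Seq("date"),
    partitionLayout = Some("%date%/")       // extract partition column "date" from the path
  )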

partitionLayout

partition layout defines how partition values can be extracted from the path. Use "%<colname>%" as a token to extract the value for a partition column. With "%<colname:regex>%" a regex can be given to limit the search. This is especially useful if there is no character delimiting the last token from the rest of the path, or between two tokens.
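
For illustration (the column names and the path are hypothetical), a layout with two tokens, where the second token needs a regex because no character separates it from the file name:

  // Hypothetical layout: for a path like "20240101/42data.csv" this extracts
  // the partition values date = "20240101" and run = "42".
  val partitionLayout = Some("%date%/%run:[0-9]+%")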

saveMode

Overwrite or Append new data.

expectedPartitionsCondition

Optional definition of partitions expected to exist. Define a Spark SQL expression that is evaluated against a PartitionValues instance and returns true or false. The default is to expect all partitions to exist.
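
For illustration (the column name is hypothetical, and exposing the partition columns to the expression via an elements map of PartitionValues is an assumption not documented on this page):

  // Expect only partitions with date >= "20210101" to exist.
  val expectedPartitionsCondition = Some("elements['date'] >= '20210101'")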

Linear Supertypes
Serializable, Serializable, Product, Equals, CanCreateOutputStream, CanCreateInputStream, FileRefDataObject, FileDataObject, CanHandlePartitions, DataObject, SmartDataLakeLogger, ParsableFromConfig[DataObject], SdlConfigObject, AnyRef, Any

Instance Constructors

  1. new SFtpFileRefDataObject(id: DataObjectId, path: String, connectionId: ConnectionId, partitions: Seq[String] = Seq(), partitionLayout: Option[String] = None, saveMode: SaveMode = SaveMode.Overwrite, expectedPartitionsCondition: Option[String] = None, metadata: Option[DataObjectMetadata] = None)(implicit instanceRegistry: InstanceRegistry)


    partitionLayout

    partition layout defines how partition values can be extracted from the path. Use "%<colname>%" as a token to extract the value for a partition column. With "%<colname:regex>%" a regex can be given to limit the search. This is especially useful if there is no character delimiting the last token from the rest of the path, or between two tokens.

    saveMode

    Overwrite or Append new data.

    expectedPartitionsCondition

    Optional definition of partitions expected to exist. Define a Spark SQL expression that is evaluated against a PartitionValues instance and returns true or false. The default is to expect all partitions to exist.

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  5. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  6. val connectionId: ConnectionId

  7. def createInputStream(path: String)(implicit session: SparkSession): InputStream

    Definition Classes
    SFtpFileRefDataObject → CanCreateInputStream
  8. def createOutputStream(path: String, overwrite: Boolean)(implicit session: SparkSession): OutputStream

    Definition Classes
    SFtpFileRefDataObject → CanCreateOutputStream
  9. def deleteAll(implicit session: SparkSession): Unit


    Delete all data.

    Delete all data. This is used to implement SaveMode.Overwrite.

    Definition Classes
    FileRefDataObject
  10. def deleteFileRefs(fileRefs: Seq[FileRef])(implicit session: SparkSession): Unit


    Delete given files.

    Delete given files. This is used to cleanup files after they are processed.

    Definition Classes
    SFtpFileRefDataObject → FileRefDataObject
  11. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  12. val expectedPartitionsCondition: Option[String]


    Optional definition of partitions expected to exist.

    Optional definition of partitions expected to exist. Define a Spark SQL expression that is evaluated against a PartitionValues instance and returns true or false. The default is to expect all partitions to exist.

    Definition Classes
    SFtpFileRefDataObject → CanHandlePartitions
  13. def extractPartitionValuesFromPath(filePath: String): PartitionValues


    Extract partition values from a given file path

    Extract partition values from a given file path

    Attributes
    protected
    Definition Classes
    FileRefDataObject
  14. def factory: FromConfigFactory[DataObject]


    Returns the factory that can parse this type (that is, type CO).

    Returns the factory that can parse this type (that is, type CO).

    Typically, implementations of this method should return the companion object of the implementing class. The companion object in turn should implement FromConfigFactory.

    returns

    the factory (object) for this class.

    Definition Classes
    SFtpFileRefDataObject → ParsableFromConfig
  15. val fileName: String


    Definition of fileName.

    Definition of fileName. Default is an asterisk to match everything. This is concatenated with the partition layout to search for files.

    Definition Classes
    FileRefDataObject
  16. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  17. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  18. def getConnection[T <: Connection](connectionId: ConnectionId)(implicit registry: InstanceRegistry, ct: ClassTag[T], tt: scala.reflect.api.JavaUniverse.TypeTag[T]): T


    Handle class cast exception when getting objects from instance registry

    Handle class cast exception when getting objects from instance registry

    Attributes
    protected
    Definition Classes
    DataObject
  19. def getConnectionReg[T <: Connection](connectionId: ConnectionId, registry: InstanceRegistry)(implicit ct: ClassTag[T], tt: scala.reflect.api.JavaUniverse.TypeTag[T]): T

    Attributes
    protected
    Definition Classes
    DataObject
  20. def getFileRefs(partitionValues: Seq[PartitionValues])(implicit session: SparkSession): Seq[FileRef]


    List files for given partition values

    List files for given partition values. A usage sketch is shown below, after this member list.

    partitionValues

    List of partition values to be filtered. If empty all files in root path of DataObject will be listed.

    returns

    List of FileRefs

    Definition Classes
    SFtpFileRefDataObject → FileRefDataObject
  21. def getPartitionString(partitionValues: PartitionValues)(implicit session: SparkSession): Option[String]


    get partition values formatted by partition layout

    get partition values formatted by partition layout

    Definition Classes
    FileRefDataObject
  22. def getPath: String


    Method for subclasses to override the base path for this DataObject.

    Method for subclasses to override the base path for this DataObject. This is for instance needed if pathPrefix is defined in a connection.

    Definition Classes
    FileRefDataObject
  23. def getSearchPaths(partitionValues: Seq[PartitionValues])(implicit session: SparkSession): Seq[(PartitionValues, String)]


    prepare paths to be searched

    prepare paths to be searched

    Attributes
    protected
    Definition Classes
    FileRefDataObject
  24. val id: DataObjectId


    A unique identifier for this instance.

    A unique identifier for this instance.

    Definition Classes
    SFtpFileRefDataObject → DataObject → SdlConfigObject
  25. implicit val instanceRegistry: InstanceRegistry

  26. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  27. def listPartitions(implicit session: SparkSession): Seq[PartitionValues]


    List partitions on data object's root path

    List partitions on data object's root path

    Definition Classes
    SFtpFileRefDataObject → CanHandlePartitions
  28. lazy val logger: Logger

    Attributes
    protected
    Definition Classes
    SmartDataLakeLogger
  29. val metadata: Option[DataObjectMetadata]


    Additional metadata for the DataObject

    Additional metadata for the DataObject

    Definition Classes
    SFtpFileRefDataObject → DataObject
  30. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  31. final def notify(): Unit

    Definition Classes
    AnyRef
  32. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  33. val partitionLayout: Option[String]


    partition layout defines how partition values can be extracted from the path.

    partition layout defines how partition values can be extracted from the path. Use "%<colname>%" as a token to extract the value for a partition column. With "%<colname:regex>%" a regex can be given to limit the search. This is especially useful if there is no character delimiting the last token from the rest of the path, or between two tokens.

    Definition Classes
    SFtpFileRefDataObject → FileRefDataObject
  34. val partitions: Seq[String]


    Definition of partition columns

    Definition of partition columns

    Definition Classes
    SFtpFileRefDataObject → CanHandlePartitions
  35. val path: String


    The root path of the files that are handled by this DataObject.

    The root path of the files that are handled by this DataObject.

    Definition Classes
    SFtpFileRefDataObject → FileDataObject
  36. def prepare(implicit session: SparkSession): Unit


    Prepare & test DataObject's prerequisites

    Prepare & test DataObject's prerequisites

    This runs during the "prepare" operation of the DAG.

    Definition Classes
    SFtpFileRefDataObject → FileDataObject → DataObject
  37. val saveMode: SaveMode


    Overwrite or Append new data.

    Overwrite or Append new data.

    Definition Classes
    SFtpFileRefDataObject → FileRefDataObject
  38. val separator: Char


    default separator for paths

    default separator for paths

    Attributes
    protected
    Definition Classes
    FileDataObject
  39. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  40. def toStringShort: String

    Definition Classes
    DataObject
  41. def translateFileRefs(fileRefs: Seq[FileRef])(implicit session: SparkSession): Seq[FileRef]


    Given some FileRefs for another DataObject, translate the paths to the root path of this DataObject

    Given some FileRefs for another DataObject, translate the paths to the root path of this DataObject

    Definition Classes
    FileRefDataObject
  42. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  43. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  44. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
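
Usage example

A hedged sketch of reading files through getFileRefs and createInputStream from the member list above: it lists the files of one partition and opens an input stream per file. The import path and constructor of PartitionValues and the FileRef field fullPath used here are assumptions, as neither type is documented on this page; error handling is reduced to closing the stream.

  import java.io.InputStream
  import org.apache.spark.sql.SparkSession
  import io.smartdatalake.util.hdfs.PartitionValues
  import io.smartdatalake.workflow.dataobject.SFtpFileRefDataObject

  // Sketch only: lists and reads all files of the partition date = <date>.
  def readPartition(sftpFiles: SFtpFileRefDataObject, date: String)
                   (implicit session: SparkSession): Unit = {
    // List the files belonging to the given partition values
    // (assumed constructor: PartitionValues wraps a Map of column name to value).
    val fileRefs = sftpFiles.getFileRefs(Seq(PartitionValues(Map("date" -> date))))

    fileRefs.foreach { fileRef =>
      // Open an input stream for the file's path (field name fullPath is assumed).
      val in: InputStream = sftpFiles.createInputStream(fileRef.fullPath)
      try {
        // ... consume the stream here ...
      } finally in.close()
    }
  }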
