io.smartdatalake.workflow.dataobject
A reader that reads ActionId values.
A reader that reads ConnectionId values.
A reader that reads DataObjectId values.
A reader that reads ParsableDfTransformer values. Note that a DfSparkTransformer must be parsed according to its 'type' attribute using the SDL ConfigParser.
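As a hedged sketch of what such polymorphic parsing looks like in an SDL configuration: the 'type' attribute selects the concrete transformer subclass to instantiate. Action, data object, and parameter names below are illustrative assumptions, not taken from the source.

```hocon
actions {
  # hypothetical action; ids and SQL are placeholders
  my-copy-action {
    type = CopyAction
    inputId = stg-table
    outputId = int-table
    transformers = [{
      # the 'type' attribute tells the ConfigParser which
      # ParsableDfTransformer implementation to instantiate
      type = SQLDfTransformer
      code = "select * from stg_table where active = true"
    }]
  }
}
```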
A ConfigReader that reads OutputMode values.
The default naming strategy allows lowerCamelCase and hyphen-separated key naming, and fails on superfluous keys.
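A hedged illustration of this naming strategy, assuming a case-class field named `saveMode`; the data object type and path are placeholders:

```hocon
dataObjects {
  my-data-object {
    type = CsvFileDataObject
    path = "file:///tmp/example"    # hypothetical path
    saveMode = Overwrite            # lowerCamelCase key
    # save-mode = Overwrite         # equivalent hyphen-separated key
    # saveModee = Overwrite         # superfluous/unknown key -> parsing fails
  }
}
```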
A ConfigReader that reads StructType values.
This reader parses a StructType from a DDL string.
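As a hedged sketch of what a DDL schema string looks like in configuration (the attribute name `schema` and the data object type are assumptions for illustration; the DDL syntax itself is standard Spark SQL):

```hocon
dataObjects {
  my-json-object {
    type = JsonFileDataObject       # illustrative DataObject type
    path = "file:///tmp/json"       # hypothetical path
    # a Spark DDL string, parsed into a StructType by this reader
    schema = "id INT, name STRING, created TIMESTAMP"
  }
}
```

The same string could be parsed programmatically via Spark's `StructType.fromDDL`.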
This is a workaround needed with Scala 2.11 because the configs library doesn't read default values correctly in a scope with many macros. If we let Scala process the macro in a smaller scope, default values are handled correctly.