Package io.prophecy.libs

package libs

Linear Supertypes
  1. libs
  2. Extension
  3. FixedFileFormatImplicits
  4. SparkFunctions
  5. DataHelpers
  6. Component
  7. UDFUtils
  8. Serializable
  9. Serializable
  10. RestAPIUtils
  11. LazyLogging
  12. ProphecyDataFrame
  13. AnyRef
  14. Any

Type Members

  1. type Aggregate = Dataset[Row] (Definition Classes: Component)
  2. class CDC extends AnyRef
  3. class CLIConf extends ScallopConf
  4. trait Component extends AnyRef
  5. trait ConfigBase extends AnyRef
  6. abstract class ConfigurationFactory[C <: ConfigBase] extends AnyRef
  7. type CreateData = Dataset[Row] (Definition Classes: Component)
  8. type DataFrame1 = Dataset[Row] (Definition Classes: Component)
  9. type DataFrame10 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  10. type DataFrame11 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  11. type DataFrame12 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  12. type DataFrame13 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  13. type DataFrame14 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  14. type DataFrame15 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  15. type DataFrame16 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  16. type DataFrame17 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  17. type DataFrame18 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  18. type DataFrame19 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  19. type DataFrame2 = (DataFrame, DataFrame) (Definition Classes: Component)
  20. type DataFrame20 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  21. type DataFrame21 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  22. type DataFrame22 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  23. type DataFrame3 = (DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  24. type DataFrame4 = (DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  25. type DataFrame5 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  26. type DataFrame6 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  27. type DataFrame7 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  28. type DataFrame8 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  29. type DataFrame9 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
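The DataFrame2 through DataFrame22 aliases name tuples of 2 to 22 DataFrames (Scala's maximum tuple arity), so a component with N outputs can declare a single return type. A minimal sketch, assuming a hypothetical three-way splitter over an invented `region` column:

```scala
import org.apache.spark.sql.DataFrame
import io.prophecy.libs._

// Hypothetical splitter: a three-output shape is exactly the
// DataFrame3 alias, i.e. (DataFrame, DataFrame, DataFrame).
def splitByRegion(in: DataFrame): DataFrame3 = (
  in.filter("region = 'NA'"),
  in.filter("region = 'EU'"),
  in.filter("region = 'APAC'")
)

// Tuple destructuring recovers the individual outputs:
// val (na, eu, apac) = splitByRegion(orders)
```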
  30. trait DataHelpers extends LazyLogging

    Helper utilities for reading data from and writing data to different data sources.

  31. type DataQualityTest = Dataset[Row] (Definition Classes: Component)
  32. type DatabaseInput = Dataset[Row] (Definition Classes: Component)
  33. type Deduplicate = Dataset[Row] (Definition Classes: Component)
  34. case class Description(comment: String) extends Annotation with StaticAnnotation with Product with Serializable
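Description is a StaticAnnotation carrying a free-text comment. A sketch of annotating fields of a hypothetical case class (the record and its fields are invented for illustration):

```scala
import io.prophecy.libs.Description

// Description attaches a human-readable comment to a field or class.
case class Customer(
  @Description("Primary key of the customer") id: Long,
  @Description("ISO-3166 country code") country: String
)
```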
  35. implicit class ExceptionExtension extends AnyRef
  36. implicit class ExtendedDataFrame extends AnyRef (Definition Classes: ProphecyDataFrame)
  37. implicit class ExtendedDataFrameGlobal extends ExtendedDataFrame
  38. implicit class ExtendedStreamingTarget extends AnyRef (Definition Classes: ProphecyDataFrame)
  39. implicit class ExtendedStreamingTargetGlobal extends ExtendedStreamingTarget
  40. trait FFAST extends Positional
  41. case class FFCompoundSchemaRow(compound: FFCompoundType, rows: Seq[FFSchemaRow]) extends FFSchemaRow with Product with Serializable
  42. sealed trait FFCompoundType extends FFAST
  43. case class FFConditionalSchemaRow(condition: String, schemaRow: FFSchemaRow) extends FFSchemaRow with Product with Serializable
  44. sealed trait FFDataFormat extends FFAST
  45. case class FFDateFormat(name: FFTypeName, format: Option[String], miscProperties: Map[String, Any] = Map()) extends FFDataFormat with Product with Serializable
  46. case class FFDateTimeFormat(name: FFTypeName, format: Option[String], miscProperties: Map[String, Any] = Map()) extends FFDataFormat with Product with Serializable
  47. sealed trait FFDefaultVal extends FFAST
  48. case class FFDoubleDefaultVal(value: Double) extends FFDefaultVal with Product with Serializable
  49. case class FFExpressionDefaultVal(value: CustomExpression) extends FFDefaultVal with Product with Serializable
  50. case class FFIncludeFileRow(filePath: String) extends FFSchemaRow with Product with Serializable
  51. case class FFIntDefaultVal(value: Int) extends FFDefaultVal with Product with Serializable
  52. case class FFNoDefaultVal() extends FFDefaultVal with Product with Serializable
  53. case class FFNullDefaultVal(value: Option[Any] = None) extends FFDefaultVal with Product with Serializable
  54. case class FFNumberArrayFormat(name: FFTypeName, precision: Option[Int], scale: Option[Int], arraySizeInfo: Option[String], miscProperties: Map[String, Any] = ...) extends FFDataFormat with Product with Serializable
  55. case class FFNumberFormat(name: FFTypeName, precision: Option[Int], scale: Option[Int], miscProperties: Map[String, Any] = ...) extends FFDataFormat with Product with Serializable
  56. case class FFRecordType(startType: String) extends FFAST with Product with Serializable
  57. case class FFSchemaRecord(recordType: String, rows: Seq[FFSchemaRow]) extends FFAST with Product with Serializable
  58. sealed trait FFSchemaRow extends FFAST
  59. case class FFSimpleSchemaList(rows: Seq[FFSimpleSchemaRow]) extends FFSchemaRow with Product with Serializable
  60. case class FFSimpleSchemaRow(name: String, format: FFDataFormat, value: FFDefaultVal) extends FFSchemaRow with Product with Serializable
  61. case class FFStringArrayFormat(name: FFTypeName, precision: Option[Int], arraySizeInfo: Option[String]) extends FFDataFormat with Product with Serializable
  62. case class FFStringDefaultVal(value: String) extends FFDefaultVal with Product with Serializable
  63. case class FFStringFormat(name: FFTypeName, precision: Option[Int], props: Option[Map[String, String]] = None) extends FFDataFormat with Product with Serializable
  64. case class FFStructArrayType(name1: String, arraySizeInfo: Option[String], typeName: Option[String] = None) extends FFCompoundType with Product with Serializable
  65. case class FFStructFormat(name: FFTypeName, precision: Option[Int]) extends FFDataFormat with Product with Serializable
  66. case class FFStructType(name1: String, typeName: Option[String] = None) extends FFCompoundType with Product with Serializable
  67. case class FFTypeName(name: String, delimiter: Option[String]) extends FFAST with Product with Serializable
  68. case class FFTypeNameWithProperties(name: String, delimiter: Option[String], miscProperties: Map[String, Any] = Map("packed" → false)) extends FFAST with Product with Serializable
  69. case class FFUnionType(name: Option[String] = None, typeName: Option[String] = None) extends FFCompoundType with Product with Serializable
  70. case class FFUnknownFormat(name: FFTypeName, arraySizeInfo: Option[String]) extends FFDataFormat with Product with Serializable
  71. case class FFVoidFormat(name: FFTypeName, size: Option[Int]) extends FFDataFormat with Product with Serializable
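The FF* case classes form the AST for fixed-file-format schemas. A sketch assembling a small record from the constructors listed above (the record-type and type-name strings, the delimiter, and the field names are assumptions; consult the schema parser for the exact vocabulary):

```scala
import io.prophecy.libs._

// A two-field schema: a 10-digit numeric key and a 64-character
// string, both pipe-delimited and without default values.
val schema = FFSchemaRecord(
  recordType = "delimited",
  rows = Seq(
    FFSimpleSchemaRow(
      "customer_id",
      FFNumberFormat(FFTypeName("decimal", Some("|")), Some(10), Some(0)),
      FFNoDefaultVal()
    ),
    FFSimpleSchemaRow(
      "name",
      FFStringFormat(FFTypeName("string", Some("|")), Some(64)),
      FFNoDefaultVal()
    )
  )
)
```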
  72. type FileInput = Dataset[Row] (Definition Classes: Component)
  73. type FileIntermediate = Dataset[Row] (Definition Classes: Component)
  74. type FileOutput = Unit (Definition Classes: Component)
  75. type Filter = Dataset[Row] (Definition Classes: Component)
  76. class FixedFileFormat extends FileFormat with DataSourceRegister with Serializable
  77. implicit class FixedFileFormatDataFrame extends AnyRef (Definition Classes: FixedFileFormatImplicits)
  78. implicit class FixedFileFormatDataFrameGlobal extends FixedFileFormatDataFrame
  79. trait FixedFileFormatImplicits extends AnyRef
  80. implicit class FixedFileFormatSpark extends AnyRef (Definition Classes: FixedFileFormatImplicits)
  81. type FixedFileOutput = Unit (Definition Classes: Component)
  82. class FixedFormatOutputWriter extends OutputWriter
  83. type FlattenSchema = Dataset[Row] (Definition Classes: Component)
  84. type Generate = Dataset[Row] (Definition Classes: Component)
  85. type HashPartition = Dataset[Row] (Definition Classes: Component)
  86. type Join = Dataset[Row] (Definition Classes: Component)
  87. type Limit = Dataset[Row] (Definition Classes: Component)
  88. type Lookup = UserDefinedFunction (Definition Classes: Component)
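The Lookup alias is a UserDefinedFunction. A minimal sketch of a lookup backed by an in-memory Map, built with Spark's standard `udf` helper (the map contents and column names are invented; the library's own lookup helpers may construct this differently):

```scala
import org.apache.spark.sql.expressions.UserDefinedFunction
import org.apache.spark.sql.functions.{col, udf}

// An in-memory lookup table; returning Option[String] yields null
// in the output column when the key is absent.
val countryNames: Map[String, String] =
  Map("US" -> "United States", "DE" -> "Germany")

val lookupCountry: UserDefinedFunction =
  udf((code: String) => countryNames.get(code))

// Usage against a hypothetical column:
// df.withColumn("country_name", lookupCountry(col("country_code")))
```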
  89. case class LookupDataset(datasetId: String, columnName: String) extends Annotation with StaticAnnotation with Product with Serializable
  90. type LookupFileInput = UserDefinedFunction (Definition Classes: Component)
  91. type LookupUnit = Unit (Definition Classes: Component)
  92. trait LookupUtils extends AnyRef
  93. class MDumpReader extends AnyRef
  94. type MultiFileRead = Dataset[Row] (Definition Classes: Component)
  95. type MultiFileWrite = Unit (Definition Classes: Component)
  96. type MultiFileWriteUnit = Unit (Definition Classes: Component)
  97. type MultiJoin = Dataset[Row] (Definition Classes: Component)
  98. type Normalize = Dataset[Row] (Definition Classes: Component)
  99. type OrderBy = Dataset[Row] (Definition Classes: Component)
  100. type OrderByPartition = Dataset[Row] (Definition Classes: Component)
  101. type Prepare = Dataset[Row] (Definition Classes: Component)
  102. implicit class ProphecyDataFrameReader extends AnyRef (Definition Classes: ProphecyDataFrame)
  103. implicit class ProphecyDataFrameWriter[T] extends AnyRef (Definition Classes: ProphecyDataFrame)
  104. type ReadSV = Dataset[Row] (Definition Classes: Component)
  105. type Reformat = Dataset[Row] (Definition Classes: Component)
  106. type Repartition = Dataset[Row] (Definition Classes: Component)
  107. trait RestAPIUtils extends LazyLogging

    Spark utilities for handling REST API connections.

  108. type RoundRobinPartition = Dataset[Row] (Definition Classes: Component)
  109. type RowDistributor = Dataset[Row] (Definition Classes: Component)
  110. type RowDistributor1 = Dataset[Row] (Definition Classes: Component)
  111. type RowDistributor10 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  112. type RowDistributor11 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  113. type RowDistributor12 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  114. type RowDistributor13 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  115. type RowDistributor14 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  116. type RowDistributor15 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  117. type RowDistributor16 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  118. type RowDistributor17 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  119. type RowDistributor18 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  120. type RowDistributor19 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  121. type RowDistributor2 = (DataFrame, DataFrame) (Definition Classes: Component)
  122. type RowDistributor20 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  123. type RowDistributor21 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  124. type RowDistributor22 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  125. type RowDistributor3 = (DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  126. type RowDistributor4 = (DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  127. type RowDistributor5 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  128. type RowDistributor6 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  129. type RowDistributor7 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  130. type RowDistributor8 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  131. type RowDistributor9 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
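The RowDistributorN aliases mirror the DataFrameN tuples: a row distributor routes each input row to one of N outputs. A hand-rolled sketch expressing the routing as disjoint filters (the `status` column and predicates are invented; the actual component may be generated differently):

```scala
import org.apache.spark.sql.DataFrame
import io.prophecy.libs._

// Three disjoint, exhaustive predicates: every row lands in
// exactly one of the three outputs of the RowDistributor3 tuple.
def distributeByStatus(in: DataFrame): RowDistributor3 = (
  in.filter("status = 'new'"),
  in.filter("status = 'open'"),
  in.filter("status NOT IN ('new', 'open')")
)
```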
  132. type Ruleset = Dataset[Row] (Definition Classes: Component)
  133. type SQLStatement = Dataset[Row] (Definition Classes: Component)
  134. type SQLStatement1 = Dataset[Row] (Definition Classes: Component)
  135. type SQLStatement10 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  136. type SQLStatement11 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  137. type SQLStatement12 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  138. type SQLStatement13 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  139. type SQLStatement14 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  140. type SQLStatement15 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  141. type SQLStatement16 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  142. type SQLStatement17 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  143. type SQLStatement18 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  144. type SQLStatement19 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  145. type SQLStatement2 = (DataFrame, DataFrame) (Definition Classes: Component)
  146. type SQLStatement20 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  147. type SQLStatement21 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  148. type SQLStatement22 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  149. type SQLStatement3 = (DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  150. type SQLStatement4 = (DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  151. type SQLStatement5 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  152. type SQLStatement6 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  153. type SQLStatement7 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  154. type SQLStatement8 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  155. type SQLStatement9 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  156. type SQLStatementUnit = Unit (Definition Classes: Component)
  157. type Scan = Dataset[Row] (Definition Classes: Component)
  158. type SchemaTransformer = Dataset[Row] (Definition Classes: Component)
  159. type Script = Dataset[Row] (Definition Classes: Component)
  160. type Script1 = Dataset[Row] (Definition Classes: Component)
  161. type Script10 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  162. type Script11 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  163. type Script12 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  164. type Script13 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  165. type Script14 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  166. type Script15 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  167. type Script16 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  168. type Script17 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  169. type Script18 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  170. type Script19 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  171. type Script2 = (DataFrame, DataFrame) (Definition Classes: Component)
  172. type Script20 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  173. type Script21 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  174. type Script22 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  175. type Script3 = (DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  176. type Script4 = (DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  177. type Script5 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  178. type Script6 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  179. type Script7 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  180. type Script8 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  181. type Script9 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  182. type ScriptUnit = Unit (Definition Classes: Component)
  183. type Select = Dataset[Row] (Definition Classes: Component)
  184. type Sequence = Dataset[Row] (Definition Classes: Component)
  185. type SetOperation = Dataset[Row] (Definition Classes: Component)
  186. type Source = Dataset[Row] (Definition Classes: Component)
  187. trait SparkFunctions extends AnyRef

    Library of Spark functions that implement the various Ab Initio functions used in Ab Initio workflows.

  188. type StreamingTarget = StreamingQuery (Definition Classes: Component)
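StreamingTarget aliases Spark's StreamingQuery: a streaming sink component returns the handle of the started query. A minimal sketch using Spark's console sink (chosen here purely for illustration, not tied to any particular Prophecy component):

```scala
import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.streaming.StreamingQuery

// Start a streaming write and hand back the query handle, which the
// caller can use for awaitTermination(), stop(), or status checks.
def writeToConsole(stream: DataFrame): StreamingQuery =
  stream.writeStream
    .format("console")
    .outputMode("append")
    .start()
```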
  189. class StringAsStream extends Serializable (Definition Classes: SparkFunctions)
  190. type SubGraph = Dataset[Row] (Definition Classes: Component)
  191. type SubGraph1 = Dataset[Row] (Definition Classes: Component)
  192. type SubGraph10 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  193. type SubGraph11 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  194. type SubGraph12 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  195. type SubGraph13 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  196. type SubGraph14 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  197. type SubGraph15 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  198. type SubGraph16 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  199. type SubGraph17 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  200. type SubGraph18 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  201. type SubGraph19 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  202. type SubGraph2 = (DataFrame, DataFrame) (Definition Classes: Component)
  203. type SubGraph20 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  204. type SubGraph21 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  205. type SubGraph22 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  206. type SubGraph3 = (DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  207. type SubGraph4 = (DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  208. type SubGraph5 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  209. type SubGraph6 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  210. type SubGraph7 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  211. type SubGraph8 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  212. type SubGraph9 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame) (Definition Classes: Component)
  213. type SubGraphUnit = Unit (Definition Classes: Component)
  214. type Subgraph = Dataset[Row] (Definition Classes: Component)
  215. type Subgraph1 = Dataset[Row] (Definition Classes: Component)
  216. type Subgraph10 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame)

    Permalink
    Definition Classes
    Component
  217. type Subgraph11 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame)

    Permalink
    Definition Classes
    Component
  218. type Subgraph12 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame)

    Permalink
    Definition Classes
    Component
  219. type Subgraph13 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame)

    Permalink
    Definition Classes
    Component
  220. type Subgraph14 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame)

    Permalink
    Definition Classes
    Component
  221. type Subgraph15 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame)

    Permalink
    Definition Classes
    Component
  222. type Subgraph16 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame)

    Permalink
    Definition Classes
    Component
  223. type Subgraph17 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame)

    Permalink
    Definition Classes
    Component
  224. type Subgraph18 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame)

    Permalink
    Definition Classes
    Component
  225. type Subgraph19 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame)

    Permalink
    Definition Classes
    Component
  226. type Subgraph2 = (DataFrame, DataFrame)

    Permalink
    Definition Classes
    Component
  227. type Subgraph20 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame)

    Permalink
    Definition Classes
    Component
  228. type Subgraph21 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame)

    Permalink
    Definition Classes
    Component
  229. type Subgraph22 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame)

    Permalink
    Definition Classes
    Component
  230. type Subgraph3 = (DataFrame, DataFrame, DataFrame)

    Permalink
    Definition Classes
    Component
  231. type Subgraph4 = (DataFrame, DataFrame, DataFrame, DataFrame)

    Permalink
    Definition Classes
    Component
  232. type Subgraph5 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame)

    Permalink
    Definition Classes
    Component
  233. type Subgraph6 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame)

    Permalink
    Definition Classes
    Component
  234. type Subgraph7 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame)

    Permalink
    Definition Classes
    Component
  235. type Subgraph8 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame)

    Permalink
    Definition Classes
    Component
  236. type Subgraph9 = (DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame, DataFrame)

    Permalink
    Definition Classes
    Component
  237. type SubgraphUnit = Unit

    Permalink
    Definition Classes
    Component
  238. type Target = Unit

    Permalink
    Definition Classes
    Component
  239. trait UDFUtils extends RestAPIUtils with Serializable with LazyLogging

    Permalink

    Utility trait providing various UDFs for miscellaneous tasks.

  240. type UnionAll = Dataset[Row]

    Permalink
    Definition Classes
    Component
  241. case class UsesDataset(id: String, version: Int = 1) extends Annotation with StaticAnnotation with Product with Serializable

    Permalink
    Definition Classes
    Component
  242. case class UsesRuleset(id: String) extends Annotation with StaticAnnotation with Product with Serializable

    Permalink
    Definition Classes
    Component
  243. case class Visual(id: String = "ID", label: String = "Label", x: Long = 0, y: Long = 0, phase: Int = 0, mode: String = "batch", interimMode: String = "full", detailedStats: Boolean = false) extends Annotation with StaticAnnotation with Product with Serializable

    Permalink
    Definition Classes
    Component
  244. type Visualize = Unit

    Permalink
    Definition Classes
    Component
  245. type WindowFunction = Dataset[Row]

    Permalink
    Definition Classes
    Component

Abstract Value Members

  1. abstract def getClass(): Class[_]

    Permalink
    Definition Classes
    Any

Concrete Value Members

  1. final def !=(arg0: Any): Boolean

    Permalink
    Definition Classes
    Any
  2. final def ##(): Int

    Permalink
    Definition Classes
    Any
  3. final def ==(arg0: Any): Boolean

    Permalink
    Definition Classes
    Any
  4. object AbinitioDMLs

    Permalink
  5. object CDC

    Permalink

    Column Dependency Calculator (CDC).

  6. val Calculate_TAT_Hours_Udf: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  7. object Component

    Permalink
  8. object DataFrameValidator

    Permalink
  9. object DataHelpers

    Permalink
  10. object FixedFileFormatImplicits

    Permalink
  11. object FixedFormatHelper

    Permalink
  12. object FixedFormatSchemaImplicits

    Permalink
  13. val Get_Clean_Date_Local_Udf: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  14. val Get_Holiday_Cnt_Diff_Value: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  15. val Get_Holiday_Cnt_Values: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  16. val Get_Ship_Date_Local_Udf: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  17. val Get_V_Final_Days_To_Ship_Udf: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  18. val Get_V_Final_Tat_Hours_Udf: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  19. val Get_full_day_of_week_from_number: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  20. val InterimState: InterimStore.type

    Permalink
    Definition Classes
    ProphecyDataFrame
  21. object LongSequence

    Permalink
    Definition Classes
    SparkFunctions
  22. object RestAPIUtils extends LazyLogging

    Permalink
  23. object SchemaUtils

    Permalink
  24. object SparkFunctions

    Permalink
  25. object SparkTestingUtils

    Permalink
  26. val Weekend_Day_Cnt_Inner: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  27. def YJJJ_to_YYYYJJJ(in_date: Column, ref_date: Column): Column

    Permalink

    Converts a 1-digit Julian year to a 4-digit Julian year.

    in_date

    date in Julian in "YJJJ" format

    ref_date

    date in "yyyyMMdd" format

    returns

    a date in "YYYYJJJ"

    Definition Classes
    SparkFunctions
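    A minimal usage sketch (not from the source): the SparkSession, the DataFrame `df`, and its `julian_date` column are hypothetical, and the package object members are assumed to be importable via `io.prophecy.libs._`.

    ```scala
    import org.apache.spark.sql.functions.{col, lit}
    import io.prophecy.libs._

    // Resolve a single-digit Julian year ("YJJJ") against a yyyyMMdd reference date.
    val full = df.withColumn(
      "julian_full",
      YJJJ_to_YYYYJJJ(col("julian_date"), lit("20210615")))
    ```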
  28. val adjustCenturyDateInCyyFormat: UserDefinedFunction

    Permalink

    The beginning of the input should be in Cyy format.

    Definition Classes
    SparkFunctions
  29. def adjustStringRegexPattern(input: String): String

    Permalink
    Definition Classes
    SparkFunctions
  30. def appendTrailer(pathInputData: String, pathInputTrailer: String, pathOutputConcatenated: String, configuration: Configuration): Unit

    Permalink

    Appends trailer data to every file in the data directory. A single trailer file in the pathInputTrailer directory should correspond to a single data file in the pathInputData directory.

    If a trailer for a given file does not exist, the file is moved as-is to the output directory.

    pathInputData

    Input data files directory

    pathInputTrailer

    Input trailer files directory

    pathOutputConcatenated

    Output concatenated files directory

    configuration

    Hadoop configuration (preferably sparkSession.sparkContext.hadoopConfiguration)

    Definition Classes
    DataHelpers
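    A hedged sketch of a call, assuming a live SparkSession `spark`; the three paths are invented for illustration:

    ```scala
    import io.prophecy.libs._

    // Pair each data file with its trailer file and write the
    // concatenated results to the output directory.
    appendTrailer(
      pathInputData = "/staging/data",
      pathInputTrailer = "/staging/trailers",
      pathOutputConcatenated = "/staging/output",
      configuration = spark.sparkContext.hadoopConfiguration)
    ```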
  31. def arrayColumn(value: String, values: String*): Column

    Permalink

    Takes a variable number of values and creates an array column from them.

    value

    input value

    values

    variable number of input values.

    returns

    an array of column.

    Definition Classes
    UDFUtils
  32. val array_value: UserDefinedFunction

    Permalink

    UDF that finds and returns the element of the input array at the passed index, or null if no element is found.

    Definition Classes
    UDFUtils
  33. final def asInstanceOf[T0]: T0

    Permalink
    Definition Classes
    Any
  34. val bigDecimalToPackedBytes: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  35. val call_rest_api: UserDefinedFunction

    Permalink

    Spark UDF that makes a single blocking REST API call to the given URL. The result of this UDF is always produced, contains a proper error if the call failed at any stage, and never interrupts the job execution (unless called with an invalid signature).

    The default timeout can be configured through the spark.network.timeout Spark configuration option.

    Parameters:

    • method - any supported HTTP1.1 method type, e.g. POST, GET. Complete list: [httpMethods].
    • url - valid url to which a request is going to be made
    • headers - an array of "key: value" headers that are passed with the request
    • content - any content (by default, the supported rest api content type is application/json)

    Response - a struct with the following fields:

    • isSuccess - boolean, whether a successful response has been received
    • status - nullable integer, status code (e.g. 404, 200, etc)
    • headers - an array of name: value response headers (e.g. [Server: akka-http/10.1.10, Date: Tue, 07 Sep 2021 18:11:47 GMT])
    • content - nullable string, response back
    • error - nullable string, if the parameters passed are valid or the system failed to make a call, this field contains an error message
    Definition Classes
    RestAPIUtils
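    A hypothetical sketch of one call per row; the DataFrame `urlsDf` and its `url` column are invented, and the argument order (method, url, headers, content) is assumed from the parameter list above:

    ```scala
    import org.apache.spark.sql.functions.{array, col, lit}
    import io.prophecy.libs._

    // One blocking GET per row; failures land in response.error rather
    // than failing the job.
    val responses = urlsDf.withColumn(
      "response",
      call_rest_api(lit("GET"), col("url"),
        array(lit("Accept: application/json")), lit("")))

    // Keep only successful calls and their bodies.
    val ok = responses
      .filter(col("response.isSuccess"))
      .select(col("url"), col("response.content"))
    ```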
  36. def call_udf(udfName: String, cols: Column*): Column

    Permalink

    Taken from upstream Spark.

    Definition Classes
    UDFUtils
    Annotations
    @varargs()
  37. val canonical_representation: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  38. def castDataType(sparkSession: SparkSession, df: DataFrame, column: Column, dataType: String, replaceColumn: String): DataFrame

    Permalink

    Adds a new typecast column to the input dataframe. The newly added column is a typecast version of the passed column. The cast operation is supported for the string, boolean, byte, short, int, long, float, double, decimal, date and timestamp types.

    sparkSession

    spark session

    df

    input dataframe

    column

    input column to be typecasted

    dataType

    datatype to cast column to.

    replaceColumn

    column name to be added in dataframe.

    returns

    new dataframe with new typecasted column.

    Definition Classes
    UDFUtils
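    A short sketch under stated assumptions: `spark` is a live SparkSession and `df` is a hypothetical DataFrame with a string column `amount`.

    ```scala
    import org.apache.spark.sql.functions.col
    import io.prophecy.libs._

    // Adds "amount_double", a double-typed copy of the string column "amount".
    val typed = castDataType(spark, df, col("amount"), "double", "amount_double")
    ```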
  39. val char_string: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  40. def concatenate(sources: Seq[String], destination: String, compressToGZip: Boolean = false): Unit

    Permalink

    Gets data from multiple source paths and combines it into a single destination path.

    sources

    multiple source paths from which to merge the data.

    destination

    destination path to combine all data to.

    compressToGZip

    flag to compress final output file into gzip format

    Definition Classes
    DataHelpers
  41. def concatenateFiles(spark: SparkSession, format: String, mode: String, inputDir: String, outputFileName: String, deleteTempPath: Boolean = true, fileFormatHasHeaders: Boolean = false): Unit

    Permalink

    Gets data from multiple part files in the source directory and combines it into a single file.

    spark

    spark session

    format

    file extension. e.g.: ".csv", ".txt"

    mode

    write mode in spark. Can be overwrite, append, error and ignore

    inputDir

    input directory containing part files

    outputFileName

    output single file path

    deleteTempPath

    flag to delete temp source directory

    fileFormatHasHeaders

    flag to exclude headers from file

    Definition Classes
    DataHelpers
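    A hedged sketch, assuming a live SparkSession `spark`; the paths are invented for illustration:

    ```scala
    import io.prophecy.libs._

    // Collapse Spark part files into one CSV, dropping the repeated
    // header that each part file carries, then remove the part directory.
    concatenateFiles(
      spark,
      format = ".csv",
      mode = "overwrite",
      inputDir = "/tmp/report_parts",
      outputFileName = "/tmp/report.csv",
      deleteTempPath = true,
      fileFormatHasHeaders = true)
    ```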
  42. def convertInputBytesToStructType(input: Any, typeInfo: Seq[String], startByte: Int = 0): Row

    Permalink

    Method used for Ab Initio's reinterpret_as function to read the necessary bytes from the input byte array and convert them into struct format as specified in the typeInfo sequence.

    TypeInfo can have multiple entries, each could be either decimal or string type. Depending on the argument passed within decimal or string bytes are read from input byte array.

    If decimal or string argument has some integer then that many bytes are read from input byte array or if decimal or string has some string delimiter as its argument then from the current position bytes are read until string delimiter is found in input byte array.

    Definition Classes
    SparkFunctions
  43. package core

    Permalink
  44. def createDataFrameFromData(inputData: String, delimiter: String, columnName: String, columnType: String, sparkSession: SparkSession): DataFrame

    Permalink

    Reads values from inputData, delimited by delimiter, and creates a dataframe whose column has the name columnName and type columnType.

    Definition Classes
    SparkFunctions
  45. def createLookup(name: String, df: DataFrame, spark: SparkSession, keyCols: List[String], rowCols: String*): UserDefinedFunction

    Permalink

    Registers four different UDFs with the Spark registry: lookup, lookup_count, lookup_match and lookup_row. This function stores the data of the input dataframe in a broadcast variable, then uses this broadcast variable in the different lookup functions.

    • lookup - returns the first matching row for the given input keys
    • lookup_count - returns the count of all matching rows for the given input keys
    • lookup_match - returns 0 if there is no matching row and 1 if there are matching rows for the given input keys
    • lookup_row - returns all matching rows for the given input keys

    Up to 10 matching keys are supported as input to these lookup functions.

    name

    UDF Name

    df

    input dataframe

    spark

    spark session

    keyCols

    columns to be used as keys in lookup functions.

    rowCols

    schema of entire row which will be stored for each matching key.

    returns

    registered UDF definitions for the lookup functions. These UDFs return different results depending on the lookup function.

    Definition Classes
    UDFUtils
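    A hypothetical registration-and-use sketch: `spark`, the reference DataFrame `countriesDf`, the fact table `ordersDf`, and all column names are invented; the registered lookup is invoked through the library's own call_udf helper documented in this package.

    ```scala
    import org.apache.spark.sql.functions.col
    import io.prophecy.libs._

    // Register "countries" (and its lookup variants), keyed by country_code;
    // each stored row carries country_code and country_name.
    createLookup("countries", countriesDf, spark,
      List("country_code"), "country_code", "country_name")

    // Return the first matching reference row for each order's country_code.
    val enriched = ordersDf.withColumn(
      "country", call_udf("countries", col("country_code")))
    ```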
  46. def createRangeLookup(name: String, df: DataFrame, spark: SparkSession, minColumn: String, maxColumn: String, valueColumns: String*): UserDefinedFunction

    Permalink

    Creates a UDF that looks up a given input double in the input dataframe. The function first loads the dataframe's data into a broadcast variable, then defines a UDF that searches for the input double value in that data. If the input double lies between a row's minColumn and maxColumn values, the corresponding row is added to the returned result; otherwise null is returned for that row.

    name

    created UDF name

    df

    input dataframe

    spark

    spark session

    minColumn

    column whose value to be considered as minimum in comparison.

    maxColumn

    column whose value to be considered as maximum in comparison.

    valueColumns

    remaining column names to be part of result.

    returns

    a registered UDF which returns the matching rows for each row of the dataframe on which the range UDF is called.

    Definition Classes
    UDFUtils
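    A hedged sketch along the same lines: `spark`, the bracket table `bracketsDf`, the fact table `incomesDf`, and the column names are hypothetical.

    ```scala
    import org.apache.spark.sql.functions.col
    import io.prophecy.libs._

    // Register a range lookup over [min_income, max_income], carrying "rate".
    createRangeLookup("tax_bracket", bracketsDf, spark,
      "min_income", "max_income", "rate")

    // Each income is matched against the registered ranges; non-matches yield null.
    val withRate = incomesDf.withColumn(
      "bracket", call_udf("tax_bracket", col("income")))
    ```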
  47. implicit def createSparkSessionExtension(spark: SparkSession): ProphecySparkSession

    Permalink
    Definition Classes
    Extension
  48. val cross_join_index_range: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  49. package crosssupport

    Permalink
  50. package data

    Permalink
  51. def date_add_months(inputDate: Column, months: Int): Column

    Permalink

    Returns the internal representation of a date resulting from adding (or subtracting) a number of months to the specified date.

    inputDate

    in yyyy-MM-dd format

    Definition Classes
    SparkFunctions
  52. def date_difference_days(laterDate: Column, earlierDate: Column): Column

    Permalink

    Computes the number of days between two specified dates in "yyyyMMdd" format.

    laterDate

    input date

    earlierDate

    input date

    returns

    number of days between laterDate and earlierDate or null if either one is null

    Definition Classes
    SparkFunctions
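    A small sketch combining the two date helpers above; `df` and its columns are hypothetical (invoice_date in yyyy-MM-dd, the other two in yyyyMMdd):

    ```scala
    import org.apache.spark.sql.functions.col
    import io.prophecy.libs._

    val result = df
      // Shift a yyyy-MM-dd date forward by three months.
      .withColumn("due_date", date_add_months(col("invoice_date"), 3))
      // Count days between two yyyyMMdd dates (null if either is null).
      .withColumn("days_open",
        date_difference_days(col("closed_date"), col("opened_date")))
    ```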
  53. val date_month_end: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  54. val datetime_add: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  55. def datetime_add_months(input: Column, months: Int): Column

    Permalink

    Returns the internal representation of a timestamp resulting from adding (or subtracting) a number of months to the specified timestamp.

    input

    timestamp in yyyy-MM-dd HH:mm:ss.SSSS format

    Definition Classes
    SparkFunctions
  56. val datetime_difference: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  57. def datetime_difference_hours(end: Column, start: Column): Column

    Permalink

    Returns the number of hours between two specified dates in standard format yyyy-MM-dd HH:mm:ss.SSSS.

    Definition Classes
    SparkFunctions
  58. def datetime_difference_minutes(end: Column, start: Column): Column

    Permalink

    Returns the number of minutes between two specified dates in standard format yyyy-MM-dd HH:mm:ss.SSSS.

    Definition Classes
    SparkFunctions
  59. def datetime_from_unixtime(seconds: Column): Column

    Permalink
    Definition Classes
    SparkFunctions
  60. def decimal_lpad(input: Column, len: Int, char_to_pad_with: String = "0", decimal_point_char: String = "."): Column

    Permalink

    Uses a Java regex to identify a decimal number in the input string. The decimal number can take one of two forms: 1. a simple integral number, e.g. 013334848, identified directly by the regex; 2. a decimal number with an explicit decimal point, e.g. 123456.90, identified by a combination of the [0-9]+(\$decimal_point_char)[0-9]+ and (0\$decimal_point_char)[0-9]+ regexes.

    After extracting the decimal number, the code checks whether its length exceeds the len parameter. If it does, the extracted decimal number is returned as-is. Otherwise, the number is first left-padded with char_to_pad_with until its length equals len, and the minus sign (-), if present, is then moved to the leftmost position.

    input

    input string.

    len

    length of characters.

    char_to_pad_with

    character to left pad with. default value is "0"

    decimal_point_char

    A string that specifies the character that represents the decimal point.

    returns

    a decimal string of the specified length or longer, left-padded with a specified character as needed and trimmed of leading zeros.

    Definition Classes
    SparkFunctions
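    A minimal sketch of the padding behavior described above; `df` and its `amount` column are hypothetical:

    ```scala
    import org.apache.spark.sql.functions.col
    import io.prophecy.libs._

    // Extract the decimal from "amount" and left-pad it with zeros
    // to at least 10 characters, keeping any minus sign leftmost.
    val padded = df.withColumn("amount_padded", decimal_lpad(col("amount"), 10))
    ```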
  61. def decimal_lrepad(input: Column, len: Int, char_to_pad_with: String = "0", decimal_point_char: String = "."): Column

    Permalink

    Uses a Java regex to identify a decimal number in the input string. The decimal number can take one of two forms: 1. a simple integral number, e.g. 013334848, identified by a combination of the [1-9][0-9]*[0-9] and [1-9]+ regexes; 2. a decimal number with an explicit decimal point, e.g. 123456.90, identified by a combination of the [1-9][0-9]*(\$decimal_point_char)[0-9]+ and (0\$decimal_point_char)[0-9]*[0-9] regexes.

    After extracting the decimal number, the code checks whether its length exceeds the len parameter. If it does, the extracted decimal number is returned as-is. Otherwise, the number is first left-padded with char_to_pad_with until its length equals len, and the minus sign (-), if present, is then moved to the leftmost position.

    input

    input string.

    len

    length of characters.

    char_to_pad_with

    character to left pad with. default value is "0"

    decimal_point_char

    A string that specifies the character that represents the decimal point.

    returns

    a decimal string of the specified length or longer, left-padded with a specified character as needed and trimmed of leading zeros.

    Definition Classes
    SparkFunctions
  62. def decimal_round(input: Column, places: Int): Column

    Permalink
    Definition Classes
    SparkFunctions
  63. def decimal_round_down(input: Column, right_digits: Int): Column

    Permalink

    Returns a value which is rounded down to right_digits number of digits to the right of the decimal point.

    Definition Classes
    SparkFunctions
  64. def decimal_round_up(input: Column, places: Int): Column

    Permalink

    Returns a number rounded up to a specified number of places to the right of the decimal point.

    Definition Classes
    SparkFunctions
  65. def decimal_strip(input: Column, decimal_point_char: String = "."): Column

    Permalink

    Uses a Java regex to identify a decimal number in the input string. The decimal number can take one of two forms: 1. a simple integral number, e.g. 013334848, identified by a combination of the [1-9][0-9 ]*[0-9] and [1-9]+ regexes; 2. a decimal number with an explicit decimal point, e.g. 123456.90, identified by a combination of the [1-9][0-9]*(\$decimal_point_char)[0-9 ]+ and (0\$decimal_point_char)[0-9 ]*[0-9] regexes.

    After extracting the decimal number, the code looks for a minus sign before the extracted number in the input and, if one is found, prepends it to the decimal number.

    Finally, all whitespace in the resulting decimal number is replaced with the empty string.

    input

    input string

    decimal_point_char

    A string that specifies the character that represents the decimal point.

    returns

    a decimal from a string that has been trimmed of leading zeros and non-numeric characters.

    Definition Classes
    SparkFunctions
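    A short sketch of the stripping behavior; `df` and its `raw_amount` column are hypothetical:

    ```scala
    import org.apache.spark.sql.functions.col
    import io.prophecy.libs._

    // Trim leading zeros and non-numeric characters from a raw amount
    // string, preserving any minus sign found before the number.
    val cleaned = df.withColumn("amount", decimal_strip(col("raw_amount")))
    ```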
  66. def decimal_truncate(input: Column, number_of_places: Column): Column

    Permalink
    Definition Classes
    SparkFunctions
  67. val decodeBytes: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  68. val decodeString: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  69. val decode_datetime: UserDefinedFunction

    Permalink

    UDF that returns a record of type decode_datetime_type, with all fields populated from the corresponding parts of the input date/timestamp.

    Returned record will have following schema.

    integer(8) year; integer(8) month; integer(8) day; integer(8) hour; integer(8) minute; integer(8) second; integer(8) microsecond;

    Note: the supported input formats are yyyy-MM-dd HH:mm:ss.SSSSSS, yyyy-MM-dd HH:mm:ss and yyyy-MM-dd only. Additional handling is done to support timestamps retrieved from now() function calls.

    Definition Classes
    SparkFunctions
  70. val decode_datetime_as_local: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  71. def directory_listing(path: String, filePrefix: String): Column

    Permalink
    Definition Classes
    SparkFunctions
  72. def dropColumns(sparkSession: SparkSession, df: DataFrame, columns: Column*): DataFrame

    Permalink

    Drops the passed columns from the input dataframe.

    sparkSession

    spark session

    df

    input dataframe.

    columns

    list of columns to be dropped from dataframe.

    returns

    new dataframe with dropped columns.

    Definition Classes
    UDFUtils
  73. val encodeBytes: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  74. val encodeString: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  75. val encode_date: UserDefinedFunction

    Permalink

    Returns the internal representation of a date, given the year, month, and day, as an integer value specifying the number of days relative to January 1, 1900. For example, encode_date(1998, 5, 18) = 35931 is the internal representation of the date specified by the year 1998, the month 5, and the day 18.

    Definition Classes
    SparkFunctions
  76. val encrypt_idwdata: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  77. def ends_with(input: Column, suffix: String): Column

    Permalink

    Returns true if the string column ends with the given suffix.

    Definition Classes
    SparkFunctions
  78. def equals(arg0: Any): Boolean

    Permalink
    Definition Classes
    Any
  79. val eval: UserDefinedFunction

    Permalink

    Returns the result of evaluating a string expression in the context of a specified input column. The input column can be a struct-type record, a simple column, an array type, etc. The expression can be a reference to a nested column inside the input column, or any expression that requires values from the input column for its evaluation.

    Note: the current implementation only supports the scenario where the input column is of struct type and expr is a simple dot-separated column reference into the input struct.

    Definition Classes
    SparkFunctions
  80. def executeNonSelectSQLQueries(sqlList: Seq[String], dbConnection: Connection): Unit

    Permalink
    Definition Classes
    DataHelpers
  81. val extract_mel_dates_Udf: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  82. val ff3_encrypt_idwdata: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  83. val file_information: UserDefinedFunction

    Permalink

    UDF to get file information for passed input file path.

    Definition Classes
    SparkFunctions
  84. def findFirstElement(input: Column, default: Column = lit(null)): Column

    Permalink
    Definition Classes
    SparkFunctions
  85. def findFirstNonBlankElement(input: Column, default: Column): Column

    Permalink
    Definition Classes
    SparkFunctions
  86. def findLastElement(input: Column, default: Column = lit(null)): Column

    Permalink
    Definition Classes
    SparkFunctions
  87. def first_defined(expr1: Column, expr2: Column): Column

    Permalink

    Method to identify and return the first non-null expression.

    Definition Classes
    SparkFunctions
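    first_defined behaves like a two-argument coalesce. A sketch, assuming a DataFrame df with hypothetical columns mailing_addr and home_addr:

```scala
import org.apache.spark.sql.functions.col
import io.prophecy.libs._

// Picks mailing_addr when it is non-null, otherwise falls back to home_addr.
val withAddr = df.withColumn("addr", first_defined(col("mailing_addr"), col("home_addr")))
```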
  88. val first_defined_for_double_Udf: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  89. def flattenStructSchema(schema: StructType, prefix: String = null): Array[Column]

    Permalink
    Definition Classes
    SparkFunctions
  90. val force_error: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  91. def from_sv(input: Column, separator: String, schema: StructType): Column

    Permalink
    Definition Classes
    SparkFunctions
  92. def from_xml(content: Column, schema: StructType): Column

    Permalink
    Definition Classes
    SparkFunctions
  93. def ftpTo(remoteHost: String, userName: String, password: String, sourceFile: String, destFile: String, retryFailures: Boolean, retryCount: Int, retryPauseSecs: Int, mode: String, psCmd: String): (Boolean, Boolean, String, String)

    Permalink
    Definition Classes
    DataHelpers
  94. def generateDataFrameWithSequenceColumn(start: Int, end: Int, columnName: String, sparkSession: SparkSession): DataFrame

    Permalink

    Method to create a dataframe with a single column containing an increasing sequence of IDs from start to end.

    Definition Classes
    SparkFunctions
  95. def generate_sequence(start: Int, end: Int, step: Int = 1): Column

    Permalink

    Function to create an array column containing the sequence of integers between two given numbers.

    start

    starting point of generated sequence

    end

    terminating point of generated sequence.

    returns

    column containing sequence of integers.

    Definition Classes
    SparkFunctions
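    A sketch of this literal-argument overload, assuming a DataFrame df; the column name is illustrative:

```scala
import io.prophecy.libs._

// Adds an array column holding the integer sequence 1 through 5.
val withSeq = df.withColumn("seq", generate_sequence(1, 5))
```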
  96. val generate_sequence: UserDefinedFunction

    Permalink

    UDF to generate a column with a sequence of integers between the passed start and end columns.

    Definition Classes
    SparkFunctions
  97. val getByteFromByteArray: UserDefinedFunction

    Permalink

    UDF to get the last byte from the byte array of the input data.

    Definition Classes
    SparkFunctions
  98. def getColumnInSecondArrayByFirstNonBlankPositionInFirstArray(nonBlankEntryExpr: Column, firstArray: Column, secondArray: Column): Column

    Permalink
    Definition Classes
    SparkFunctions
  99. def getContentAsStream(content: String): StringAsStream

    Permalink
    Definition Classes
    SparkFunctions
  100. def getEmptyLogDataFrame(sparkSession: SparkSession): DataFrame

    Permalink

    Method to get an empty dataframe with the Ab Initio log schema below.

    record string("|") node, timestamp, component, subcomponent, event_type; string("|\n") event_text; end

    Definition Classes
    DataHelpers
  101. def getFebruaryDay(year: Column): Column

    Permalink

    Computes the number of days in February for a given year.

    year

    year whose number of days in February needs to be calculated

    returns

    number of days

    Definition Classes
    SparkFunctions
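    A sketch, assuming a DataFrame df with an integer year column; leap years should yield 29 and other years 28:

```scala
import org.apache.spark.sql.functions.col
import io.prophecy.libs._

// Number of days in February for each year value.
val withFeb = df.withColumn("feb_days", getFebruaryDay(col("year")))
```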
  102. def getFieldFromStructByPosition(column: Column, position: Int): Column

    Permalink

    Method to get the field at a specific position from a struct column.

    Definition Classes
    SparkFunctions
  103. val getIntFromByteArray: UserDefinedFunction

    Permalink

    UDF to get the integer comprising the last 4 bytes of the input byte array.

    Definition Classes
    SparkFunctions
  104. val getLongArrayFromByteArray: UserDefinedFunction

    Permalink

    UDF to get the long comprising the last 8 bytes of the input byte array.

    Definition Classes
    SparkFunctions
  105. val getLongFromByteArray: UserDefinedFunction

    Permalink

    UDF to get the long comprising the last 8 bytes of the input byte array.

    Definition Classes
    SparkFunctions
  106. def getMTimeDataframe(filepath: String, format: String, spark: SparkSession): DataFrame

    Permalink
    Definition Classes
    SparkFunctions
  107. val getShortFromByteArray: UserDefinedFunction

    Permalink

    UDF to get the short comprising the last 2 bytes of the input byte array.

    Definition Classes
    SparkFunctions
  108. val get_partial_drug_details_Udf: UserDefinedFunction

    Permalink

    let string("") get_partial_drug_details = get_partial_drug_details_Udf(v_prod_id,v_gpi_no,v_filled_dt,v_mel_thrgh_dt,v_mpa_thrgh_dt,lookup_row("lkp_cag_ndc_gpi",v_cag_sk),for (j, j < count): lookup_row("lkp_ndc_gpi_list",first_defined(lookup_row("lkp_cag_ndc_gpi",v_cag_sk)[j].drug_name_list,'-'))); out :: (string("")) string_concat((string(""))(decimal(""))get_partial_drug_details,'|',(string(""))(decimal("")) v_gf_flg); out :: get_drug_details(v_cag_sk, v_prod_id, v_gpi_no, v_filled_dt, v_mel_thrgh_dt, v_mpa_thrgh_dt,v_prior_auth_nbr)= begin let string("") pattern=first_defined(lookup("lkp_cag_ndc_gpi",v_cag_sk).idntfn_gf,'-'); let string("") flag=first_defined(lookup("lkp_cag_ndc_gpi",v_cag_sk).include_exclude_gf,'-'); let int count = lookup_count("lkp_cag_ndc_gpi",v_cag_sk); let int ndc_count = 0; let int rule_cnt = 0; let int ndc_rule_cnt = 0; let rec_vec v_rec = allocate_with_nulls(); let ndc_rec_vec v_ndc_rec = allocate_with_nulls(); let int drug_match = 0; let string(1) drug_match_ind = '0'; let int claim_flg = 0; let int mbr_flg = 0; let int exact_flg = 0; let int c_valid_rec = 0; let int m_valid_rec = 0; let int v_prior_auth_flg = 0; let int exact_inc_flg = 0; let int exact_exc_flg = 0; let string("") v_rec_drug_name_list = "-"; let int v_gf_flg=0; v_gf_flg= if(flag=='I' and string_index(v_prior_auth_nbr,pattern)!=0)1 else if(flag=='E' and string_index(v_prior_auth_nbr,pattern)!=0)0 else if(flag=='I' and string_index(v_prior_auth_nbr,pattern)==0)0 else if(flag=='E' and string_index(v_prior_auth_nbr,pattern)==0)1 else if(flag=='-' or pattern=='-')2 else 2; while (count > rule_cnt) begin v_rec = lookup_next("lkp_cag_ndc_gpi"); v_rec_drug_name_list = first_defined(v_rec.drug_name_list,'-'); if(string_upcase(v_rec.drug_type) member [vector 'GPI LIST','NDC LIST'] && !is_null(v_rec.drug_name_list)) begin ndc_count = lookup_count("lkp_ndc_gpi_list",v_rec_drug_name_list); ndc_rule_cnt=0; while (ndc_count > ndc_rule_cnt) begin v_ndc_rec = 
lookup_next("lkp_ndc_gpi_list"); drug_match = if(is_null(v_rec.drug_type) || is_blank(v_rec.drug_type) || first_defined(v_rec.drug_type,'-') == '-' || (string_upcase(v_rec.drug_type) == 'GPI LIST' && (starts_with(string_downcase(v_gpi_no), string_downcase(v_ndc_rec.ndc_gpi)) == 1 || v_ndc_rec.list_name == '-')) || (string_upcase(v_rec.drug_type) == 'NDC LIST' && (starts_with(string_downcase(v_prod_id), string_downcase(v_ndc_rec.ndc_gpi)) == 1 || v_ndc_rec.list_name == '-'))) 1 else 0;


    claim_flg = if (date_difference_days((date("YYYYMMDD")) (datetime("YYYYMMDD")) now(), (date("YYYYMMDD")) v_filled_dt) <= (int) (decimal("")) v_rec.lookback_days) 1 else 0; mbr_flg = if (date_difference_days((date("YYYYMMDD")) v_mel_thrgh_dt, (date("YYYYMMDD")) (datetime("YYYYMMDD")) now()) >= (int) (decimal("")) v_rec.mel_lookforward) 1 else 0;

    c_valid_rec = if (v_rec.include_exclude == 'I' && exact_inc_flg == 0 && exact_exc_flg == 0) if (drug_match == 0) 0 else if (drug_match == 1 && claim_flg == 1) 1 else 0 else if (v_rec.include_exclude == 'E' && exact_exc_flg == 0 && exact_inc_flg == 1) if (drug_match == 1 && claim_flg == 1) 0 else 1 else if (v_rec.include_exclude == 'E' && exact_exc_flg == 0) if (drug_match == 1 && claim_flg == 1) 0 else if (drug_match == 1 && claim_flg == 0 && (rule_cnt != 0 && c_valid_rec == 1)) 1 else if (drug_match == 0 && claim_flg == 1 && ((rule_cnt != 0 && c_valid_rec == 1) || rule_cnt == 0)) 1 else 0 else c_valid_rec; m_valid_rec = if (v_rec.include_exclude == 'I' && exact_inc_flg == 0 && exact_exc_flg == 0) if (drug_match == 0) 0 else if (drug_match == 1 && mbr_flg == 1) 1 else 0 else if (v_rec.include_exclude == 'E' && exact_exc_flg == 0 && exact_inc_flg == 1) if (drug_match == 1 && mbr_flg == 1) 0 else 1 else if (v_rec.include_exclude == 'E' && exact_exc_flg == 0) if (drug_match == 1 && mbr_flg == 1) 0 else if (drug_match == 1 && mbr_flg == 0 && (rule_cnt != 0 && m_valid_rec == 1)) 1 else if (drug_match == 0 && mbr_flg == 1 && ((rule_cnt != 0 && m_valid_rec == 1) || rule_cnt == 0)) 1 else 0 else m_valid_rec; exact_inc_flg = if (v_rec.include_exclude == 'I' && drug_match == 1) 1 else exact_inc_flg; exact_exc_flg = if (v_rec.include_exclude == 'E' && drug_match == 1) 1 else exact_exc_flg; v_prior_auth_flg = if ((v_prior_auth_flg == 0 && date_difference_days((date("YYYYMMDD")) v_mpa_thrgh_dt, (date("YYYYMMDD")) (datetime("YYYYMMDD")) now()) == (int) (decimal("")) v_rec.pa_lookforward) || v_prior_auth_flg == 1) 1 else 0; ndc_rule_cnt = ndc_rule_cnt + 1; end end else begin drug_match = if(is_null(v_rec.drug_type) || is_blank(v_rec.drug_type) || first_defined(v_rec.drug_type,'-') == '-' || (string_upcase(v_rec.drug_type) member [vector 'GPI LIST', 'NDC LIST'] && v_rec.drug_name_list == '-') || (string_upcase(v_rec.drug_type) == 'GPI' && 
(starts_with(string_downcase(v_gpi_no), string_downcase(v_rec.drug_name_list)) == 1 || v_rec.drug_name_list == '-')) || (string_upcase(v_rec.drug_type) == 'NDC' && (starts_with(string_downcase(v_prod_id), string_downcase(v_rec.drug_name_list)) == 1 || v_rec.drug_name_list == '-'))) 1 else 0;

    claim_flg = if (date_difference_days((date("YYYYMMDD")) (datetime("YYYYMMDD")) now(), (date("YYYYMMDD")) v_filled_dt) <= (int) (decimal("")) v_rec.lookback_days) 1 else 0;

    mbr_flg = if (date_difference_days((date("YYYYMMDD")) v_mel_thrgh_dt, (date("YYYYMMDD")) (datetime("YYYYMMDD")) now()) >= (int) (decimal("")) v_rec.mel_lookforward) 1 else 0;

    c_valid_rec = if (v_rec.include_exclude == 'I' && exact_inc_flg == 0 && exact_exc_flg == 0) if (drug_match == 0) 0 else if (drug_match == 1 && claim_flg == 1) 1 else 0 else if (v_rec.include_exclude == 'E' && exact_exc_flg == 0 && exact_inc_flg == 1) if (drug_match == 1 && claim_flg == 1) 0 else 1 else if (v_rec.include_exclude == 'E' && exact_exc_flg == 0) if (drug_match == 1 && claim_flg == 1) 0 else if (drug_match == 1 && claim_flg == 0 && (rule_cnt != 0 && c_valid_rec == 1)) 1 else if (drug_match == 0 && claim_flg == 1 && ((rule_cnt != 0 && c_valid_rec == 1) || rule_cnt == 0)) 1 else 0 else c_valid_rec; m_valid_rec = if (v_rec.include_exclude == 'I' && exact_inc_flg == 0 && exact_exc_flg == 0) if (drug_match == 0) 0 else if (drug_match == 1 && mbr_flg == 1) 1 else 0 else if (v_rec.include_exclude == 'E' && exact_exc_flg == 0 && exact_inc_flg == 1) if (drug_match == 1 && mbr_flg == 1) 0 else 1 else if (v_rec.include_exclude == 'E' && exact_exc_flg == 0) if (drug_match == 1 && mbr_flg == 1) 0 else if (drug_match == 1 && mbr_flg == 0 && (rule_cnt != 0 && m_valid_rec == 1)) 1 else if (drug_match == 0 && mbr_flg == 1 && ((rule_cnt != 0 && m_valid_rec == 1) || rule_cnt == 0)) 1 else 0 else m_valid_rec; exact_inc_flg = if (v_rec.include_exclude == 'I' && drug_match == 1) 1 else exact_inc_flg; exact_exc_flg = if (v_rec.include_exclude == 'E' && drug_match == 1) 1 else exact_exc_flg; v_prior_auth_flg = if ((v_prior_auth_flg == 0 && date_difference_days((date("YYYYMMDD")) v_mpa_thrgh_dt, (date("YYYYMMDD")) (datetime("YYYYMMDD")) now()) == (int) (decimal("")) v_rec.pa_lookforward) || v_prior_auth_flg == 1) 1 else 0; end rule_cnt = rule_cnt + 1; end

    out :: (string("")) string_concat((string(""))(decimal(""))c_valid_rec,'|',(string(""))(decimal("")) m_valid_rec,'|',(string(""))(decimal("")) v_prior_auth_flg,'|',(string(""))(decimal("")) v_gf_flg); end;

    Definition Classes
    SparkFunctions
  109. def hashCode(): Int

    Permalink
    Definition Classes
    Any
  110. val hash_MD5: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  111. val instr_udf: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  112. final def isInstanceOf[T0]: Boolean

    Permalink
    Definition Classes
    Any
  113. def isNullOrEmpty(input: Column): Column

    Permalink

    Method to check whether the current column is null or has an empty value.

    Definition Classes
    SparkFunctions
  114. def is_ascii(input: Column): Column

    Permalink

    Checks whether a string is ASCII.

    input

    column to be checked

    returns

    true if the input string is ASCII, otherwise false

    Definition Classes
    SparkFunctions
  115. def is_blank(input: Column): Column

    Permalink

    Method to identify whether the input string is blank.

    input

    input string.

    returns

    1 if the given string contains only blank characters or is a zero-length string, otherwise 0

    Definition Classes
    SparkFunctions
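    A sketch, assuming a DataFrame df with a hypothetical string column name:

```scala
import org.apache.spark.sql.functions.col
import io.prophecy.libs._

// 1 for rows where name is empty or all blanks, 0 otherwise.
val checked = df.withColumn("name_blank", is_blank(col("name")))
```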
  116. val is_blank_udf: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  117. val is_bzero: UserDefinedFunction

    Permalink

    Tests whether an object is composed of all binary zero bytes.

    Tests whether an object is composed of all binary zero bytes. This function returns 1 if obj contains only binary zero bytes or is a zero-length string, 0 if obj contains any non-zero bytes, and NULL if obj is NULL.

    Definition Classes
    SparkFunctions
  118. def is_numeric_ascii(input: Column): Column

    Permalink

    Checks whether an input string contains only ASCII characters and digits.

    input

    string to be checked

    returns

    true if the input string contains only ASCII characters and digits, or null if the input is null

    Definition Classes
    SparkFunctions
  119. def is_valid(input: Column, isNullable: Boolean, formatInfo: Option[Any], len: Option[Seq[Int]]): Column

    Permalink

    Method to identify whether the passed input column is a valid expression after typecasting to the passed dataType.

    Method to identify whether the passed input column is a valid expression after typecasting to the passed dataType. If len is present, this function also ensures that the maximum length of the input column after the typecast does not exceed len.

    input

    input column expression whose validity is to be checked.

    formatInfo

    datatype to which the input column must be typecast. If the datatype is a string, it is treated as a timestamp format. If it is a list of strings, it is treated as the current timestamp format and the new timestamp format to which the input column needs to be typecast.

    len

    max length of input column after typecasting it to dataType.

    returns

    0 if input column is not valid after typecasting or 1 if it is valid.

    Definition Classes
    SparkFunctions
  120. def is_valid(input: Column, isNullable: Boolean, formatInfo: Option[Any]): Column

    Permalink
    Definition Classes
    SparkFunctions
  121. def is_valid(input: Column, formatInfo: Option[Any], len: Option[Seq[Int]]): Column

    Permalink
    Definition Classes
    SparkFunctions
  122. def is_valid(input: Column, formatInfo: Option[Any]): Column

    Permalink
    Definition Classes
    SparkFunctions
  123. def is_valid(input: Column, isNullable: Boolean): Column

    Permalink
    Definition Classes
    SparkFunctions
  124. def is_valid(input: Column): Column

    Permalink
    Definition Classes
    SparkFunctions
  125. def is_valid_date(dateFormat: String, inDate: Column): Column

    Permalink

    Validates a date against an input format.

    dateFormat

    A pattern such as yyyy-MM-dd or yyyy-MM-dd HH:mm:ss.SSSS or dd.MM.yyyy

    inDate

    Input date to be validated

    returns

    true if the input date is valid otherwise false

    Definition Classes
    SparkFunctions
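    A sketch, assuming a DataFrame df with a hypothetical string column order_date:

```scala
import org.apache.spark.sql.functions.col
import io.prophecy.libs._

// true where order_date parses under the yyyy-MM-dd pattern, false otherwise.
val validated = df.withColumn("date_ok", is_valid_date("yyyy-MM-dd", col("order_date")))
```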
  126. package jsonrpc

    Permalink
  127. def lastElementInCurrentWindow(input: Column): Column

    Permalink
    Definition Classes
    SparkFunctions
  128. package lineage

    Permalink
  129. def loadBinaryFileAsBinaryDataFrame(filePath: String, lineDelimiter: String = "\n", minPartition: Int = 1, rowName: String = "line", spark: SparkSession): DataFrame

    Permalink
    Definition Classes
    DataHelpers
  130. def loadBinaryFileAsStringDataFrame(filePath: String, lineDelimiter: String = "\n", charSetEncoding: String = "Cp1047", minPartition: Int = 1, rowName: String = "line", spark: SparkSession): DataFrame

    Permalink
    Definition Classes
    DataHelpers
  131. def loadFixedWindowBinaryFileAsDataFrame(filePath: String, lineLength: Int, minPartition: Int = 1, rowName: String = "line", spark: SparkSession): DataFrame

    Permalink
    Definition Classes
    DataHelpers
  132. lazy val logger: Logger

    Permalink
    Attributes
    protected
    Definition Classes
    LazyLogging
  133. def lookup(lookupName: String, cols: Column*): Column

    Permalink

    By default returns only the first matching record.

    Definition Classes
    UDFUtils
  134. def lookup_count(lookupName: String, cols: Column*): Column

    Permalink
    Definition Classes
    UDFUtils
  135. def lookup_last(lookupName: String, cols: Column*): Column

    Permalink

    Returns the last matching record.

    Definition Classes
    UDFUtils
  136. def lookup_match(lookupName: String, cols: Column*): Column

    Permalink

    returns

    Boolean Column

    Definition Classes
    UDFUtils
  137. def lookup_nth(lookupName: String, cols: Column*): Column

    Permalink
    Definition Classes
    UDFUtils
  138. def lookup_range(lookupName: String, input: Column): Column

    Permalink
    Definition Classes
    UDFUtils
  139. def lookup_row(lookupName: String, cols: Column*): Column

    Permalink
    Definition Classes
    UDFUtils
  140. def lookup_row_reverse(lookupName: String, cols: Column*): Column

    Permalink
    Definition Classes
    UDFUtils
  141. val make_byte_flags: UserDefinedFunction

    Permalink

    UDF to return a flag for each character indicating whether it is present in the input string.

    Definition Classes
    SparkFunctions
  142. def make_constant_vector(size: Int, seedVal: Int): Array[Int]

    Permalink

    Method to create an array of size "size" containing seedVal as each entry.

    Definition Classes
    SparkFunctions
  143. def make_constant_vector(size: Int, seedVal: Column): Column

    Permalink

    Method to create an array of size "size" containing seedVal as each entry.

    Definition Classes
    SparkFunctions
  144. def measure[T](fn: ⇒ T)(caller: String = findCaller()): T

    Permalink
    Definition Classes
    UDFUtils
  145. val member_elig_rec_Udf: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  146. val multi_regex_match: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  147. def multi_regex_replace_with_char_conversion(input: Column, charSet: Column, replaceStr: Column, replacement0: String, replacement1: String, pattern: String*): Column

    Permalink
    Definition Classes
    SparkFunctions
  148. val multifile_information: UserDefinedFunction

    Permalink

    UDF to get multifile information for the passed input file path.

    Definition Classes
    SparkFunctions
  149. val murmur: UserDefinedFunction

    Permalink

    UDF for murmur hash generation for any column type.

    Definition Classes
    SparkFunctions
  150. def now(): Column

    Permalink

    Method to get the current timestamp.

    returns

    current timestamp in YYYYMMddHHmmssSSSSSS format.

    Definition Classes
    SparkFunctions
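    A sketch, assuming a DataFrame df; the column name is illustrative:

```scala
import io.prophecy.libs._

// Stamp every row with the current timestamp in YYYYMMddHHmmssSSSSSS format.
val stamped = df.withColumn("run_ts", now())
```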
  151. def numberOfPartitions(in: DataFrame): Column

    Permalink
    Definition Classes
    SparkFunctions
  152. val number_grouping: UserDefinedFunction

    Permalink

    UDF to group an input decimal into multiple groups separated by separator.

    Definition Classes
    SparkFunctions
  153. val packedBytesStringToDecimal: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  154. val packedBytesToDecimal: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  155. package python

    Permalink
  156. val re_get_match: UserDefinedFunction

    Permalink

    Returns the first string in a target string that matches a regular expression.

    Definition Classes
    SparkFunctions
  157. val re_get_match_with_index: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  158. val re_index: UserDefinedFunction

    Permalink

    UDF wrapper over the re_index function.

    Definition Classes
    SparkFunctions
  159. val re_index_with_offset: UserDefinedFunction

    Permalink

    Returns the first string in a target string that matches a regular expression.

    Definition Classes
    SparkFunctions
  160. def re_replace(target: Column, pattern: String, replacement: String, offset: Int = 0): Column

    Permalink

    Replaces all substrings in a target string that match a specified regular expression.

    target

    A string that the function searches for a substring that matches pattern_expr.

    pattern

    regular expression

    replacement

    replacement string

    offset

    Number of characters, from the beginning of str, to skip before searching.

    returns

    a string in which all substrings matching the specified regular expression are replaced.

    Definition Classes
    SparkFunctions
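    A sketch, assuming a DataFrame df with a hypothetical string column phone:

```scala
import org.apache.spark.sql.functions.col
import io.prophecy.libs._

// Strip every non-digit character from the phone column.
val cleaned = df.withColumn("digits_only", re_replace(col("phone"), "[^0-9]", ""))
```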
  161. def re_replace_first(target: Column, pattern: String, replacement: String, offset: Column = lit(0)): Column

    Permalink

    Replaces only the first regex-matching occurrence in the target string.

    target

    A string that the function searches for a substring that matches pattern_expr.

    pattern

    regular expression

    replacement

    replacement string

    returns

    a string in which the first substring matching the specified regular expression is replaced.

    Definition Classes
    SparkFunctions
  162. val re_split_no_empty: UserDefinedFunction

    Permalink

    UDF to split the input string via a pattern string and remove all empty substrings.

    Definition Classes
    SparkFunctions
  163. val readBytesIntoInteger: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  164. val readBytesIntoLong: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  165. val readBytesStringIntoInteger: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  166. val readBytesStringIntoLong: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  167. def readHiveTable(spark: SparkSession, database: String, table: String, partition: String = ""): DataFrame

    Permalink

    Method to read data from a Hive table.

    spark

    spark session

    database

    hive database

    table

    hive table.

    partition

    hive table partition to read data specifically from if provided.

    returns

    dataframe with data read from Hive Table.

    Definition Classes
    DataHelpers
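    A sketch, assuming a live SparkSession named spark and a hypothetical analytics.sales Hive table:

```scala
import io.prophecy.libs._

// Read one partition of the table; omit the partition argument
// (it defaults to "") to read the whole table.
val sales = readHiveTable(spark, "analytics", "sales", partition = "dt='2024-01-01'")
```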
  168. def readHiveTableInChunks(spark: SparkSession, database: String, table: String, partitionKey: String, partitionValue: String): DataFrame

    Permalink

    Reads a full Hive table partition by reading every subpartition separately and performing a union on all the resulting DataFrames.

    This function is meant to temporarily solve the problem with Hive metastore crashing when querying too many partitions at the same time.

    spark

    spark session

    database

    hive database name

    table

    hive table name

    partitionKey

    top-level partition's key

    partitionValue

    top-level partition's value

    returns

    A complete DataFrame with the selected hive table partition

    Definition Classes
    DataHelpers
  169. val read_file: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  170. val record_info: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  171. val record_info_with_includes: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  172. def registerAllUDFs(spark: SparkSession): Unit

    Permalink
    Definition Classes
    SparkFunctions
  173. def registerProphecyUdfs(spark: SparkSession): Unit

    Permalink
    Definition Classes
    UDFUtils
  174. def register_output_schema(portName: String, schema: StructType): Unit

    Permalink
    Definition Classes
    Component
  175. def remove_non_digit(input: Column): Column

    Permalink

    Method to remove any non-digit characters from the specified string column.

    input

    input String Column

    returns

    Cleaned string column or null

    Definition Classes
    SparkFunctions
  176. def replaceBlankColumnWithNull(input: Column): Column

    Permalink

    Method to replace empty string column values with null.

    Definition Classes
    SparkFunctions
  177. def replaceString(sparkSession: SparkSession, df: DataFrame, outputCol: String, inputCol: String, replaceWith: String, value: String, values: String*): DataFrame

    Permalink

    Function to add a new column to the passed dataframe.

    Function to add a new column to the passed dataframe. The new column's value depends on whether the value of inputCol appears in the array built from value and values. If inputCol's value is found, replaceWith is placed in the new column; otherwise the inputCol value is used.

    sparkSession

    spark session.

    df

    input dataframe.

    outputCol

    name of new column to be added.

    inputCol

    column name whose value is searched.

    replaceWith

    value with which to replace searched value if found.

    value

    element to be combined in array column

    values

    all values to be combined in array column for searching purpose.

    returns

    dataframe with new column with column name outputCol

    Definition Classes
    UDFUtils
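    A sketch, assuming a live SparkSession spark and a DataFrame df with a hypothetical status column:

```scala
import io.prophecy.libs._

// Writes "UNKNOWN" into status_clean wherever status is "N/A", "NA", or "";
// otherwise copies status through unchanged. Names are illustrative.
val normalized = replaceString(spark, df, "status_clean", "status", "UNKNOWN", "N/A", "NA", "")
```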
  178. def replaceStringNull(sparkSession: SparkSession, df: DataFrame, outputCol: String, inputCol: String, replaceWith: String, value: String, values: String*): DataFrame

    Permalink

    Function to add a new column to the passed dataframe.

    Function to add a new column to the passed dataframe. The new column's value depends on whether the value of inputCol appears in the array built from value, values, and null. If inputCol's value is found, replaceWith is placed in the new column; otherwise the inputCol value is used.

    sparkSession

    spark session.

    df

    input dataframe.

    outputCol

    name of new column to be added.

    inputCol

    column name whose value is searched.

    replaceWith

    value with which to replace searched value if found.

    value

    element to be combined in array column

    values

    all values to be combined in array column for searching purpose.

    returns

    dataframe with new column with column name outputCol

    Definition Classes
    UDFUtils
  179. def replaceStringWithNull(sparkSession: SparkSession, df: DataFrame, outputCol: String, inputCol: String, value: String, values: String*): DataFrame

    Permalink

    Function to add a new column to the passed dataframe.

    Function to add a new column to the passed dataframe. The new column's value is determined by searching for the value of inputCol in the array formed from value, values, and null. If the inputCol value is found, the new column is set to null; otherwise the original inputCol value is carried over.

    sparkSession

    spark session.

    df

    input dataframe.

    outputCol

    name of the new column to be added.

    inputCol

    column whose value is searched for.

    value

    element to include in the search array.

    values

    additional values to include in the search array.

    returns

    dataframe with a new column named outputCol.

    Definition Classes
    UDFUtils
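    As a minimal usage sketch of the replace* helpers above, assuming the package object's members are in scope via `import io.prophecy.libs._`, a local SparkSession, and illustrative column names and data:

    ```scala
    import org.apache.spark.sql.SparkSession
    import io.prophecy.libs._

    val spark = SparkSession.builder().master("local[*]").getOrCreate()
    import spark.implicits._

    val df = Seq("N/A", "42", "unknown").toDF("raw")

    // "cleaned" holds null where "raw" is "N/A" or "unknown",
    // and the original value everywhere else.
    val cleaned = replaceStringWithNull(spark, df, "cleaned", "raw", "N/A", "unknown")

    // "normalized" holds "MISSING" where "raw" is "N/A" or "unknown".
    val normalized = replaceString(spark, df, "normalized", "raw", "MISSING", "N/A", "unknown")
    ```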
  180. def replace_null_with_blank(input: Column): Column

    Permalink
    Definition Classes
    SparkFunctions
  181. val replace_string: UserDefinedFunction

    Permalink

    UDF to find str in input sequence toBeReplaced and return replace if found.

    UDF to find str in input sequence toBeReplaced and return replace if found. Otherwise str is returned.

    Definition Classes
    UDFUtils
  182. val replace_string_with_null: UserDefinedFunction

    Permalink

    UDF to find str in input sequence toBeReplaced and return null if found.

    UDF to find str in input sequence toBeReplaced and return null if found. Otherwise str is returned.

    Definition Classes
    UDFUtils
  183. def scanf_double(format: Column, value: Column): Column

    Permalink
    Definition Classes
    SparkFunctions
  184. def scanf_long(format: Column, value: Column): Column

    Permalink
    Definition Classes
    SparkFunctions
  185. def schemaRowCompareResult(row1: StructType, row2: StructType): Column

    Permalink
    Definition Classes
    SparkFunctions
  186. def sign_explicit(c: Column): Column

    Permalink

    Adds an explicit sign to the number.

    Adds an explicit sign to the number. E.g. 2 -> +2; -004 -> -004; 0 -> +0

    Definition Classes
    SparkFunctions
  187. val sign_explicit_Udf: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  188. def sign_reserved(c: Column): Column

    Permalink
    Definition Classes
    SparkFunctions
  189. val sign_reserved_Udf: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  190. package sources

    Permalink
  191. def splitIntoMultipleColumns(sparkSession: SparkSession, df: DataFrame, colName: String, pattern: String, prefix: String = null): DataFrame

    Permalink

    Function to split the column colName of the input dataframe into multiple columns using the split pattern.

    Function to split the column colName of the input dataframe into multiple columns using the split pattern. If a prefix is provided, each generated column is named with that prefix followed by its column number; otherwise the original column name is used.

    sparkSession

    spark session.

    df

    input dataframe.

    colName

    column in dataframe which needs to be split into multiple columns.

    pattern

    regex with which column in input dataframe will be split into multiple columns.

    prefix

    column prefix to be used with all newly generated columns.

    returns

    new dataframe with new columns where new column values are generated after splitting original column colName.

    Definition Classes
    UDFUtils
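    A usage sketch for splitIntoMultipleColumns, assuming whitespace-separated data, an active SparkSession, and that generated columns are named prefix plus a number (an assumption based on the description above):

    ```scala
    import spark.implicits._

    val people = Seq("John A Smith", "Jane Doe").toDF("fullname")

    // Split "fullname" on whitespace; with prefix = "name" the new columns
    // are expected to be name1, name2, ... per the description above.
    val parts = splitIntoMultipleColumns(spark, people, "fullname", "\\s+", prefix = "name")
    ```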
  192. val splitIntoMultipleColumnsUdf: UserDefinedFunction

    Permalink

    UDF to break an input string into multiple strings via a delimiter.

    UDF to break an input string into multiple strings via a delimiter. The number of resulting strings is adjusted to the passed width parameter: if there are fewer strings, empty strings are appended; if there are more, the first width entries are kept and the rest are discarded.

    Definition Classes
    SparkFunctions
  193. def starts_with(input: Column, prefix: String): Column

    Permalink

    Returns true if the string column starts with the given prefix.

    Returns true if the string column starts with the given prefix.

    Definition Classes
    SparkFunctions
  194. def string_char(inputStr: Column, index: Int): Column

    Permalink

    Method to return the character code of the character at position index in the inputStr string.

    Method to return the character code of the character at position index in the inputStr string.

    inputStr

    input string

    index

    location of character to get code.

    returns

    integer column.

    Definition Classes
    SparkFunctions
  195. val string_cleanse: UserDefinedFunction

    Permalink

    This implementation is incorrect.

    This implementation is incorrect.

    Definition Classes
    SparkFunctions
  196. def string_compare(input1: Column, input2: Column): Column

    Permalink
    Definition Classes
    SparkFunctions
  197. val string_concat_in_loop: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  198. val string_convert_explicit: UserDefinedFunction

    Permalink

    Converts a string from one character set to another, replacing inconvertible characters with a specified string.

    Converts a string from one character set to another, replacing inconvertible characters with a specified string.

    Definition Classes
    SparkFunctions
  199. val string_filter: UserDefinedFunction

    Permalink

    Method which returns the characters present in both input strings, in the order in which they appear in the first string.

    Method which returns the characters present in both input strings, in the order in which they appear in the first string.

    Definition Classes
    SparkFunctions
  200. val string_filter_out: UserDefinedFunction

    Permalink

    Compares two input strings, then returns characters that appear in one string but not in the other.

    Compares two input strings, then returns characters that appear in one string but not in the other.

    Definition Classes
    SparkFunctions
  201. val string_from_hex: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  202. val string_index: UserDefinedFunction

    Permalink

    UDF to find the index of seekStr in inputStr.

    UDF to find the index of seekStr in inputStr. The returned index is 1-based.

    Definition Classes
    SparkFunctions
  203. val string_index_with_offset: UserDefinedFunction

    Permalink

    UDF to find the index of seekStr in inputStr, starting from the offset index.

    UDF to find the index of seekStr in inputStr, starting from the offset index. The returned position is 1-based.

    Definition Classes
    SparkFunctions
  204. def string_is_alphabetic(input: Column): Column

    Permalink

    Method which returns true if the input string consists entirely of alphabetic characters, and false otherwise.

    Method which returns true if the input string consists entirely of alphabetic characters, and false otherwise.

    Definition Classes
    SparkFunctions
  205. def string_is_numeric(input: Column): Column

    Permalink

    Method which returns true if the input string consists entirely of numeric characters, and false otherwise.

    Method which returns true if the input string consists entirely of numeric characters, and false otherwise.

    Definition Classes
    SparkFunctions
  206. def string_join(column: Column, delimiter: String): Column

    Permalink

    Concatenates the elements of column using the delimiter.

    Concatenates the elements of column using the delimiter.

    Definition Classes
    SparkFunctions
  207. def string_length(input: Column): Column

    Permalink
    Definition Classes
    SparkFunctions
  208. val string_like: UserDefinedFunction

    Permalink

    Method to test whether a string matches a specified pattern.

    Method to test whether a string matches a specified pattern. This function returns 1 if the input string matches the pattern, and 0 if it does not.

    As in Ab Initio, the % character in the pattern matches zero or more characters and the _ character matches a single character.

    Definition Classes
    SparkFunctions
  209. def string_lpad(input: Column, len: Int, pad_char: String = " "): Column

    Permalink

    Left-pads the input string column with pad_char to a length of len.

    Left-pads the input string column with pad_char to a length of len. If the input column is already longer than len, it is returned unmodified.

    Definition Classes
    SparkFunctions
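    For example, a minimal sketch of left-padding a numeric id column to a fixed width with string_lpad, assuming a dataframe `df` with an illustrative "id" column:

    ```scala
    import org.apache.spark.sql.functions.col

    // Pad "id" on the left with '0' to 8 characters; longer values pass through.
    val padded = df.withColumn("id_padded", string_lpad(col("id"), 8, "0"))
    ```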
  210. def string_lrepad(input: Column, len: Int, char_to_pad_with: String = " "): Column

    Permalink

    Function that trims the string and then left-pads it with the given character up to the given length.

    Function that trims the string and then left-pads it with the given character up to the given length. If the trimmed string is already len or more characters long, the input string is returned unmodified.

    input

    input string

    len

    length in number of characters.

    char_to_pad_with

    A character used to pad input string to length len.

    returns

    string of a specified length, trimmed of leading and trailing blanks and left-padded with a given character.

    Definition Classes
    SparkFunctions
  211. def string_pad(input: Column, len: Int, char_to_pad_with: String = " "): Column

    Permalink

    Function that pads input on the right with the character char_to_pad_with to make the string length len.

    Function that pads input on the right with the character char_to_pad_with to make the string length len. If input is already len or more characters long, it is returned unmodified.

    Definition Classes
    SparkFunctions
  212. val string_pad: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  213. val string_pad_with_char: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  214. def string_prefix(input: Column, length: Column): Column

    Permalink
    Definition Classes
    SparkFunctions
  215. def string_repad(input: Column, len: Int, char_to_pad_with: String = " "): Column

    Permalink

    Function that trims the string and then right-pads it with the given character up to the given length.

    Function that trims the string and then right-pads it with the given character up to the given length. If the trimmed string is already len or more characters long, the input string is returned unmodified.

    input

    input string

    len

    length in number of characters.

    char_to_pad_with

    A character used to pad input string to length len.

    returns

    string of a specified length, trimmed of leading and trailing blanks and right-padded with a given character.

    Definition Classes
    SparkFunctions
  216. def string_replace(input: Column, seekStr: Column, newStr: Column, offset: Column = lit(0)): Column

    Permalink

    Function to replace occurrences of seekStr with newStr in the input string, starting offset characters from the first character.

    Function to replace occurrences of seekStr with newStr in the input string, starting offset characters from the first character.

    input

    input string on which to perform replace operation.

    seekStr

    string to be replaced in input string.

    newStr

    string to be used instead of seekStr in input string.

    offset

    number of characters to skip from the beginning of the input string before performing the replace operation.

    returns

    modified string where seekStr is replaced with newStr in input string.

    Definition Classes
    SparkFunctions
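    A hedged usage sketch for string_replace, assuming a dataframe `df` with an illustrative "path" column:

    ```scala
    import org.apache.spark.sql.functions.{col, lit}

    // Replace backslashes with forward slashes across the whole "path" column
    // (offset defaults to lit(0), i.e. no characters are skipped).
    val normalized = df.withColumn("path", string_replace(col("path"), lit("\\"), lit("/")))
    ```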
  217. val string_replace_first: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  218. val string_replace_first_in_loop: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  219. val string_replace_in_loop: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  220. val string_representation: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  221. val string_rindex: UserDefinedFunction

    Permalink

    Returns the index of the first character of the last occurrence of a seek string within another input string.

    Returns the index of the first character of the last occurrence of a seek string within another input string. The returned index is 1-based.

    Definition Classes
    SparkFunctions
  222. val string_rindex_with_offset: UserDefinedFunction

    Permalink

    UDF to find the index of seekStr in inputStr, searching backwards after skipping offset characters from the end.

    UDF to find the index of seekStr in inputStr, searching backwards after skipping offset characters from the end. The offset is the number of characters, from the end of the string, to skip before searching. The returned position is 1-based.

    Definition Classes
    SparkFunctions
  223. val string_split: UserDefinedFunction

    Permalink

    UDF to split the input string via a delimiter string.

    UDF to split the input string via a delimiter string.

    Definition Classes
    SparkFunctions
  224. val string_split_no_empty: UserDefinedFunction

    Permalink

    UDF to split the input string via a delimiter string and remove all empty substrings.

    UDF to split the input string via a delimiter string and remove all empty substrings.

    Definition Classes
    SparkFunctions
  225. def string_substring(input: Column, start_position: Column, length: Column): Column

    Permalink

    Method to find substring of input string.

    Method to find substring of input string.

    input

    string on which to find substring.

    start_position

    1 based starting position to find substring from.

    length

    total length of substring to be found.

    returns

    substring of input string

    Definition Classes
    SparkFunctions
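    A minimal sketch of string_substring with its 1-based positions, assuming a dataframe `df` with an illustrative string column "s":

    ```scala
    import org.apache.spark.sql.functions.{col, lit}

    // 1-based positions: characters 4..6 of "abcdefg" would be "def".
    val mid = df.withColumn("mid", string_substring(col("s"), lit(4), lit(3)))
    ```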
  226. def string_suffix(input: Column, len: Int): Column

    Permalink
    Definition Classes
    SparkFunctions
  227. val string_to_hex: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  228. val sv_apply: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  229. val sv_create_collection: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  230. val take_last_nth: UserDefinedFunction

    Permalink

    UDF to return the nth element from the end of the passed array.

    UDF to return the nth element from the end of the passed array. If the input sequence has fewer than n elements, the first element is returned.

    Definition Classes
    UDFUtils
  231. val take_nth: UserDefinedFunction

    Permalink

    UDF to take the Nth element from the beginning.

    UDF to take the Nth element from the beginning. If the input sequence has fewer than N elements, an exception is thrown.

    Definition Classes
    UDFUtils
  232. val test_characters_all: UserDefinedFunction

    Permalink

    UDF to count the number of characters in inputStr that are present in charFlag.

    UDF to count the number of characters in inputStr that are present in charFlag.

    Definition Classes
    SparkFunctions
  233. def timezone_to_utc(timezone: String, time: Column): Column

    Permalink

    Method to convert a time from the given timezone to UTC.

    Method to convert a time from the given timezone to UTC.

    Definition Classes
    SparkFunctions
  234. def toString(): String

    Permalink
    Definition Classes
    Any
  235. def today(): Column

    Permalink

    Method to return an integer value representing the number of days from "1-1-1990" to today.

    Method to return an integer value representing the number of days from "1-1-1990" to today.

    returns

    integer value

    Definition Classes
    SparkFunctions
  236. val translate_bytes: UserDefinedFunction

    Permalink

    UDF to return a string in the native character set made up of bytes from the given map.

    UDF to return a string in the native character set made up of bytes from the given map. Each byte of the result is the value of map indexed by the character code of the corresponding byte of the input string str. The function returns NULL if any argument is NULL.

    Definition Classes
    SparkFunctions
  237. val truncateMicroSeconds: UserDefinedFunction

    Permalink

    UDF to truncate the microseconds part of a timestamp.

    UDF to truncate the microseconds part of a timestamp. This is needed because Ab Initio and Spark have some incompatibility in the microseconds part of the timestamp format.

    Definition Classes
    SparkFunctions
  238. val type_info: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  239. val type_info_with_includes: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  240. def unionAll(df: DataFrame*): DataFrame

    Permalink

    Method to take the union of all passed dataframes.

    Method to take the union of all passed dataframes.

    df

    list of dataframes to take the union of.

    returns

    union of all passed input dataframes.

    Definition Classes
    DataHelpers
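    A minimal usage sketch for unionAll; the dataframes `dfJan`, `dfFeb`, and `dfMar` are illustrative and assumed to share a schema:

    ```scala
    // Union an arbitrary number of dataframes with matching schemas.
    val allMonths = unionAll(dfJan, dfFeb, dfMar)
    ```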
  241. val unique_identifier: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  242. package unittesting

    Permalink
  243. val url_encode_escapes: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  244. package utils

    Permalink
  245. def withSparkProperty[T](key: String, value: String, spark: SparkSession)(body: ⇒ T): T

    Permalink
  246. def withSubgraphName[T](value: String, spark: SparkSession)(body: ⇒ T): T

    Permalink
  247. def withTargetId[T](value: String, spark: SparkSession)(body: ⇒ T): T

    Permalink
  248. def writeDataFrame(df: DataFrame, path: String, spark: SparkSession, props: Map[String, String], format: String, partitionColumns: List[String] = Nil, bucketColumns: List[String] = Nil, numBuckets: Option[Int] = None, sortColumns: List[String] = Nil, tableName: Option[String] = None, databaseName: Option[String] = None): Unit

    Permalink

    Method to write the data in the passed dataframe out in a specific file format.

    Method to write the data in the passed dataframe out in a specific file format.

    df

    dataframe containing data.

    path

    path to write data to.

    spark

    spark session.

    props

    underlying data source specific properties.

    format

    file format in which to persist data. Supported formats are csv, text, json, parquet, and orc.

    partitionColumns

    columns to be used for partitioning.

    bucketColumns

    used to bucket the output by the given columns. If specified, the output is laid out on the file-system similar to Hive's bucketing scheme.

    numBuckets

    number of buckets to be used.

    sortColumns

    columns on which to order data while persisting.

    tableName

    table name for persisting data.

    databaseName

    database name for persisting data.

    Definition Classes
    DataHelpers
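    A hedged sketch of writeDataFrame based on the parameter descriptions above; the dataframe `events`, the output path, and the "event_date" partition column are illustrative, and props is assumed to be passed through as data source options:

    ```scala
    // Persist as partitioned CSV; props are passed through as data source options.
    writeDataFrame(
      df = events,
      path = "/tmp/out/events",
      spark = spark,
      props = Map("header" -> "true"),
      format = "csv",
      partitionColumns = List("event_date")
    )
    ```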
  249. val writeIntegerToBytes: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  250. val writeLongToBytes: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  251. lazy val write_to_log: UserDefinedFunction

    Permalink

    UDF to write logging parameters to log port.

    UDF to write logging parameters to log port.

    Definition Classes
    DataHelpers
  252. val xmlToJSON: UserDefinedFunction

    Permalink
    Definition Classes
    SparkFunctions
  253. def yyyyMMdd_to_YYYYJJJ(in_date: Column): Column

    Permalink

    Converts yyyyMMdd to YYYYJJJ.

    Converts yyyyMMdd to YYYYJJJ.

    in_date

    date in yyyyMMdd format

    returns

    a date converted to YYYYJJJ

    Definition Classes
    SparkFunctions
  254. def zip_eventInfo_arrays(column1: Column, column2: Column): Column

    Permalink

    Method to zip two arrays, the first containing event_type and the second containing event_text.

    Method to zip two arrays, the first containing event_type and the second containing event_text.

    Definition Classes
    SparkFunctions

Inherited from Extension

Inherited from FixedFileFormatImplicits

Inherited from SparkFunctions

Inherited from DataHelpers

Inherited from Component

Inherited from UDFUtils

Inherited from Serializable

Inherited from Serializable

Inherited from RestAPIUtils

Inherited from LazyLogging

Inherited from ProphecyDataFrame

Inherited from AnyRef

Inherited from Any

Ungrouped