Class com.ebiznext.comet.job.metrics.AssertionJob

class AssertionJob extends SparkJob

Linear Supertypes
SparkJob, JobBase, StrictLogging, AnyRef, Any

Instance Constructors

  1. new AssertionJob(domainName: String, schemaName: String, assertions: Map[String, String], stage: Stage, storageHandler: StorageHandler, schemaHandler: SchemaHandler, dataset: Option[DataFrame], engine: Engine, sqlRunner: (String) ⇒ Long)(implicit settings: Settings)

    stage

    : stage at which the assertions are evaluated

    storageHandler

    : storage handler used to access the underlying storage
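
    Example: a minimal sketch of wiring up an AssertionJob. The import paths, the Stage.UNIT and Engine.SPARK variants, and the sample assertion SQL are assumptions for illustration; the handlers, settings and the SQL runner are expected to come from the surrounding application.

        import com.ebiznext.comet.config.Settings
        import com.ebiznext.comet.job.metrics.AssertionJob
        import com.ebiznext.comet.schema.handlers.{SchemaHandler, StorageHandler}
        import com.ebiznext.comet.schema.model.{Engine, Stage}

        def buildAssertionJob(
            storageHandler: StorageHandler,
            schemaHandler: SchemaHandler,
            countViolations: String => Long // runs an assertion query, returns the violation count
        )(implicit settings: Settings): AssertionJob =
          new AssertionJob(
            domainName = "sales",
            schemaName = "orders",
            assertions = Map(
              // assertion name -> SQL expected to report zero violations (illustrative)
              "no_null_ids" -> "SELECT count(*) FROM comet_assertion WHERE id IS NULL"
            ),
            stage = Stage.UNIT,              // assumed Stage variant
            storageHandler = storageHandler,
            schemaHandler = schemaHandler,
            dataset = None,                  // no pre-loaded DataFrame
            engine = Engine.SPARK,           // assumed Engine variant
            sqlRunner = countViolations
          )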

Type Members

  1. type JdbcConfigName = String

    Definition Classes
    JobBase

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. def analyze(fullTableName: String): Any

    Attributes
    protected
    Definition Classes
    SparkJob
  5. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  6. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  7. def createSparkViews(views: Views, sqlParameters: Map[String, String]): Unit

    Attributes
    protected
    Definition Classes
    SparkJob
  8. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  9. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  10. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  11. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  12. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  13. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  14. def lockPath(path: String): Path

  15. val logger: Logger

    Attributes
    protected
    Definition Classes
    StrictLogging
  16. def name: String

    Definition Classes
    AssertionJob → JobBase
  17. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  18. final def notify(): Unit

    Definition Classes
    AnyRef
  19. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  20. def parseViewDefinition(valueWithEnv: String): (SinkType, Option[JdbcConfigName], String)

    valueWithEnv

    : a view reference in the form [SinkType:[configName:]]viewName

    returns

    (SinkType, configName, viewName)

    Attributes
    protected
    Definition Classes
    JobBase
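
    Example: a hedged sketch of the expected behaviour from within a JobBase subclass (the method is protected). The SinkType names (JDBC, BQ) and the parsed results shown in the comments are assumptions about the convention described above.

        // Only callable from a JobBase subclass, since parseViewDefinition is protected.
        val (sink, config, view) = parseViewDefinition("JDBC:mydb:customers")
        // assumed result: (SinkType.JDBC, Some("mydb"), "customers")

        val (sink2, config2, view2) = parseViewDefinition("BQ:revenue")
        // assumed result: (SinkType.BQ, None, "revenue")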
  21. def partitionDataset(dataset: DataFrame, partition: List[String]): DataFrame

    Attributes
    protected
    Definition Classes
    SparkJob
  22. def partitionedDatasetWriter(dataset: DataFrame, partition: List[String]): DataFrameWriter[Row]

    Partition a dataset using dataset columns. To partition the dataset using the ingestion time, use the reserved column names:

    • comet_date
    • comet_year
    • comet_month
    • comet_day
    • comet_hour
    • comet_minute

    These columns are renamed to "date", "year", "month", "day", "hour" and "minute" in the dataset, and their values are set to the current date/time.

    dataset

    : Input dataset

    partition

    : list of columns to use for partitioning.

    returns

    : a DataFrameWriter[Row] configured to partition the output by the given columns

    Attributes
    protected
    Definition Classes
    SparkJob
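
    Example: a minimal sketch from within a SparkJob subclass (the method is protected), partitioning the output by ingestion year and month through the reserved columns; the output path is illustrative.

        import org.apache.spark.sql.{DataFrame, DataFrameWriter, Row, SaveMode}

        def writeByIngestionTime(df: DataFrame): Unit = {
          // "comet_year" / "comet_month" are renamed to "year" / "month"
          // and filled with the current date, as described above.
          val writer: DataFrameWriter[Row] =
            partitionedDatasetWriter(df, List("comet_year", "comet_month"))
          writer.mode(SaveMode.Append).parquet("/tmp/assertions-out") // illustrative path
        }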
  23. def registerUdf(udf: String): Unit

    Attributes
    protected
    Definition Classes
    SparkJob
  24. def run(): Try[JobResult]

    The job's entry point; forces every job to implement its logic within this "run" method.

    returns

    : a Try wrapping the JobResult (a Spark DataFrame for Spark jobs, None otherwise)

    Definition Classes
    AssertionJob → JobBase
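
    Example: a minimal sketch of launching the job and handling the resulting Try; job is an AssertionJob built as shown in the constructor example.

        import scala.util.{Failure, Success}

        job.run() match {
          case Success(result) => println(s"assertions evaluated: $result")
          case Failure(cause)  => println(s"assertion job failed: ${cause.getMessage}")
        }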
  25. lazy val session: SparkSession

    Definition Classes
    SparkJob
  26. implicit val settings: Settings

    Definition Classes
    AssertionJob → JobBase
  27. lazy val sparkEnv: SparkEnv

    Definition Classes
    SparkJob
  28. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  29. def toString(): String

    Definition Classes
    AnyRef → Any
  30. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  31. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  32. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
