Package com.ebiznext.comet.job.transform


Type Members

  1. case class AutoTaskJob(name: String, defaultArea: StorageArea, format: Option[String], coalesce: Boolean, udf: Option[String], views: Views, engine: Engine, task: AutoTaskDesc, sqlParameters: Map[String, String])(implicit settings: Settings, storageHandler: StorageHandler, schemaHandler: SchemaHandler) extends SparkJob with Product with Serializable


    Execute the SQL task and store the result in parquet/orc/.... If Hive support is enabled, also store it as a Hive table. If analyze support is active, also compute basic statistics for the dataset.

    name
    : Job name as defined in the YML job description file

    defaultArea
    : Where the resulting dataset is stored by default if not specified in the task

    task
    : Task to run

    sqlParameters
    : SQL parameters to pass to SQL statements
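
    Note that the signature above has two parameter lists: the explicit job description, and an implicit list carrying the runtime dependencies (`Settings`, `StorageHandler`, `SchemaHandler`), which the compiler fills in from implicits in scope at the call site. A minimal self-contained sketch of that Scala pattern (`DemoSettings`, `DemoStorage`, and `DemoTaskJob` are simplified stand-ins for illustration, not the real Comet types):

    ```scala
    // Simplified stand-ins for Comet's Settings and StorageHandler.
    case class DemoSettings(appName: String)
    case class DemoStorage(root: String)

    // A case class with an implicit second parameter list, like AutoTaskJob.
    case class DemoTaskJob(name: String, sqlParameters: Map[String, String])(
        implicit settings: DemoSettings, storage: DemoStorage
    ) {
      def describe: String =
        s"${settings.appName}: job '$name' on ${storage.root} " +
          s"with ${sqlParameters.size} param(s)"
    }

    object Demo {
      // Implicits in scope are picked up automatically at construction time.
      implicit val settings: DemoSettings = DemoSettings("comet")
      implicit val storage: DemoStorage = DemoStorage("/tmp/datasets")

      def describeJob: String =
        DemoTaskJob("sales-report", Map("year" -> "2021")).describe

      def main(args: Array[String]): Unit =
        println(describeJob)
    }
    ```

    This style keeps call sites short: a caller only spells out the task-specific arguments, while shared infrastructure is threaded through implicitly.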
