DPJob

gcp4zio.dp.DPJob
See the DPJob companion trait
object DPJob

Attributes

Companion
trait
Supertypes
class Object
trait Matchable
class Any
Self type
DPJob.type

Members list

Value members

Concrete methods

def executeHiveJob(query: String, trackingInterval: Duration): RIO[DPJob, Job]

Submits a Hive job to the Dataproc cluster and waits until job completion.

Value parameters

query

Hive SQL query to run

trackingInterval

Interval between consecutive status checks while waiting for job completion

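A minimal usage sketch. The cluster, project, region, and endpoint values below are placeholders; replace them with your own environment's settings.

```scala
import zio._
import gcp4zio.dp._

object HiveJobExample extends ZIOAppDefault {

  // Placeholder cluster/project/region/endpoint values for illustration
  private val dpLayer = DPJob.live(
    cluster  = "my-cluster",
    project  = "my-project",
    region   = "us-central1",
    endpoint = "us-central1-dataproc.googleapis.com:443"
  )

  // Runs the query and checks job status every 10 seconds until completion
  override def run =
    DPJob
      .executeHiveJob("SELECT 1", trackingInterval = 10.seconds)
      .provide(dpLayer)
}
```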

def executeSparkJob(args: List[String], mainClass: String, libs: List[String], conf: Map[String, String], trackingInterval: Duration): RIO[DPJob, Job]

Submits a Spark job to the Dataproc cluster and waits until job completion.

Value parameters

args

Command-line arguments passed to the Spark application

conf

Key-value pairs of Spark properties

libs

List of JARs required to run this Spark application (including the application JAR)

mainClass

Main class to run

trackingInterval

Interval between consecutive status checks while waiting for job completion

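A usage sketch assuming the DPJob layer is provided elsewhere. The JAR location, main class, arguments, and Spark properties are placeholders.

```scala
import zio._
import gcp4zio.dp._

// Placeholder JAR location, main class, and arguments for illustration.
// Submits the job and polls status every 30 seconds until it completes.
val sparkEffect =
  DPJob.executeSparkJob(
    args             = List("--date", "2024-01-01"),
    mainClass        = "com.example.MySparkApp",
    libs             = List("gs://my-bucket/jars/my-spark-app.jar"),
    conf             = Map("spark.executor.memory" -> "4g"),
    trackingInterval = 30.seconds
  )
```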

def live(cluster: String, project: String, region: String, endpoint: String): TaskLayer[DPJob]

Creates the live layer required by all DPJob APIs.

Value parameters

cluster

Dataproc cluster name

endpoint

GCP Dataproc API endpoint

project

GCP project ID

region

GCP region name

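A sketch of wiring the layer into a program; all configuration values are placeholders. The endpoint is typically the regional Dataproc API endpoint.

```scala
import zio._
import gcp4zio.dp._

// Placeholder configuration values for illustration
val dpJobLayer: TaskLayer[DPJob] = DPJob.live(
  cluster  = "my-cluster",
  project  = "my-project",
  region   = "us-central1",
  endpoint = "us-central1-dataproc.googleapis.com:443"
)

// Any DPJob effect can then be run by providing this layer
val program = DPJob.submitHiveJob("SELECT 1").provide(dpJobLayer)
```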

def submitHiveJob(query: String): RIO[DPJob, Job]

Submits a Hive job to the Dataproc cluster. (This API does not wait for job completion; to wait, use trackJobProgress.)

Value parameters

query

Hive SQL query to run

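A fire-and-forget sketch; the query and table names are placeholders. The returned Job handle can later be passed to trackJobProgress.

```scala
import zio._
import gcp4zio.dp._

// Submits the query and returns the Job handle without waiting for completion
val submitted = DPJob.submitHiveJob("SELECT COUNT(*) FROM my_db.my_table")
```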

def submitSparkJob(args: List[String], mainClass: String, libs: List[String], conf: Map[String, String]): RIO[DPJob, Job]

Submits a Spark job to the Dataproc cluster. (This API does not wait for job completion; to wait, use trackJobProgress.)

Value parameters

args

Command-line arguments passed to the Spark application

conf

Key-value pairs of Spark properties

libs

List of JARs required to run this Spark application (including the application JAR)

mainClass

Main class to run

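A submission-only sketch; the JAR location and main class are placeholders. The effect returns as soon as the job has been submitted.

```scala
import zio._
import gcp4zio.dp._

// Placeholder JAR location and main class; returns immediately after submission
val submittedSpark =
  DPJob.submitSparkJob(
    args      = List("--date", "2024-01-01"),
    mainClass = "com.example.MySparkApp",
    libs      = List("gs://my-bucket/jars/my-spark-app.jar"),
    conf      = Map.empty
  )
```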

def trackJobProgress(job: Job, interval: Duration): RIO[DPJob, Unit]

Tracks the given job until completion.

Value parameters

interval

Interval between consecutive status checks while waiting for job completion

job

Dataproc Job which needs to be tracked

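A sketch combining submit and track: submit a job without waiting, then poll it explicitly. The query and polling interval are placeholders.

```scala
import zio._
import gcp4zio.dp._

// Submit without waiting, then track the job until completion
val submitThenTrack: RIO[DPJob, Unit] =
  for {
    job <- DPJob.submitHiveJob("SELECT 1")
    _   <- DPJob.trackJobProgress(job, interval = 15.seconds)
  } yield ()
```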