Context combinators complete a job definition: they can take additional arguments plus a Spark computational context.
Available contexts:
There are two ways to define a job using these standard combinators:
For a job that doesn't require any external arguments besides the context, use one of the functions below:
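For instance, a minimal sketch of a no-argument job, assuming the `onSparkContext` combinator used later in this section can be called directly, as this paragraph implies (the job body itself is illustrative):

```scala
import mist.api._
import org.apache.spark.SparkContext

// A job that needs only the SparkContext: no withArgs required.
onSparkContext((sc: SparkContext) => {
  sc.parallelize(1 to 10).map(_ * 2).collect()
})
```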
When the job takes arguments, declare them with `withArgs` and call one of those functions on the result:
```scala
withArgs(arg[Int]("x") & arg[String]("str")).onSparkContext(
  (x: Int, str: String, sc: SparkContext) => { ... }
)
```
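The declared names (`x`, `str`) identify the values the caller is expected to supply; when the job runs, those values are passed to the function in declaration order, followed by the managed `SparkContext`.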
Access to mist-specific job parameters and the logger
Use `withMistExtras` to get access to mist-extras in a job definition. Example:
```scala
withMistExtras.onSparkContext((extras: MistExtras, sc: SparkContext) => {
  val jobId = extras.jobId
  extras.logger.info(s"Hello from my job $jobId")
})
```
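A hedged sketch of combining the two combinators above, assuming `withArgs` and `withMistExtras` chain as the combinator style suggests (this excerpt does not state it explicitly):

```scala
withArgs(arg[Int]("n")).withMistExtras.onSparkContext(
  (n: Int, extras: MistExtras, sc: SparkContext) => {
    // Log through the mist-provided logger, then use the context.
    extras.logger.info(s"Job ${extras.jobId} received n=$n")
    sc.parallelize(1 to n).sum()
  }
)
```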
Scala API: the root class for job definitions.
Example:
```scala
import mist.api._
import mist.api.DefaultEncoders._
import org.apache.spark.SparkContext

object MyJob extends MistJob[Array[Int]] {
  override def handle = {
    withArgs(arg[Int]("number")).onSparkContext((i: Int, sc: SparkContext) => {
      sc.parallelize(1 to i).map(_ * 2).collect()
    })
  }
}
```
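Here the `DefaultEncoders` import supplies the encoder that turns the `Array[Int]` result into a job response, and the overridden `handle` wires the argument and context combinators into the job's entry point.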
Arguments for constructing contexts
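For illustration, a sketch of composing several argument definitions with the `&` combinator seen above; the argument names and types here are hypothetical:

```scala
// Combine multiple named, typed arguments; the resulting function
// takes them in declaration order, followed by the context.
val myArgs = arg[Int]("n") & arg[String]("prefix")

withArgs(myArgs).onSparkContext((n: Int, prefix: String, sc: SparkContext) => {
  sc.parallelize(1 to n).map(i => s"$prefix-$i").collect()
})
```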