com.krux.hyperion.activity

SparkTaskActivity

case class SparkTaskActivity extends EmrTaskActivity[SparkCluster] with Product with Serializable

Runs a Spark job on a cluster. The cluster can be an EMR cluster managed by AWS Data Pipeline, or another resource if you use TaskRunner. Use SparkTaskActivity when you want to run work in parallel; this lets you use the scheduling resources of the YARN framework (or the MapReduce resource negotiator in Hadoop 1). If you would like to run work sequentially using the Amazon EMR Step action, you can use SparkActivity instead.
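A minimal construction sketch. The apply signature below (jar URI and main class, with the cluster resource in a second parameter list) and the jar path are assumptions for illustration, not confirmed by this page:

```scala
import com.krux.hyperion.Implicits._          // String → HString conversions
import com.krux.hyperion.activity.SparkTaskActivity
import com.krux.hyperion.resource.SparkCluster

// A SparkCluster resource for the activity to run on (defaults assumed).
val cluster = SparkCluster()

// Assumed companion apply: application jar and main class, then runsOn.
val activity = SparkTaskActivity(
  "s3://my-bucket/jars/my-spark-app.jar",     // jarUri (hypothetical path)
  "com.example.MySparkJob"                    // mainClass
)(cluster)
```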

Source
SparkTaskActivity.scala
Linear Supertypes
Serializable, Serializable, Product, Equals, EmrTaskActivity[SparkCluster], EmrActivity[SparkCluster], PipelineActivity[SparkCluster], NamedPipelineObject, PipelineObject, Ordered[PipelineObject], Comparable[PipelineObject], AnyRef, Any

Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. def <(that: PipelineObject): Boolean

    Definition Classes
    Ordered
  5. def <=(that: PipelineObject): Boolean

    Definition Classes
    Ordered
  6. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  7. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  8. def >(that: PipelineObject): Boolean

    Definition Classes
    Ordered
  9. def >=(that: PipelineObject): Boolean

    Definition Classes
    Ordered
  10. val activityFields: ActivityFields[SparkCluster]

    Definition Classes
    SparkTaskActivity → PipelineActivity
  11. val arguments: Seq[HString]

  12. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  13. def attemptTimeout: Option[HDuration]

    Definition Classes
    PipelineActivity
  14. val baseFields: BaseFields

    Definition Classes
    SparkTaskActivity → NamedPipelineObject
  15. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  16. def compare(that: PipelineObject): Int

    Definition Classes
    PipelineObject → Ordered
  17. def compareTo(that: PipelineObject): Int

    Definition Classes
    Ordered → Comparable
  18. def dependsOn: Seq[PipelineActivity[_]]

    Definition Classes
    PipelineActivity
  19. val emrTaskActivityFields: EmrTaskActivityFields

    Definition Classes
    SparkTaskActivity → EmrTaskActivity
  20. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  21. def failureAndRerunMode: Option[FailureAndRerunMode]

    Definition Classes
    PipelineActivity
  22. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  23. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  24. def groupedBy(group: String): Self

    Postfix the name field

    Definition Classes
    NamedPipelineObject
  25. val hadoopQueue: Option[HString]

  26. def id: PipelineObjectId

    Definition Classes
    NamedPipelineObject → PipelineObject
  27. def idGroupedBy(group: String): Self

    Have a grouping postfix in the id field

    Definition Classes
    NamedPipelineObject
    Note

    Id naming is more restrictive; it is recommended not to change the id unless you have a good reason

  28. def idNamed(namePrefix: String): Self

    Id field will be prefixed with name

    Definition Classes
    NamedPipelineObject
    Note

    Id naming is more restrictive; it is recommended not to change the id unless you have a good reason

  29. val inputs: Seq[S3DataNode]

  30. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  31. val jarUri: HString

  32. val jobRunner: HString

  33. def lateAfterTimeout: Option[HDuration]

    Definition Classes
    PipelineActivity
  34. val mainClass: MainClass

  35. def maximumRetries: Option[HInt]

    Definition Classes
    PipelineActivity
  36. def name: Option[String]

    Name of the pipeline object; if not set, it defaults to Option(id)
    Definition Classes
    NamedPipelineObject
  37. def named(namePrefix: String): Self

    Give the object a name prefix

    Definition Classes
    NamedPipelineObject
  38. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  39. final def notify(): Unit

    Definition Classes
    AnyRef
  40. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  41. def objects: Iterable[PipelineObject]

  42. def onFail(alarms: SnsAlarm*): Self

    Definition Classes
    PipelineActivity
  43. def onFailAlarms: Seq[SnsAlarm]

    Definition Classes
    PipelineActivity
  44. def onLateAction(alarms: SnsAlarm*): Self

    Definition Classes
    PipelineActivity
  45. def onLateActionAlarms: Seq[SnsAlarm]

    Definition Classes
    PipelineActivity
  46. def onSuccess(alarms: SnsAlarm*): Self

    Definition Classes
    PipelineActivity
  47. def onSuccessAlarms: Seq[SnsAlarm]

    Definition Classes
    PipelineActivity
  48. val outputs: Seq[S3DataNode]

  49. def postActivityTaskConfig: Option[ShellScriptConfig]

    Definition Classes
    EmrTaskActivity
  50. def preActivityTaskConfig: Option[ShellScriptConfig]

    Definition Classes
    EmrTaskActivity
  51. def preconditions: Seq[Precondition]

    Definition Classes
    PipelineActivity
  52. def ref: AdpRef[AdpActivity]

    Definition Classes
    PipelineActivity → PipelineObject
  53. def retryDelay: Option[HDuration]

    Definition Classes
    PipelineActivity
  54. def runsOn: Resource[SparkCluster]

    Definition Classes
    PipelineActivity
  55. val scriptRunner: HString

  56. implicit def seq2Option[A](anySeq: Seq[A]): Option[Seq[A]]

    Definition Classes
    PipelineObject
  57. def seqToOption[A, B](anySeq: Seq[A])(transform: (A) ⇒ B): Option[Seq[B]]

    Definition Classes
    PipelineObject
  58. lazy val serialize: AdpHadoopActivity

  59. val sparkConfig: Map[HString, HString]

  60. val sparkOptions: Seq[HString]

  61. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  62. implicit def uniquePipelineId2String(id: PipelineObjectId): String

    Definition Classes
    PipelineObject
  63. def updateActivityFields(fields: ActivityFields[SparkCluster]): SparkTaskActivity

    Definition Classes
    SparkTaskActivity → PipelineActivity
  64. def updateBaseFields(fields: BaseFields): SparkTaskActivity

    Definition Classes
    SparkTaskActivity → NamedPipelineObject
  65. def updateEmrTaskActivityFields(fields: EmrTaskActivityFields): SparkTaskActivity

    Definition Classes
    SparkTaskActivity → EmrTaskActivity
  66. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  67. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  68. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  69. def whenMet(conditions: Precondition*): Self

    Definition Classes
    PipelineActivity
  70. def withArguments(args: HString*): SparkTaskActivity

  71. def withAttemptTimeout(duration: HDuration): Self

    Definition Classes
    PipelineActivity
  72. def withDriverCores(n: HInt): SparkTaskActivity

  73. def withDriverMemory(memory: Memory): SparkTaskActivity

  74. def withExecutorCores(n: HInt): SparkTaskActivity

  75. def withExecutorMemory(memory: Memory): SparkTaskActivity

  76. def withFailureAndRerunMode(mode: FailureAndRerunMode): Self

    Definition Classes
    PipelineActivity
  77. def withFiles(files: HString*): SparkTaskActivity

  78. def withHadoopQueue(queue: HString): SparkTaskActivity

  79. def withInput(input: S3DataNode*): SparkTaskActivity

  80. def withLateAfterTimeout(duration: HDuration): Self

    Definition Classes
    PipelineActivity
  81. def withMaster(master: HString): SparkTaskActivity

  82. def withMaximumRetries(retries: HInt): Self

    Definition Classes
    PipelineActivity
  83. def withNumExecutors(n: HInt): SparkTaskActivity

  84. def withOutput(output: S3DataNode*): SparkTaskActivity

  85. def withPostActivityTaskConfig(config: ShellScriptConfig): Self

    Definition Classes
    EmrTaskActivity
  86. def withPreActivityTaskConfig(config: ShellScriptConfig): Self

    Definition Classes
    EmrTaskActivity
  87. def withRetryDelay(duration: HDuration): Self

    Definition Classes
    PipelineActivity
  88. def withSparkConfig(key: HString, value: HString): SparkTaskActivity

  89. def withSparkOption(option: HString*): SparkTaskActivity

  90. def withTotalExecutorCores(n: HInt): SparkTaskActivity

Inherited from Serializable

Inherited from Serializable

Inherited from Product

Inherited from Equals

Inherited from EmrTaskActivity[SparkCluster]

Inherited from EmrActivity[SparkCluster]

Inherited from PipelineActivity[SparkCluster]

Inherited from NamedPipelineObject

Inherited from PipelineObject

Inherited from Ordered[PipelineObject]

Inherited from Comparable[PipelineObject]

Inherited from AnyRef

Inherited from Any

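In practice the activity is embedded in a hyperion pipeline definition. The sketch below assumes a DataPipelineDef trait with schedule and workflow members and an implicit conversion from an activity to a WorkflowExpression; the schedule builder and object names are hypothetical:

```scala
import com.krux.hyperion.Implicits._
import com.krux.hyperion.{DataPipelineDef, Schedule, WorkflowExpression}
import com.krux.hyperion.activity.SparkTaskActivity
import com.krux.hyperion.resource.SparkCluster

object MySparkPipeline extends DataPipelineDef {

  // Assumed Schedule builder: run once when the pipeline is activated.
  val schedule = Schedule.cron.startAtActivation

  val cluster = SparkCluster()

  val sparkJob = SparkTaskActivity(
    "s3://my-bucket/jars/my-app.jar",   // hypothetical jar location
    "com.example.MySparkJob"
  )(cluster)

  // Single-activity workflow; multiple activities would be ordered
  // with the workflow DSL before being returned here.
  def workflow: WorkflowExpression = sparkJob
}
```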