case class Pipeline(name: String, pipelineProperties: PipelineProperties, graph: DirectedAcyclicGraph, objectProperties: Map[String, Properties]) extends Product with Serializable

The Pipeline holds all the data and logic to execute a CodeFeedr job. It stores all stages (Flink jobs) and connects them by setting up buffers (like Kafka).

name

The name of the pipeline.

graph

The graph of stages (nodes) and edges (buffers).

objectProperties

The properties of each stage.
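
A Pipeline is normally assembled through a builder rather than by calling the constructor directly. The sketch below is a minimal, hedged example: it assumes a CodeFeedr PipelineBuilder with append/build methods and two hypothetical stage implementations (MyInputStage, MyOutputStage); the import path may differ in your version.

  // Import path assumed; adjust to the actual CodeFeedr package layout.
  import org.codefeedr.pipeline.{Pipeline, PipelineBuilder}

  // MyInputStage and MyOutputStage are hypothetical stage implementations.
  val pipeline: Pipeline = new PipelineBuilder()
    .append(new MyInputStage())   // first stage (node in the DAG)
    .append(new MyOutputStage())  // next stage, connected through a buffer (edge)
    .build()

  // Run every stage in one Flink environment, connected with buffers.
  pipeline.startLocal()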

Linear Supertypes
Serializable, Serializable, Product, Equals, AnyRef, Any

Instance Constructors

  1. new Pipeline(name: String, pipelineProperties: PipelineProperties, graph: DirectedAcyclicGraph, objectProperties: Map[String, Properties])

    name

    The name of the pipeline.

    graph

    The graph of stages (nodes) and edges (buffers).

    objectProperties

    The properties of each stage.

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. var _environment: StreamExecutionEnvironment

    The mutable StreamExecutionEnvironment.

  5. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  6. def bufferProperties: Properties

    Auxiliary method to retrieve buffer properties.

  7. def bufferType: BufferType

    Auxiliary method to retrieve buffer type.

  8. def clone(): AnyRef
    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @native() @throws( ... )
  9. val environment: StreamExecutionEnvironment

    Immutable StreamExecutionEnvironment.

    By default the TimeCharacteristic is set to EventTime.

  10. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  11. def finalize(): Unit
    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  12. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  13. val graph: DirectedAcyclicGraph
  14. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  15. def keyManager: KeyManager

    Auxiliary method to retrieve key manager.

  16. var name: String
  17. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  18. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  19. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  20. val objectProperties: Map[String, Properties]
  21. val pipelineProperties: PipelineProperties
  22. def prepare(): Unit

    Prepare the pipeline by adding this instance to every stage.

  23. def propertiesOf[U <: Serializable, V <: Serializable](stage: Stage[U, V]): Properties

    Get the properties of a stage.

    stage

    The stage to retrieve properties from.

    returns

    The obtained properties.
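
    For example (a hedged sketch): myStage below is a hypothetical stage instance that was appended to this pipeline.

      // myStage is a hypothetical stage registered in the pipeline graph.
      val props = pipeline.propertiesOf(myStage)

      // Assuming the CodeFeedr Properties type exposes a get(key) accessor;
      // the key name is illustrative only.
      val parallelism = props.get("parallelism")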

  24. def showList(asException: Boolean): Unit

    Shows a list of stages inside the pipeline, with the option to throw the list as an exception to get the data through Flink.

    asException

    Whether to throw the list as an exception.

  25. def start(runtime: RuntimeType, stage: String = null, groupId: String = null): Unit

    Start the pipeline with a run configuration.

    runtime

    Runtime mode (mock, local, clustered).

    stage

    Stage to start in case of a clustered run.

    groupId

    Group id of the stage (by default set to stage id).

  26. def start(args: Array[String]): Unit

    Start the pipeline with a list of command line arguments.

    Three run modes are available:

    - Mock: Flink jobs are linked without buffers; this requires the pipeline to be sequential.
    - Local: all Flink jobs run in one StreamExecutionEnvironment, connected with buffers.
    - Cluster: Flink jobs run separately and are connected with buffers.

    args

    Array of command line arguments.
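
    In practice the arguments are usually forwarded straight from the program entry point, so the run mode can be chosen at launch time. A minimal sketch (buildPipeline is a hypothetical helper that assembles the Pipeline):

      object Main {
        def main(args: Array[String]): Unit = {
          val pipeline = buildPipeline() // hypothetical helper returning a Pipeline

          // The run mode (mock, local or cluster) is selected from the
          // command line arguments passed by the caller.
          pipeline.start(args)
        }
      }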

  27. def startClustered(stage: String, groupId: String = null): Unit

    Run the pipeline in a clustered manner: run a single stage only.

    stage

    Stage to run.

    groupId

    GroupId to set.
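
    For example, a clustered deployment submits one job per stage, each started with its own stage id. The id below is hypothetical:

      // Start only the stage with id "my-input-stage"; the group id
      // defaults to the stage id when omitted.
      pipeline.startClustered("my-input-stage")

      // Or override the group id explicitly.
      pipeline.startClustered("my-input-stage", groupId = "my-group")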

  28. def startLocal(): Unit

    Start a locally run pipeline. This mode starts every stage in the same Flink environment but with buffers.

  29. def startMock(): Unit

    Run the pipeline as mock. Only works for sequential pipelines.

    In a mock run, all stages are put together without buffers and run as a single Flink job.
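
    As a quick comparison, the same (sequential) pipeline can be run without buffers during development, or locally with buffers:

      // Mock run: stages are chained directly into a single Flink job,
      // so the pipeline must be sequential (a linear chain of stages).
      pipeline.startMock()

      // Local run: every stage runs in the same StreamExecutionEnvironment,
      // but stages communicate through buffers (e.g. Kafka).
      pipeline.startLocal()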

  30. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  31. def validateUniqueness(): Unit

    Validates the uniqueness of the stage IDs, needed for clustered running.

  32. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  33. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  34. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @native() @throws( ... )
