case class Pipeline(name: String, pipelineProperties: PipelineProperties, graph: DirectedAcyclicGraph, objectProperties: Map[String, Properties]) extends Product with Serializable
The Pipeline holds all the data and logic to execute a CodeFeedr job. It stores all stages (Flink jobs) and connects them by setting up buffers (like Kafka).
- name
The name of the pipeline.
- pipelineProperties
The properties of the pipeline itself (as opposed to the per-stage objectProperties).
- graph
The graph of stages (nodes) and edges (buffers).
- objectProperties
The properties of each stage.
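A minimal usage sketch (the package path, buildMyPipeline and the way the pipeline is assembled are assumptions, not part of this API):

```scala
// Package path assumed; adjust to where Pipeline actually lives in your project.
import org.codefeedr.pipeline.Pipeline

object Main {
  def main(args: Array[String]): Unit = {
    // buildMyPipeline() stands in for whatever code assembles the stages into a Pipeline.
    val pipeline: Pipeline = buildMyPipeline()
    // Pick the run mode (mock, local or clustered) from the command line arguments.
    pipeline.start(args)
  }

  // Hypothetical helper; construction of the DAG and properties is out of scope here.
  def buildMyPipeline(): Pipeline = ???
}
```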
Linear Supertypes
Serializable, Serializable, Product, Equals, AnyRef, Any
Instance Constructors
- new Pipeline(name: String, pipelineProperties: PipelineProperties, graph: DirectedAcyclicGraph, objectProperties: Map[String, Properties])
- name
The name of the pipeline.
- pipelineProperties
The properties of the pipeline itself (as opposed to the per-stage objectProperties).
- graph
The graph of stages (nodes) and edges (buffers).
- objectProperties
The properties of each stage.
Value Members
- final def !=(arg0: Any): Boolean
Definition Classes: AnyRef → Any
- final def ##(): Int
Definition Classes: AnyRef → Any
- final def ==(arg0: Any): Boolean
Definition Classes: AnyRef → Any
- var _environment: StreamExecutionEnvironment
The mutable StreamExecutionEnvironment.
- final def asInstanceOf[T0]: T0
Definition Classes: Any
- def bufferProperties: Properties
Auxiliary method to retrieve buffer properties.
- def bufferType: BufferType
Auxiliary method to retrieve buffer type.
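A minimal sketch, assuming `pipeline` is an already-constructed Pipeline instance:

```scala
// Read the pipeline-wide buffer configuration.
val bufferProps: Properties = pipeline.bufferProperties
val bufferKind: BufferType = pipeline.bufferType
```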
- def clone(): AnyRef
Attributes: protected[java.lang]
Definition Classes: AnyRef
Annotations: @native() @throws( ... )
- val environment: StreamExecutionEnvironment
The immutable StreamExecutionEnvironment. By default the TimeCharacteristic is set to EventTime.
- final def eq(arg0: AnyRef): Boolean
Definition Classes: AnyRef
- def finalize(): Unit
Attributes: protected[java.lang]
Definition Classes: AnyRef
Annotations: @throws( classOf[java.lang.Throwable] )
- final def getClass(): Class[_]
Definition Classes: AnyRef → Any
Annotations: @native()
- val graph: DirectedAcyclicGraph
- final def isInstanceOf[T0]: Boolean
Definition Classes: Any
- def keyManager: KeyManager
Auxiliary method to retrieve the key manager.
- var name: String
- final def ne(arg0: AnyRef): Boolean
Definition Classes: AnyRef
- final def notify(): Unit
Definition Classes: AnyRef
Annotations: @native()
- final def notifyAll(): Unit
Definition Classes: AnyRef
Annotations: @native()
- val objectProperties: Map[String, Properties]
- val pipelineProperties: PipelineProperties
- def prepare(): Unit
Prepare the pipeline by adding this instance to every stage.
- def propertiesOf[U <: Serializable, V <: Serializable](stage: Stage[U, V]): Properties
Get the properties of a stage (see the sketch below).
- stage
The stage to retrieve properties from.
- returns
The obtained properties.
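A minimal sketch, assuming `pipeline` is an already-constructed Pipeline and `myStage` is a hypothetical Stage[A, B] that was added to it:

```scala
// Look up the per-stage Properties registered under objectProperties.
val stageProps: Properties = pipeline.propertiesOf(myStage)
```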
- def showList(asException: Boolean): Unit
Shows a list of the stages inside the pipeline; optionally throws the list as an exception so the data can be retrieved through Flink (see the sketch below).
- asException
Throws the list as an exception.
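A minimal sketch, assuming `pipeline` is an already-constructed Pipeline:

```scala
// Print the stage list; pass asException = true to throw it instead,
// so the information can be read back through a running Flink job.
pipeline.showList(asException = false)
```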
- def start(runtime: RuntimeType, stage: String = null, groupId: String = null): Unit
Start the pipeline with a run configuration (see the sketch below).
- runtime
Runtime mode (mock, local, clustered).
- stage
Stage to start in case of a clustered run.
- groupId
Group id of the stage (by default set to the stage id).
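A minimal sketch; the RuntimeType member names are assumed from the run modes listed above, and "my-stage-id" is a hypothetical stage id:

```scala
// Run every stage in one local StreamExecutionEnvironment, connected by buffers.
pipeline.start(RuntimeType.Local)

// Run a single stage, as on a cluster where each stage is its own Flink job.
pipeline.start(RuntimeType.Cluster, stage = "my-stage-id")
```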
- def start(args: Array[String]): Unit
Start the pipeline with a list of command line arguments. Three run modes are available:
- Mock: Flink jobs are linked without a buffer; this requires the pipeline to be sequential.
- Local: all Flink jobs are run in one StreamExecutionEnvironment, connected with buffers.
- Cluster: Flink jobs are run separately and connected with buffers.
- args
Array of command line arguments.
- def startClustered(stage: String, groupId: String = null): Unit
Run the pipeline in a clustered manner: run a single stage only (see the sketch below).
- stage
Stage to run.
- groupId
Group id to set.
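A minimal sketch; "my-stage-id" is a hypothetical stage id that must be unique within the pipeline:

```scala
// Start only the stage with this id; the group id defaults to the stage id.
pipeline.startClustered("my-stage-id")
```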
- def startLocal(): Unit
Start a locally run pipeline. This mode starts every stage in the same Flink environment, but with buffers.
- def startMock(): Unit
Run the pipeline as mock. This only works for sequential pipelines: all stages are put together without buffers and run as a single Flink job.
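A minimal sketch, assuming `pipeline` is a sequential (single-chain) Pipeline:

```scala
// All stages are linked directly, with no buffers in between, as one Flink job.
pipeline.startMock()
```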
- final def synchronized[T0](arg0: ⇒ T0): T0
Definition Classes: AnyRef
- def validateUniqueness(): Unit
Validates the uniqueness of the stage IDs, needed for clustered running.
- final def wait(): Unit
Definition Classes: AnyRef
Annotations: @throws( ... )
- final def wait(arg0: Long, arg1: Int): Unit
Definition Classes: AnyRef
Annotations: @throws( ... )
- final def wait(arg0: Long): Unit
Definition Classes: AnyRef
Annotations: @native() @throws( ... )