package pipeline
Type Members
-
case class
Context(env: StreamExecutionEnvironment = null, stageId: String, stageProperties: Properties = null, pipeline: Pipeline = null) extends Product with Serializable
The context of this stage. A construction sketch is given after the parameter list below.
- env
The execution environment it is running in.
- stageId
The name of this stage.
- stageProperties
The properties of this stage.
- pipeline
The pipeline this stage belongs to.
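Because env, stageProperties and pipeline default to null, a Context can be created from a stage id alone. A minimal construction sketch based only on the signature above (the id and property key are illustrative, not defined by this listing):

import java.util.Properties
import org.apache.flink.streaming.api.scala.StreamExecutionEnvironment

// Context with only a stage id; the remaining fields fall back to their null defaults.
val minimal = Context(stageId = "parse-stage")

// Copy in an execution environment and stage properties once they are available.
val env = StreamExecutionEnvironment.getExecutionEnvironment
val stageProps = new Properties()
stageProps.setProperty("parallelism", "2") // illustrative key, not defined by this listing
val full = minimal.copy(env = env, stageProperties = stageProps)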
-
final
class
DirectedAcyclicGraph extends AnyRef
A directed acyclic graph. Every Pipeline is constrained to a DAG, so that data cannot flow in a loop. Stages are nodes in this graph, whereas edges represent data flow through a org.codefeedr.buffer.Buffer.
This class is immutable, so a graph can be built in a functional manner:
val dag = new DirectedAcyclicGraph()
  .addNode(nodeOne)
  .addNode(nodeTwo)
  .addEdge(nodeOne, nodeTwo)
-
final
case class
Edge(from: AnyRef, to: AnyRef) extends Product with Serializable
Links two nodes.
-
final
case class
EmptyPipelineException(message: String = "", cause: Throwable = None.orNull) extends Exception with Product with Serializable
Thrown when a pipeline is empty.
-
final
case class
InvalidPipelineException(message: String = "", cause: Throwable = None.orNull) extends Exception with Product with Serializable
Thrown when an invalid pipeline is built.
-
final
case class
NoSinkException(message: String = "", cause: Throwable = None.orNull) extends Exception with Product with Serializable
Thrown when a pipeline has no sink.
-
final
case class
NoSourceException(message: String = "", cause: Throwable = None.orNull) extends Exception with Product with Serializable
Thrown when a pipeline has no source.
-
case class
Pipeline(name: String, pipelineProperties: PipelineProperties, graph: DirectedAcyclicGraph, objectProperties: Map[String, Properties]) extends Product with Serializable
The Pipeline holds all the data and logic to execute a CodeFeedr job. It stores all stages (Flink jobs) and connects them by setting up buffers (like Kafka).
- name
The name of the pipeline.
- pipelineProperties
The PipelineProperties of the pipeline (buffer, key manager, time characteristic, etc.).
- graph
The graph of stages (nodes) and edges (buffers).
- objectProperties
The properties of each stage.
-
class
PipelineBuilder extends Logging
Mutable class to build a Pipeline. A pipeline represents a set of stages interconnected with a org.codefeedr.buffer.Buffer. This pipeline gets translated into a DirectedAcyclicGraph.
This builder allows for setting the following properties:
- Name of the pipeline.
- Type of buffer (e.g. Kafka) and properties for this buffer.
- Type of pipeline: sequential or DAG (non-sequential).
- Properties for all the stages.
- KeyManager for all the stages.
A usage sketch is given below.
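A sketch of typical builder usage, assuming fluent setter and append methods; the method names below (setPipelineName, setBufferType, append, build) and the BufferType import path follow the properties listed above but are assumptions, not confirmed by this listing.

import org.codefeedr.buffer.BufferType // import path assumed

// MyStageA and MyStageB are hypothetical Stage implementations.
val pipeline: Pipeline = new PipelineBuilder()
  .setPipelineName("commit-analysis") // name of the pipeline (assumed method)
  .setBufferType(BufferType.Kafka)    // buffer connecting the stages (assumed enum member)
  .append(new MyStageA)               // sequential pipeline: stages in append order
  .append(new MyStageB)
  .build()                            // translated into a Pipeline backed by a DirectedAcyclicGraph

For a DAG (non-sequential) pipeline, edges between stages are declared explicitly rather than implied by append order; see the PipelineBuilder members for the exact edge-related methods.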
-
final
case class
PipelineListException(json: String) extends Exception with Product with Serializable
Thrown to list all stages in a pipeline. It is not really an exception, but it is necessary since Flink has no graceful shutdown.
-
case class
PipelineProperties(bufferType: BufferType, bufferProperties: Properties, keyManager: KeyManager, streamTimeCharacteristic: TimeCharacteristic, restartStrategy: RestartStrategyConfiguration, checkpointing: Option[Long], checkpointingMode: CheckpointingMode, stateBackend: StateBackend) extends Product with Serializable
Properties of a pipeline are stored in this case class. A construction sketch is given after the parameter list below.
- bufferType
The type of org.codefeedr.buffer.Buffer (e.g. Kafka).
- bufferProperties
The properties of the Buffer.
- keyManager
The key manager, which provides API call management at stage level.
- streamTimeCharacteristic
The TimeCharacteristic of the whole pipeline: Event, Ingestion or Processing time.
- restartStrategy
The RestartStrategy of the whole pipeline.
- checkpointing
Captures whether checkpointing is enabled and, if so, what the interval is.
- checkpointingMode
The CheckpointingMode used when checkpointing is enabled (e.g. exactly-once).
- stateBackend
The StateBackend that stores the pipeline state.
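A hedged construction sketch. The Flink values (TimeCharacteristic, RestartStrategies, CheckpointingMode, MemoryStateBackend) are standard Flink API; the CodeFeedr-specific values (BufferType.Kafka, the null key manager placeholder) are assumptions based on the parameter docs above and may differ from the real members.

import java.util.Properties
import org.apache.flink.api.common.restartstrategy.RestartStrategies
import org.apache.flink.runtime.state.memory.MemoryStateBackend
import org.apache.flink.streaming.api.{CheckpointingMode, TimeCharacteristic}
import org.codefeedr.buffer.BufferType // import path assumed

val bufferProps = new Properties()
bufferProps.setProperty("bootstrap.servers", "localhost:9092") // illustrative Kafka setting

val pipelineProps = PipelineProperties(
  bufferType = BufferType.Kafka,                    // assumed enum member ("e.g. Kafka")
  bufferProperties = bufferProps,
  keyManager = null,                                // a real KeyManager implementation would go here
  streamTimeCharacteristic = TimeCharacteristic.EventTime,
  restartStrategy = RestartStrategies.fixedDelayRestart(3, 10000L),
  checkpointing = Some(10000L),                     // checkpoint every 10 seconds
  checkpointingMode = CheckpointingMode.EXACTLY_ONCE,
  stateBackend = new MemoryStateBackend()
)

In practice these values are normally set through PipelineBuilder rather than by constructing PipelineProperties directly.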
-
abstract
class
Stage[In <: Serializable, Out <: Serializable] extends AnyRef
This class represents a stage within a pipeline, i.e. a node in the graph. A minimal subclass sketch is given below the type parameters.
- In
Input type for this stage.
- Out
Output type for this stage.
- Attributes
- protected[org.codefeedr]
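A minimal subclass sketch, assuming the abstract member is a transform-style method from DataStream[In] to DataStream[Out]; the exact abstract method and any constructor parameters are not shown in this listing and may differ.

import org.apache.flink.streaming.api.scala._

// Hypothetical event types; case classes satisfy the Serializable bound.
case class RawEvent(id: String, payload: String)
case class ParsedEvent(id: String, length: Int)

// Hypothetical stage: the transform signature below is assumed, not confirmed here.
class ParseStage extends Stage[RawEvent, ParsedEvent] {
  override def transform(source: DataStream[RawEvent]): DataStream[ParsedEvent] =
    source.map(event => ParsedEvent(event.id, event.payload.length))
}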
-
abstract
class
Stage2[In <: Serializable, In2 <: Serializable, Out <: Serializable] extends Stage[In, Out]
A stage with 2 sources and 1 output.
- In
Input type for this stage.
- In2
Second input type for this stage.
- Out
Output type for this stage.
-
abstract
class
Stage3[In <: Serializable, In2 <: Serializable, In3 <: Serializable, Out <: Serializable] extends Stage2[In, In2, Out]
A stage with 3 sources and 1 output.
- In
Input type for this stage.
- In2
Second input type for this stage.
- In3
Third input type of this stage.
- Out
Output type for this stage.
-
abstract
class
Stage4[In <: Serializable, In2 <: Serializable, In3 <: Serializable, In4 <: Serializable, Out <: Serializable] extends Stage3[In, In2, In3, Out]
A stage with 4 sources and 1 output.
- In
Input type for this stage.
- In2
Second input type for this stage.
- In3
Third input type of this stage.
- In4
Fourth input type of this stage.
- Out
Output type for this stage.
-
final
case class
StageIdsNotUniqueException(stage: String) extends Exception with Product with Serializable
Thrown when there are conflicting/duplicate stage ids.
-
class
StageList extends Serializable
Holds a list of Stage objects.
-
final
case class
StageNotFoundException(message: String = "", cause: Throwable = None.orNull) extends Exception with Product with Serializable
Thrown when a stage is not found.
-
final
case class
StageTypesIncompatibleException(message: String = "", cause: Throwable = None.orNull) extends Exception with Product with Serializable
Thrown when two connected stages have incompatible types.
Value Members
-
object
PipelineType extends Enumeration
Types for different pipelines.
Two types of pipelines are available:
- Sequential: as the name suggests, contains stages in linear order only.
- DAG: a pipeline in the form of a DirectedAcyclicGraph.
-
object
RuntimeType extends Enumeration
Types for different run-times.
Three types of run modes are available:
- Mock: Flink jobs are linked without a buffer; this requires the pipeline to be sequential.
- Local: all Flink jobs are run in one org.apache.flink.streaming.api.scala.StreamExecutionEnvironment, connected with buffers.
- Cluster: Flink jobs are run separately and connected with buffers.