The default implementation of SparkStreamingRuntime.
Filters the stream based on a combination of boolean conditions against fields.
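Filtering on combined boolean field conditions can be sketched in plain Scala as predicate combinators over rows modeled as `Map[String, Any]`; the names `Condition`, `and`, `fieldGt`, and `fieldEquals` below are illustrative, not ignition's actual API.

```scala
// Rows modeled as maps from field name to value (illustrative, not ignition's types).
type Row = Map[String, Any]
type Condition = Row => Boolean

// Combinators for building compound boolean conditions.
def and(a: Condition, b: Condition): Condition = r => a(r) && b(r)
def or(a: Condition, b: Condition): Condition = r => a(r) || b(r)

// Conditions that test a single named field.
def fieldEquals(name: String, value: Any): Condition =
  r => r.get(name).contains(value)
def fieldGt(name: String, value: Int): Condition =
  r => r.get(name).collect { case n: Int => n > value }.getOrElse(false)

val rows = Seq(
  Map("name" -> "a", "score" -> 10),
  Map("name" -> "b", "score" -> 3)
)
val cond = and(fieldGt("score", 5), fieldEquals("name", "a"))
val passed = rows.filter(cond)  // keeps only the row for "a"
```

The same predicate would be applied to every row of every batch in the stream.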
Invokes the embedded flow on each batch.
Performs a join of two data streams.
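Joining two keyed streams behaves per batch like an ordinary inner join by key; a minimal pure-Scala sketch (plain `Seq` standing in for the stream batches, names illustrative):

```scala
// Inner join of two keyed batches: pairs up elements sharing the same key,
// analogous to joining two keyed DStreams batch by batch.
def innerJoin[K, A, B](left: Seq[(K, A)], right: Seq[(K, B)]): Seq[(K, (A, B))] =
  for {
    (k, a)  <- left
    (k2, b) <- right
    if k == k2
  } yield (k, (a, b))

val users  = Seq(1 -> "alice", 2 -> "bob")
val orders = Seq(1 -> "book", 1 -> "pen", 3 -> "cup")
val joined = innerJoin(users, orders)  // only key 1 appears on both sides
```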
Creates a text stream from Apache Kafka.
Models state as a Java Iterable[Map[String, Any]].
Models state as a Java Map[String, Any].
Starting point for MVEL-based implementations of the update state function.
Supplies helper functions for pair streams.
Creates a stream from a static data set, passing one RDD at a time.
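Feeding a static data set into a stream one RDD at a time amounts to slicing the data into batches and emitting them in order; a rough sketch with plain `Seq` in place of RDDs (the helper name is illustrative):

```scala
// Split a static data set into fixed-size batches, one per streaming interval.
def toBatches[A](data: Seq[A], batchSize: Int): Seq[Seq[A]] =
  data.grouped(batchSize).toSeq

val batches = toBatches(1 to 7, 3)  // three batches; the last one is partial
```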
Sets or drops the ignition runtime variables.
An extension of com.ignition.frame.SparkRuntime which adds a streaming context to the mix.
Applies Spark's updateStateByKey function to produce a stream of states.
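Spark's `updateStateByKey` is driven by a function of shape `(Seq[V], Option[S]) => Option[S]`: for each key it receives the batch's new values plus the prior state and returns the next state (or `None` to drop the key). A pure sketch with a running sum as the state:

```scala
// State update function: running sum per key. Returning None removes the
// key's state; this particular update logic is an illustrative example.
def updateSum(newValues: Seq[Int], state: Option[Int]): Option[Int] =
  if (newValues.isEmpty && state.isEmpty) None
  else Some(newValues.sum + state.getOrElse(0))

// Simulate two consecutive batches for a single key.
val afterBatch1 = updateSum(Seq(1, 2, 3), None)
val afterBatch2 = updateSum(Seq(4), afterBatch1)
```

Spark invokes this function once per key per batch, threading each key's state forward across batches.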
Stream Flow represents a DStream workflow.
Base trait for all stream flow events.
Listener which will be notified on stream flow events.
Workflow step that emits a DataStream as its output.
Encapsulates the details about the processed batch.
Listener which will be notified on each processed stream batch.
Sliding time window.
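A sliding window combines the last w batches and advances s batches at a time; Spark's `DStream.window` does this in units of time. A pure-Scala sketch measured in batch counts (function name illustrative):

```scala
// Each emitted window flattens the last `w` batches, sliding by `s` batches.
def slidingWindows[A](batches: Seq[Seq[A]], w: Int, s: Int): Seq[Seq[A]] =
  (w to batches.length by s).map(end => batches.slice(end - w, end).flatten)

val input   = Seq(Seq(1), Seq(2), Seq(3), Seq(4))
val windows = slidingWindows(input, 2, 1)  // overlapping pairs of batches
```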
Filter companion object.
Transform companion object.
Join companion object.
Kafka Input companion object.
The entry point for starting ignition stream flows.
MVEL update state companion object.
Queue Input companion object.
SetVariables companion object.
StreamFlow companion object.
Creates StreamStep instances from XML and JSON.
Provides SubFlow common methods.
Sliding window companion object.
Converts this RDD into a DataFrame using the schema of the first row, then applies the DataFrame transformation function and returns the resulting RDD.
Transforms this data stream using asDF function for each RDD.
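The "schema of the first row" idea can be illustrated without Spark: take the first element's field names and value types as the schema for the whole collection. Everything below is plain Scala standing in for the RDD/DataFrame machinery; the helper name is hypothetical.

```scala
// Derive a (field name, type name) schema from the first row of a collection,
// mirroring how an RDD's first row can drive DataFrame schema inference.
def schemaOf(row: Map[String, Any]): Seq[(String, String)] =
  row.toSeq.map { case (name, value) => name -> value.getClass.getSimpleName }

val rows   = Seq(Map("id" -> 1, "name" -> "a"), Map("id" -> 2, "name" -> "b"))
val schema = schemaOf(rows.head)  // field names and boxed Java type names
```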
Data types, implicits, aliases for DStream-based workflows.