flink-scala-api
acceptPartialFunctions extends the original DataStream with uniquely named methods (e.g. mapWith delegating to map) that delegate to the core higher-order functions, working around the fact that overloaded methods taking functions as parameters cannot also accept partial functions. This makes it possible to apply pattern matching directly to decompose inputs such as tuples, case classes and collections.
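To make the limitation concrete, here is a minimal sketch (the object and stream names are illustrative, not part of the library): the plain map is overloaded to take either a Scala function or a MapFunction, so the compiler cannot assign an expected type to a partial-function literal, whereas the uniquely named mapWith can:

import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.api.scala.extensions._

object Motivation {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    val words: DataStream[(String, Int)] = env.fromElements("a" -> 1, "b" -> 2)

    // Does not compile: `map` is overloaded (Scala function vs. MapFunction),
    // so the partial-function literal has no unambiguous expected type.
    // words.map { case (word, count) => s"$word=$count" }

    // Compiles: `mapWith` is uniquely named, so the expected type is clear.
    words.mapWith { case (word, count) => s"$word=$count" }.print()

    env.execute("mapWith sketch")
  }
}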
The following small example showcases how these extensions work on a Flink data stream:
object Main {
  import org.apache.flink.streaming.api.scala._
  import org.apache.flink.streaming.api.scala.extensions._

  case class Point(x: Double, y: Double)

  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    val ds = env.fromElements(Point(1, 2), Point(3, 4), Point(5, 6))
    ds.filterWith {
      case Point(x, _) => x > 1
    }.keyingBy {
      // reduce is only defined on keyed streams, so key the points first
      case Point(x, _) => x
    }.reduceWith {
      // sum the coordinates of two points per key
      case (Point(x1, y1), Point(x2, y2)) => Point(x1 + x2, y1 + y2)
    }.mapWith {
      // decompose a case class into a tuple
      case Point(x, y) => (x, y)
    }.flatMapWith {
      // decompose a tuple into labeled pairs
      case (x, y) => Seq('x' -> x, 'y' -> y)
    }.keyingBy {
      case (id, _) => id
    }
  }
}
The extension consists of several implicit conversions over all the data stream representations that can benefit from this feature. To use this set of extension methods, the user has to explicitly opt in by importing org.apache.flink.streaming.api.scala.extensions.acceptPartialFunctions.
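As a minimal sketch of this opt-in (the Reading case class, object name and job name are illustrative, not part of the library), the explicit import makes mapWith and the related methods available without the wildcard extensions import:

import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.api.scala.extensions.acceptPartialFunctions

object OptIn {
  case class Reading(sensor: String, value: Double)

  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    env.fromElements(Reading("a", 1.0), Reading("b", 2.0))
      // `mapWith` is available through the explicitly imported conversion.
      .mapWith { case Reading(sensor, value) => sensor -> value }
      .print()
    env.execute("opt-in sketch")
  }
}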
For more information and usage examples, please consult the official Apache Flink documentation.