object Flux

Defined in package reactor.core.scala.publisher. Companion object of class Flux.

Linear Supertypes: AnyRef, Any

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  5. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  6. def combineLatest[T, V](sources: Iterable[Publisher[T]], prefetch: Int, combinator: (Array[AnyRef]) ⇒ V): Flux[V]

    Build a Flux whose data are generated by the combination of the most recent published values from all publishers.

    T

    The common base type of the source sequences

    V

    The produced output after transformation by the given combinator

    sources

    The list of upstream Publisher to subscribe to.

    prefetch

    demand produced to each combined source Publisher

    combinator

    The aggregate function that will receive a unique value from each upstream and return the value to signal downstream

    returns

    a Flux based on the produced value

  7. def combineLatest[T, V](sources: Iterable[Publisher[T]], combinator: (Array[AnyRef]) ⇒ V): Flux[V]

    Build a Flux whose data are generated by the combination of the most recent published values from all publishers.

    T

    The common base type of the source sequences

    V

    The produced output after transformation by the given combinator

    sources

    The list of upstream Publisher to subscribe to.

    combinator

    The aggregate function that will receive a unique value from each upstream and return the value to signal downstream

    returns

    a Flux based on the produced value

  8. def combineLatest[T1, T2, T3, T4, T5, T6, V](source1: Publisher[_ <: T1], source2: Publisher[_ <: T2], source3: Publisher[_ <: T3], source4: Publisher[_ <: T4], source5: Publisher[_ <: T5], source6: Publisher[_ <: T6], combinator: (Array[AnyRef]) ⇒ V): Flux[V]

    Build a Flux whose data are generated by the combination of the most recent published values from all publishers.

    T1

    type of the value from source1

    T2

    type of the value from source2

    T3

    type of the value from source3

    T4

    type of the value from source4

    T5

    type of the value from source5

    T6

    type of the value from source6

    V

    The produced output after transformation by the given combinator

    source1

    The first upstream Publisher to subscribe to.

    source2

    The second upstream Publisher to subscribe to.

    source3

    The third upstream Publisher to subscribe to.

    source4

    The fourth upstream Publisher to subscribe to.

    source5

    The fifth upstream Publisher to subscribe to.

    source6

    The sixth upstream Publisher to subscribe to.

    combinator

    The aggregate function that will receive a unique value from each upstream and return the value to signal downstream

    returns

    a Flux based on the produced value

  9. def combineLatest[T1, T2, T3, T4, T5, V](source1: Publisher[_ <: T1], source2: Publisher[_ <: T2], source3: Publisher[_ <: T3], source4: Publisher[_ <: T4], source5: Publisher[_ <: T5], combinator: (Array[AnyRef]) ⇒ V): Flux[V]

    Build a Flux whose data are generated by the combination of the most recent published values from all publishers.

    T1

    type of the value from source1

    T2

    type of the value from source2

    T3

    type of the value from source3

    T4

    type of the value from source4

    T5

    type of the value from source5

    V

    The produced output after transformation by the given combinator

    source1

    The first upstream Publisher to subscribe to.

    source2

    The second upstream Publisher to subscribe to.

    source3

    The third upstream Publisher to subscribe to.

    source4

    The fourth upstream Publisher to subscribe to.

    source5

    The fifth upstream Publisher to subscribe to.

    combinator

    The aggregate function that will receive a unique value from each upstream and return the value to signal downstream

    returns

    a Flux based on the produced value

  10. def combineLatest[T1, T2, T3, T4, V](source1: Publisher[_ <: T1], source2: Publisher[_ <: T2], source3: Publisher[_ <: T3], source4: Publisher[_ <: T4], combinator: (Array[AnyRef]) ⇒ V): Flux[V]

    Build a Flux whose data are generated by the combination of the most recent published values from all publishers.

    T1

    type of the value from source1

    T2

    type of the value from source2

    T3

    type of the value from source3

    T4

    type of the value from source4

    V

    The produced output after transformation by the given combinator

    source1

    The first upstream Publisher to subscribe to.

    source2

    The second upstream Publisher to subscribe to.

    source3

    The third upstream Publisher to subscribe to.

    source4

    The fourth upstream Publisher to subscribe to.

    combinator

    The aggregate function that will receive a unique value from each upstream and return the value to signal downstream

    returns

    a Flux based on the produced value

  11. def combineLatest[T1, T2, T3, V](source1: Publisher[_ <: T1], source2: Publisher[_ <: T2], source3: Publisher[_ <: T3], combinator: (Array[AnyRef]) ⇒ V): Flux[V]

    Build a Flux whose data are generated by the combination of the most recent published values from all publishers.

    T1

    type of the value from source1

    T2

    type of the value from source2

    T3

    type of the value from source3

    V

    The produced output after transformation by the given combinator

    source1

    The first upstream Publisher to subscribe to.

    source2

    The second upstream Publisher to subscribe to.

    source3

    The third upstream Publisher to subscribe to.

    combinator

    The aggregate function that will receive a unique value from each upstream and return the value to signal downstream

    returns

    a Flux based on the produced value

  12. def combineLatest[T1, T2, V](source1: Publisher[_ <: T1], source2: Publisher[_ <: T2], combinator: (T1, T2) ⇒ V): Flux[V]

    Build a Flux whose data are generated by the combination of the most recent published values from all publishers.

    T1

    type of the value from source1

    T2

    type of the value from source2

    V

    The produced output after transformation by the given combinator

    source1

    The first upstream Publisher to subscribe to.

    source2

    The second upstream Publisher to subscribe to.

    combinator

    The aggregate function that will receive a unique value from each upstream and return the value to signal downstream

    returns

    a Flux based on the produced value
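
    As a quick sketch of the two-source overload above (assuming reactor-scala-extensions on the classpath; the sensor sources are illustrative, not part of this API):

```scala
import reactor.core.scala.publisher.Flux

// Each time either source emits, the combinator receives the latest value
// seen from both sources and produces one output element.
val temperatures = Flux.just(21.0, 22.5)
val humidities   = Flux.just(40, 45)

val readings: Flux[String] =
  Flux.combineLatest[Double, Int, String](temperatures, humidities,
    (t, h) => s"$t C / $h %")

readings.subscribe(r => println(r))
```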

  13. def combineLatest[T, V](combinator: (Array[AnyRef]) ⇒ V, prefetch: Int, sources: Publisher[_ <: T]*): Flux[V]

    Build a Flux whose data are generated by the combination of the most recent published values from all publishers.

    T

    type of the value from sources

    V

    The produced output after transformation by the given combinator

    combinator

    The aggregate function that will receive a unique value from each upstream and return the value to signal downstream

    prefetch

    demand produced to each combined source Publisher

    sources

    The upstream Publishers to subscribe to.

    returns

    a Flux based on the produced combinations

  14. def combineLatest[T, V](combinator: (Array[AnyRef]) ⇒ V, sources: Publisher[_ <: T]*): Flux[V]

    Build a Flux whose data are generated by the combination of the most recent published values from all publishers.

    T

    type of the value from sources

    V

    The produced output after transformation by the given combinator

    combinator

    The aggregate function that will receive a unique value from each upstream and return the value to signal downstream

    sources

    The upstream Publishers to subscribe to.

    returns

    a Flux based on the produced combinations

  15. def concat[T](sources: Publisher[T]*): Flux[T]

    Concat all sources pulled from the given Publisher array. A complete signal from each source will delimit the individual sequences and will be eventually passed to the returned Publisher.

    T

    The source type of the data sequence

    sources

    The array of Publisher to concat

    returns

    a new Flux concatenating all source sequences
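
    A minimal sketch of the varargs overload above (assuming reactor-scala-extensions on the classpath):

```scala
import reactor.core.scala.publisher.Flux

// The second source is only subscribed to after the first completes,
// so the output preserves source order: 1, 2, 3, 4.
Flux.concat(Flux.just(1, 2), Flux.just(3, 4))
  .subscribe(i => println(i))
```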

  16. def concat[T](sources: Publisher[Publisher[T]], prefetch: Int): Flux[T]

    Concat all sources emitted as an onNext signal from a parent Publisher. A complete signal from each source will delimit the individual sequences and will be eventually passed to the returned Publisher which will stop listening if the main sequence has also completed.

    T

    The source type of the data sequence

    sources

    The Publisher of Publisher to concat

    prefetch

    the inner source request size

    returns

    a new Flux concatenating all inner sources sequences until complete or error

  17. def concat[T](sources: Publisher[Publisher[T]]): Flux[T]

    Concat all sources emitted as an onNext signal from a parent Publisher. A complete signal from each source will delimit the individual sequences and will be eventually passed to the returned Publisher which will stop listening if the main sequence has also completed.

    T

    The source type of the data sequence

    sources

    The Publisher of Publisher to concat

    returns

    a new Flux concatenating all inner sources sequences until complete or error

  18. def concat[T](sources: Iterable[Publisher[T]]): Flux[T]

    Concat all sources, pulled on Publisher.subscribe from an Iterator obtained from the passed Iterable, until Iterator.hasNext returns false. A complete signal from each source will delimit the individual sequences and will be eventually passed to the returned Publisher.

    T

    The source type of the data sequence

    sources

    The Iterable of Publisher to concat

    returns

    a new Flux concatenating all source sequences

  19. def concatDelayError[T](sources: Publisher[T]*): Flux[T]

    Concat all sources pulled from the given Publisher array. A complete signal from each source will delimit the individual sequences and will be eventually passed to the returned Publisher. Any error will be delayed until all sources have been concatenated.

    T

    The source type of the data sequence

    sources

    The array of Publisher to concat

    returns

    a new Flux concatenating all source sequences

  20. def concatDelayError[T](sources: Publisher[Publisher[T]], delayUntilEnd: Boolean, prefetch: Int): Flux[T]

    Concat all sources emitted as an onNext signal from a parent Publisher. A complete signal from each source will delimit the individual sequences and will be eventually passed to the returned Publisher which will stop listening if the main sequence has also completed.

    Errors will be delayed after the current concat backlog if delayUntilEnd is false or after all sources if delayUntilEnd is true.

    T

    The source type of the data sequence

    sources

    The Publisher of Publisher to concat

    delayUntilEnd

    delay error until all sources have been consumed instead of after the current source

    prefetch

    the inner source request size

    returns

    a new Flux concatenating all inner sources sequences until complete or error

  21. def concatDelayError[T](sources: Publisher[Publisher[T]], prefetch: Int): Flux[T]

    Concat all sources emitted as an onNext signal from a parent Publisher. A complete signal from each source will delimit the individual sequences and will be eventually passed to the returned Publisher which will stop listening if the main sequence has also completed.

    T

    The source type of the data sequence

    sources

    The Publisher of Publisher to concat

    prefetch

    the inner source request size

    returns

    a new Flux concatenating all inner sources sequences until complete or error

  22. def concatDelayError[T](sources: Publisher[Publisher[T]]): Flux[T]

    Concat all sources emitted as an onNext signal from a parent Publisher. A complete signal from each source will delimit the individual sequences and will be eventually passed to the returned Publisher which will stop listening if the main sequence has also completed.

    T

    The source type of the data sequence

    sources

    The Publisher of Publisher to concat

    returns

    a new Flux concatenating all inner sources sequences until complete or error

  23. def create[T](emitter: (FluxSink[T]) ⇒ Unit, backpressure: OverflowStrategy): Flux[T]

    Creates a Flux with multi-emission capabilities (synchronous or asynchronous) through the FluxSink API.

    This Flux factory is useful if one wants to adapt some other multi-valued async API and not worry about cancellation and backpressure. For example:

    
    Flux.create[String](emitter => {
      val al: ActionListener = e => emitter.next(textField.getText())

      // without cancellation support:
      button.addActionListener(al)

      // with cancellation support:
      button.addActionListener(al)
      emitter.setCancellation(() => button.removeListener(al))
    }, FluxSink.OverflowStrategy.LATEST)
    
    

    T

    the value type

    emitter

    the consumer that will receive a FluxSink for each individual Subscriber.

    backpressure

    the backpressure mode; see OverflowStrategy for the available backpressure modes

    returns

    a Flux

  24. def create[T](emitter: (FluxSink[T]) ⇒ Unit): Flux[T]

    Creates a Flux with multi-emission capabilities (synchronous or asynchronous) through the FluxSink API.

    This Flux factory is useful if one wants to adapt some other multi-valued async API and not worry about cancellation and backpressure. For example:

    Handles backpressure by buffering all signals if the downstream can't keep up.

    
    Flux.create[String](emitter => {
      val al: ActionListener = e => emitter.next(textField.getText())

      // without cancellation support:
      button.addActionListener(al)

      // with cancellation support:
      button.addActionListener(al)
      emitter.setCancellation(() => button.removeListener(al))
    })
    
    

    T

    the value type

    emitter

    the consumer that will receive a FluxSink for each individual Subscriber.

    returns

    a Flux

  25. def defer[T](supplier: () ⇒ Publisher[T]): Flux[T]

    Supply a Publisher every time subscribe is called on the returned Flux. The passed scala.Function0[Publisher[T]] will be invoked, and it is up to the developer to choose to return a new instance of a Publisher or reuse one, effectively behaving like Flux.from.

    T

    the type of values passing through the Flux

    supplier

    the Publisher Supplier to call on subscribe

    returns

    a deferred Flux
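
    To illustrate the deferred evaluation (a sketch; System.nanoTime stands in for any per-subscription side effect):

```scala
import reactor.core.scala.publisher.Flux

// The supplier runs once per subscribe, so each subscriber observes a
// fresh timestamp; Flux.just(System.nanoTime()) would capture only one.
val deferred = Flux.defer(() => Flux.just(System.nanoTime()))

deferred.subscribe(t => println(s"first subscriber:  $t"))
deferred.subscribe(t => println(s"second subscriber: $t"))
```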

  26. def empty[T]: Flux[T]

    Create a Flux that completes without emitting any item.

    T

    the reified type of the target Subscriber

    returns

    an empty Flux

  27. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  28. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  29. def error[O](throwable: Throwable, whenRequested: Boolean): Flux[O]

    Build a Flux that will only emit an error signal to any new subscriber.

    O

    the output type

    throwable

    the error to signal to each Subscriber

    whenRequested

    if true, will onError on the first request instead of subscribe().

    returns

    a new failed Flux

  30. def error[T](error: Throwable): Flux[T]

    Create a Flux that completes with the specified error.

    T

    the reified type of the target Subscriber

    error

    the error to signal to each Subscriber

    returns

    a new failed Flux
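
    A short sketch of the error factory (assuming a subscribe overload that also takes an error consumer, mirroring reactor-core):

```scala
import reactor.core.scala.publisher.Flux

// No value is ever emitted; every subscriber receives only onError.
val failed = Flux.error[Int](new IllegalStateException("boom"))

failed.subscribe(
  i => println(s"next: $i"),          // never called
  e => println(s"error: ${e.getMessage}"))
```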

  31. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  32. def firstEmitting[I](sources: Iterable[Publisher[_ <: I]]): Flux[I]

    Select the fastest source: the one that wins the "ambiguous" race by being the first to emit an onNext, onComplete or onError signal.

    I

    The source type of the data sequence

    sources

    The competing source publishers

    returns

    a new Flux eventually subscribed to one of the sources, or empty

  33. def firstEmitting[I](sources: Publisher[_ <: I]*): Flux[I]

    Select the fastest source: the one that is the first to emit an onNext, onComplete or onError signal.

    I

    The source type of the data sequence

    sources

    The competing source publishers

    returns

    a new Flux eventually subscribed to one of the sources, or empty
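
    A sketch of the race (Flux.never, assumed here to mirror reactor-core's factory for a source that never signals, is not shown in this excerpt):

```scala
import reactor.core.scala.publisher.Flux

// The ready sequence signals first, so it wins the race and the
// never-signalling source is cancelled.
val winner = Flux.firstEmitting(Flux.just("a", "b"), Flux.never[String])
winner.subscribe(s => println(s))
```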

  34. def from[T](source: Publisher[_ <: T]): Flux[T]

    Expose the specified Publisher with the Flux API.

    T

    the source sequence type

    source

    the source to decorate

    returns

    a new Flux

  35. def fromArray[T <: AnyRef](array: Array[T]): Flux[T]

    Create a Flux that emits the items contained in the provided scala.Array.

    T

    the Publisher type to stream

    array

    the array to read data from

    returns

    a new Flux

  36. def fromIterable[T](it: Iterable[T]): Flux[T]

    Create a Flux that emits the items contained in the provided Iterable. A new iterator will be created for each subscriber.

    T

    the Iterable type to stream

    it

    the Iterable to read data from

    returns

    a new Flux
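
    For example (a sketch):

```scala
import reactor.core.scala.publisher.Flux

// A fresh iterator is created per subscriber, so the same Flux can be
// safely subscribed to more than once.
val letters = Flux.fromIterable(Seq("a", "b", "c"))
letters.subscribe(l => println(l))
letters.subscribe(l => println(l))  // replays from a new iterator
```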

  37. def fromStream[T](s: Stream[T]): Flux[T]

    Create a Flux that emits the items contained in the provided Stream. A new iterator will be created for each subscriber.

    T

    the Stream type to flux

    s

    the Stream to read data from

    returns

    a new Flux

  38. def generate[T, S](stateSupplier: Option[Callable[S]], generator: (S, SynchronousSink[T]) ⇒ S, stateConsumer: (Option[S]) ⇒ Unit): Flux[T]

    Generate signals one-by-one via a function callback.

    T

    the value type emitted

    S

    the custom state per subscriber

    stateSupplier

    called for each incoming Supplier to provide the initial state for the generator bifunction

    generator

    the bifunction called with the current state, the SynchronousSink API instance and is expected to return a (new) state.

    stateConsumer

    called after the generator has terminated or the downstream cancelled, receiving the last state to be handled (i.e., release resources or do other cleanup).

    returns

    a Reactive Flux publisher ready to be subscribed

  39. def generate[T, S](stateSupplier: Option[Callable[S]], generator: (S, SynchronousSink[T]) ⇒ S): Flux[T]

    Generate signals one-by-one via a function callback.

    T

    the value type emitted

    S

    the custom state per subscriber

    stateSupplier

    called for each incoming Supplier to provide the initial state for the generator bifunction

    generator

    the bifunction called with the current state, the SynchronousSink API instance and is expected to return a (new) state.

    returns

    a Reactive Flux publisher ready to be subscribed

  40. def generate[T](generator: (SynchronousSink[T]) ⇒ Unit): Flux[T]

    Generate signals one-by-one via a consumer callback.

    T

    the value type emitted

    generator

    the consumer called with the SynchronousSink API instance

    returns

    a Reactive Flux publisher ready to be subscribed
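
    The stateful overloads above can be sketched with a Fibonacci generator (assuming the usual take operator on Flux; the per-subscriber state is a pair of Longs):

```scala
import java.util.concurrent.Callable
import reactor.core.scala.publisher.Flux

// The generator may emit at most one value per invocation via sink.next
// and returns the state to use on the next invocation.
val fibonacci = Flux.generate[Long, (Long, Long)](
  Option(new Callable[(Long, Long)] { def call(): (Long, Long) = (0L, 1L) }),
  (state, sink) => {
    val (a, b) = state
    sink.next(a)
    (b, a + b)
  })

fibonacci.take(8).subscribe(n => println(n))  // 0 1 1 2 3 5 8 13
```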

  41. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  42. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  43. def interval(delay: Duration, period: Duration, timer: Scheduler): Flux[Long]

    Create a new Flux that emits an ever incrementing long starting with 0 every N period of time unit on the given timer. If demand is not produced in time, an onError will be signalled. The Flux will never complete.

    delay

    the timespan to wait before emitting 0L

    period

    the period before each following increment

    timer

    the Scheduler to schedule on

    returns

    a new timed Flux

  44. def interval(period: Duration, timer: Scheduler): Flux[Long]

    Create a new Flux that emits an ever incrementing long starting with 0 every N milliseconds on the given timer. If demand is not produced in time, an onError will be signalled. The Flux will never complete.

    period

    The duration to wait before the next increment

    timer

    a Scheduler instance

    returns

    a new timed Flux

  45. def interval(delay: Duration, period: Duration): Flux[Long]

    Create a new Flux that emits an ever incrementing long starting with 0 every N period of time unit on a global timer. If demand is not produced in time, an onError will be signalled. The Flux will never complete.

    delay

    the delay to wait before emitting 0L

    period

    the period before each following increment

    returns

    a new timed Flux

  46. def interval(period: Duration): Flux[Long]

    Create a new Flux that emits an ever incrementing long starting with 0 every period on the global timer. If demand is not produced in time, an onError will be signalled. The Flux will never complete.

    period

    The duration to wait before the next increment

    returns

    a new timed Flux
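
    A timed sketch (the sleep keeps the JVM alive long enough to observe the ticks, which are emitted on the timer thread; take is assumed available as usual):

```scala
import scala.concurrent.duration._
import reactor.core.scala.publisher.Flux

// Emits 0, 1, 2, ... every 100 ms; take(3) bounds the infinite sequence.
Flux.interval(100.milliseconds).take(3).subscribe(i => println(i))

Thread.sleep(500)
```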

  47. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  48. def just[T](firstData: T, data: T*): Flux[T]

    Create a new Flux that emits the specified items and then completes.

    T

    the emitted data type

    firstData

    the first data object to emit

    data

    the consecutive data objects to emit

    returns

    a new Flux
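
    For instance (a sketch):

```scala
import reactor.core.scala.publisher.Flux

// Emits the three items in order, then completes.
Flux.just("one", "two", "three").subscribe(s => println(s))
```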

  49. def merge[I](prefetch: Int, sources: Publisher[_ <: I]*): Flux[I]

    Merge emitted Publisher sequences from the passed Publisher array into an interleaved merged sequence.

    I

    The source type of the data sequence

    prefetch

    the inner source request size

    sources

    the Publisher array to iterate on Publisher.subscribe

    returns

    a fresh Reactive Flux publisher ready to be subscribed

  50. def merge[I](sources: Publisher[_ <: I]*): Flux[I]

    Merge emitted Publisher sequences from the passed Publisher array into an interleaved merged sequence.

    I

    The source type of the data sequence

    sources

    the Publisher array to iterate on Publisher.subscribe

    returns

    a fresh Reactive Flux publisher ready to be subscribed

  51. def merge[I](sources: Iterable[Publisher[_ <: I]]): Flux[I]

    Merge emitted Publisher sequences from the passed Publisher into an interleaved merged sequence. Iterable.iterator() will be called for each Publisher.subscribe.

    I

    The source type of the data sequence

    sources

    the scala.Iterable to lazily iterate on Publisher.subscribe

    returns

    a fresh Reactive Flux publisher ready to be subscribed

  52. def merge[T](source: Publisher[Publisher[_ <: T]], concurrency: Int, prefetch: Int): Flux[T]

    Merge emitted Publisher sequences by the passed Publisher into an interleaved merged sequence.

    T

    the merged type

    source

    a Publisher of Publisher sequence to merge

    concurrency

    the request produced to the main source thus limiting concurrent merge backlog

    prefetch

    the inner source request size

    returns

    a merged Flux

  53. def merge[T](source: Publisher[Publisher[_ <: T]], concurrency: Int): Flux[T]

    Merge emitted Publisher sequences by the passed Publisher into an interleaved merged sequence.

    T

    the merged type

    source

    a Publisher of Publisher sequence to merge

    concurrency

    the request produced to the main source thus limiting concurrent merge backlog

    returns

    a merged Flux

  54. def merge[T](source: Publisher[Publisher[_ <: T]]): Flux[T]

    Merge emitted Publisher sequences by the passed Publisher into an interleaved merged sequence.

    T

    the merged type

    source

    a Publisher of Publisher sequence to merge

    returns

    a merged Flux
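
    Contrasting merge with concat (a sketch; with synchronous sources the interleaving is not guaranteed to be visible, so no particular output is asserted):

```scala
import reactor.core.scala.publisher.Flux

// merge subscribes to all sources at once and relays values as they
// arrive; ordering across sources is therefore arrival order, not
// source order as with concat.
val merged = Flux.merge(Flux.just(1, 3), Flux.just(2, 4))
merged.subscribe(i => println(i))
```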

  55. def mergeDelayError[I](prefetch: Int, sources: Publisher[_ <: I]*): Flux[I]

    Merge emitted Publisher sequences from the passed Publisher array into an interleaved merged sequence, delaying any error until the remaining sources have been consumed.

    I

    The source type of the data sequence

    prefetch

    the inner source request size

    sources

    the Publisher array to iterate on Publisher.subscribe

    returns

    a fresh Reactive Flux publisher ready to be subscribed

  56. def mergeSequential[I](sources: Iterable[Publisher[_ <: I]], maxConcurrency: Int, prefetch: Int): Flux[I]

    Merge Publisher sequences from an Iterable into an ordered merged sequence. Unlike concat, the inner publishers are subscribed to eagerly. Unlike merge, their emitted values are merged into the final sequence in subscription order.

    I

    the merged type

    sources

    an Iterable of Publisher sequences to merge

    maxConcurrency

    the request produced to the main source thus limiting concurrent merge backlog

    prefetch

    the inner source request size

    returns

    a merged Flux, subscribing early but keeping the original ordering

  57. def mergeSequential[I](sources: Iterable[Publisher[_ <: I]]): Flux[I]

    Merge Publisher sequences from an Iterable into an ordered merged sequence. Unlike concat, the inner publishers are subscribed to eagerly. Unlike merge, their emitted values are merged into the final sequence in subscription order.

    I

    the merged type

    sources

    an Iterable of Publisher sequences to merge

    returns

    a merged Flux, subscribing early but keeping the original ordering

  58. def mergeSequential[I](prefetch: Int, sources: Publisher[_ <: I]*): Flux[I]

    Merge a number of Publisher sequences into an ordered merged sequence. Unlike concat, the inner publishers are subscribed to eagerly. Unlike merge, their emitted values are merged into the final sequence in subscription order.

    I

    the merged type

    prefetch

    the inner source request size

    sources

    a number of Publisher sequences to merge

    returns

    a merged Flux, subscribing early but keeping the original ordering

  59. def mergeSequential[I](sources: Publisher[_ <: I]*): Flux[I]

    Permalink

    Merge a number of Publisher sequences into an ordered merged sequence.

    Merge a number of Publisher sequences into an ordered merged sequence. Unlike concat, the inner publishers are subscribed to eagerly. Unlike merge, their emitted values are merged into the final sequence in subscription order.

    I

    the merged type

    sources

    a number of Publisher sequences to merge

    returns

    a merged Flux

  60. def mergeSequential[T](sources: Publisher[_ <: Publisher[_ <: T]], maxConcurrency: Int, prefetch: Int): Flux[T]

    Permalink

    Merge emitted Publisher sequences by the passed Publisher into an ordered merged sequence.

    Merge emitted Publisher sequences by the passed Publisher into an ordered merged sequence. Unlike concat, the inner publishers are subscribed to eagerly. Unlike merge, their emitted values are merged into the final sequence in subscription order.

    T

    the merged type

    sources

    a Publisher of Publisher sequence to merge

    maxConcurrency

    the request produced to the main source thus limiting concurrent merge backlog

    prefetch

    the inner source request size

    returns

    a merged Flux

  61. def mergeSequential[T](sources: Publisher[Publisher[T]]): Flux[T]

    Permalink

    Merge emitted Publisher sequences by the passed Publisher into an ordered merged sequence.

    Merge emitted Publisher sequences by the passed Publisher into an ordered merged sequence. Unlike concat, the inner publishers are subscribed to eagerly. Unlike merge, their emitted values are merged into the final sequence in subscription order.

    T

    the merged type

    sources

    a Publisher of Publisher sequence to merge

    returns

    a merged Flux

  62. def mergeSequentialDelayError[I](sources: Iterable[Publisher[_ <: I]], maxConcurrency: Int, prefetch: Int): Flux[I]

    Permalink

    Merge Publisher sequences from an Iterable into an ordered merged sequence.

    Merge Publisher sequences from an Iterable into an ordered merged sequence. Unlike concat, the inner publishers are subscribed to eagerly. Unlike merge, their emitted values are merged into the final sequence in subscription order. This variant will delay any error until after the rest of the mergeSequential backlog has been processed.

    I

    the merged type

    sources

    an Iterable of Publisher sequences to merge

    maxConcurrency

    the request produced to the main source thus limiting concurrent merge backlog

    prefetch

    the inner source request size

    returns

    a merged Flux

  63. def mergeSequentialDelayError[I](prefetch: Int, sources: Publisher[_ <: I]*): Flux[I]

    Permalink

    Merge a number of Publisher sequences into an ordered merged sequence.

    Merge a number of Publisher sequences into an ordered merged sequence. Unlike concat, the inner publishers are subscribed to eagerly. Unlike merge, their emitted values are merged into the final sequence in subscription order. This variant will delay any error until after the rest of the mergeSequential backlog has been processed.

    I

    the merged type

    prefetch

    the inner source request size

    sources

    a number of Publisher sequences to merge

    returns

    a merged Flux

  64. def mergeSequentialDelayError[T](sources: Publisher[_ <: Publisher[_ <: T]], maxConcurrency: Int, prefetch: Int): Flux[T]

    Permalink

    Merge emitted Publisher sequences by the passed Publisher into an ordered merged sequence.

    Merge emitted Publisher sequences by the passed Publisher into an ordered merged sequence. Unlike concat, the inner publishers are subscribed to eagerly. Unlike merge, their emitted values are merged into the final sequence in subscription order. This variant will delay any error until after the rest of the mergeSequential backlog has been processed.

    T

    the merged type

    sources

    a Publisher of Publisher sequence to merge

    maxConcurrency

    the request produced to the main source thus limiting concurrent merge backlog

    prefetch

    the inner source request size

    returns

    a merged Flux, subscribing early but keeping the original ordering

  65. final def ne(arg0: AnyRef): Boolean

    Permalink
    Definition Classes
    AnyRef
  66. def never[T](): Flux[T]

    Permalink

    Create a Flux that will never signal any data, error or completion signal.

    Create a Flux that will never signal any data, error or completion signal.

    T

    the Subscriber type target

    returns

    a never completing Flux

  67. final def notify(): Unit

    Permalink
    Definition Classes
    AnyRef
  68. final def notifyAll(): Unit

    Permalink
    Definition Classes
    AnyRef
  69. def push[T](emitter: (FluxSink[T]) ⇒ Unit, backpressure: OverflowStrategy): Flux[T]

    Permalink

    Creates a Flux with multi-emission capabilities from a single-threaded producer through the FluxSink API.

    Creates a Flux with multi-emission capabilities from a single-threaded producer through the FluxSink API.

    This Flux factory is useful if one wants to adapt some other single-threaded multi-valued async API and not worry about cancellation and backpressure. For example:

    
    Flux.push[String](emitter => {
      val al: ActionListener = e => emitter.next(textField.getText())

      // without cleanup support:
      button.addActionListener(al)

      // with cleanup support:
      button.addActionListener(al)
      emitter.onDispose(() => button.removeActionListener(al))
    }, FluxSink.OverflowStrategy.LATEST)
    
    

    T

    the value type

    emitter

    the consumer that will receive a FluxSink for each individual Subscriber.

    backpressure

    the backpressure mode, see OverflowStrategy for the available backpressure modes

    returns

    a Flux

  70. def push[T](emitter: (FluxSink[T]) ⇒ Unit): Flux[T]

    Permalink

    Creates a Flux with multi-emission capabilities from a single-threaded producer through the FluxSink API.

    Creates a Flux with multi-emission capabilities from a single-threaded producer through the FluxSink API.

    This Flux factory is useful if one wants to adapt some other single-threaded multi-valued async API and not worry about cancellation and backpressure. For example:

    
    Flux.push[String](emitter => {
      val al: ActionListener = e => emitter.next(textField.getText())

      // without cleanup support:
      button.addActionListener(al)

      // with cleanup support:
      button.addActionListener(al)
      emitter.onDispose(() => button.removeActionListener(al))
    }, FluxSink.OverflowStrategy.LATEST)
    
    

    T

    the value type

    emitter

    the consumer that will receive a FluxSink for each individual Subscriber.

    returns

    a Flux
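    The push contract can also be modelled without Reactor: an emitter function receives a sink and pushes values into it. The `Sink` trait and `pushModel` helper below are hypothetical stand-ins for FluxSink and Flux.push, collecting values into a buffer instead of signalling subscribers, just to illustrate the calling pattern:

    ```scala
    import scala.collection.mutable.ListBuffer

    // Hypothetical, minimal stand-ins for FluxSink and Flux.push.
    trait Sink[T] {
      def next(t: T): Unit
      def complete(): Unit
    }

    def pushModel[T](emitter: Sink[T] => Unit): List[T] = {
      val buf  = ListBuffer.empty[T]
      var done = false
      emitter(new Sink[T] {
        def next(t: T): Unit = if (!done) buf += t
        def complete(): Unit = done = true
      })
      buf.toList
    }

    val collected = pushModel[String] { sink =>
      sink.next("a")
      sink.next("b")
      sink.complete()
      sink.next("ignored") // signals after complete() are dropped
    }
    ```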

  71. def range(start: Int, count: Int): Flux[Integer]

    Permalink

    Build a Flux that will only emit a sequence of count incrementing integers, starting from start, then complete.

    Build a Flux that will only emit a sequence of count incrementing integers, starting from start, then complete.

    start

    the first integer to be emitted

    count

    the number of times to emit an increment, including the first value

    returns

    a ranged Flux
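    The emitted sequence can be expressed with a plain Scala `Range`; the sketch below models what the operator emits (not how it signals):

    ```scala
    // range(start, count) emits count incrementing integers beginning at
    // start; `until` gives the same exclusive upper bound.
    def rangeEmits(start: Int, count: Int): Seq[Int] =
      start until (start + count)
    ```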

  72. def switchOnNext[T](mergedPublishers: Publisher[Publisher[_ <: T]], prefetch: Int): Flux[T]

    Permalink

    Build a reactor.core.publisher.FluxProcessor whose data are emitted by the most recent emitted Publisher.

    Build a reactor.core.publisher.FluxProcessor whose data are emitted by the most recent emitted Publisher. The Flux will complete once both the source of publishers and the last Publisher switched to have completed.

    T

    the produced type

    mergedPublishers

    The Publisher of switching Publisher to subscribe to.

    prefetch

    the inner source request size

    returns

    a reactor.core.publisher.FluxProcessor accepting publishers and producing T

  73. def switchOnNext[T](mergedPublishers: Publisher[Publisher[_ <: T]]): Flux[T]

    Permalink

    Build a reactor.core.publisher.FluxProcessor whose data are emitted by the most recent emitted Publisher.

    Build a reactor.core.publisher.FluxProcessor whose data are emitted by the most recent emitted Publisher. The Flux will complete once both the source of publishers and the last Publisher switched to have completed.

    T

    the produced type

    mergedPublishers

    The Publisher of switching Publisher to subscribe to.

    returns

    a reactor.core.publisher.FluxProcessor accepting publishers and producing T

  74. final def synchronized[T0](arg0: ⇒ T0): T0

    Permalink
    Definition Classes
    AnyRef
  75. def toString(): String

    Permalink
    Definition Classes
    AnyRef → Any
  76. def using[T, D](resourceSupplier: () ⇒ D, sourceSupplier: (D) ⇒ Publisher[_ <: T], resourceCleanup: (D) ⇒ Unit, eager: Boolean): Flux[T]

    Permalink

    Uses a resource, generated by a supplier for each individual Subscriber, while streaming the values from a Publisher derived from the same resource and makes sure the resource is released if the sequence terminates or the Subscriber cancels.

    Uses a resource, generated by a supplier for each individual Subscriber, while streaming the values from a Publisher derived from the same resource and makes sure the resource is released if the sequence terminates or the Subscriber cancels.

    • Eager resource cleanup happens just before the source termination, and exceptions raised by the cleanup Consumer may override the terminal event.
    • Non-eager cleanup will drop any exception.

    T

    emitted type

    D

    resource type

    resourceSupplier

    a resource supplier function, called on subscribe

    sourceSupplier

    a Publisher factory derived from the supplied resource

    resourceCleanup

    invoked on completion

    eager

    true to clean before terminating downstream subscribers

    returns

    a new Flux

  77. def using[T, D](resourceSupplier: () ⇒ D, sourceSupplier: (D) ⇒ Publisher[_ <: T], resourceCleanup: (D) ⇒ Unit): Flux[T]

    Permalink

    Uses a resource, generated by a supplier for each individual Subscriber, while streaming the values from a Publisher derived from the same resource and makes sure the resource is released if the sequence terminates or the Subscriber cancels.

    Uses a resource, generated by a supplier for each individual Subscriber, while streaming the values from a Publisher derived from the same resource and makes sure the resource is released if the sequence terminates or the Subscriber cancels.

    Eager resource cleanup happens just before the source termination, and exceptions raised by the cleanup Consumer may override the terminal event.

    T

    emitted type

    D

    resource type

    resourceSupplier

    a resource supplier function, called on subscribe

    sourceSupplier

    a Publisher factory derived from the supplied resource

    resourceCleanup

    invoked on completion

    returns

    new Flux
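    The using contract amounts to try/finally lifted over a reactive sequence. The hypothetical synchronous model below (plain Scala, not the Reactor implementation) shows the eager flavour: acquire a resource, derive a sequence from it, and guarantee cleanup runs before the result is returned:

    ```scala
    import scala.collection.mutable.ListBuffer

    // Hypothetical synchronous model of the using contract: cleanup is
    // guaranteed whether the source completes normally or fails.
    def usingModel[T, D](acquire: () => D,
                         source: D => Seq[T],
                         cleanup: D => Unit): Seq[T] = {
      val resource = acquire()
      try source(resource)
      finally cleanup(resource)
    }

    val log = ListBuffer.empty[String]
    val out = usingModel[Int, String](
      () => { log += "open"; "res" },
      _ => Seq(1, 2, 3),
      _ => { log += "close"; () }
    )
    ```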

  78. final def wait(): Unit

    Permalink
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  79. final def wait(arg0: Long, arg1: Int): Unit

    Permalink
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  80. final def wait(arg0: Long): Unit

    Permalink
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  81. def zip[I, O](combinator: (Array[AnyRef]) ⇒ O, prefetch: Int, sources: Publisher[_ <: I]*): Flux[O]

    Permalink

    "Step-Merge" especially useful in Scatter-Gather scenarios.

    "Step-Merge" especially useful in Scatter-Gather scenarios. The operator will forward all combinations produced by the passed combinator function of the most recent items emitted by each source until any of them completes. Errors will immediately be forwarded.

    I

    the type of the input sources

    O

    the combined produced type

    combinator

    The aggregate function that will receive a unique value from each upstream and return the value to signal downstream

    prefetch

    individual source request size

    sources

    the Publisher array to iterate on Publisher.subscribe

    returns

    a zipped Flux

  82. def zip[I, O](combinator: (Array[AnyRef]) ⇒ O, sources: Publisher[_ <: I]*): Flux[O]

    Permalink

    "Step-Merge" especially useful in Scatter-Gather scenarios.

    "Step-Merge" especially useful in Scatter-Gather scenarios. The operator will forward all combinations produced by the passed combinator function of the most recent items emitted by each source until any of them completes. Errors will immediately be forwarded.

    I

    the type of the input sources

    O

    the combined produced type

    combinator

    The aggregate function that will receive a unique value from each upstream and return the value to signal downstream

    sources

    the Publisher array to iterate on Publisher.subscribe

    returns

    a zipped Flux

  83. def zip[O](sources: Iterable[_ <: Publisher[_]], prefetch: Int, combinator: (Array[_]) ⇒ O): Flux[O]

    Permalink

    "Step-Merge" especially useful in Scatter-Gather scenarios.

    "Step-Merge" especially useful in Scatter-Gather scenarios. The operator will forward all combinations produced by the passed combinator function of the most recent items emitted by each source until any of them completes. Errors will immediately be forwarded.

    The Iterable.iterator will be called on each Publisher.subscribe.

    O

    the combined produced type

    sources

    the Iterable to iterate on Publisher.subscribe

    prefetch

    the inner source request size

    combinator

    The aggregate function that will receive a unique value from each upstream and return the value to signal downstream

    returns

    a zipped Flux

  84. def zip[O](sources: Iterable[_ <: Publisher[_]], combinator: (Array[_]) ⇒ O): Flux[O]

    Permalink

    "Step-Merge" especially useful in Scatter-Gather scenarios.

    "Step-Merge" especially useful in Scatter-Gather scenarios. The operator will forward all combinations produced by the passed combinator function of the most recent items emitted by each source until any of them completes. Errors will immediately be forwarded.

    The Iterable.iterator will be called on each Publisher.subscribe.

    O

    the combined produced type

    sources

    the Iterable to iterate on Publisher.subscribe

    combinator

    The aggregate function that will receive a unique value from each upstream and return the value to signal downstream

    returns

    a zipped Flux

  85. def zip[T1, T2, T3, T4, T5, T6](source1: Publisher[_ <: T1], source2: Publisher[_ <: T2], source3: Publisher[_ <: T3], source4: Publisher[_ <: T4], source5: Publisher[_ <: T5], source6: Publisher[_ <: T6]): Flux[(T1, T2, T3, T4, T5, T6)]

    Permalink

    "Step-Merge" especially useful in Scatter-Gather scenarios.

    "Step-Merge" especially useful in Scatter-Gather scenarios. The operator will forward all combinations of the most recent items emitted by each source until any of them completes. Errors will immediately be forwarded.

    T1

    type of the value from source1

    T2

    type of the value from source2

    T3

    type of the value from source3

    T4

    type of the value from source4

    T5

    type of the value from source5

    T6

    type of the value from source6

    source1

    The first upstream Publisher to subscribe to.

    source2

    The second upstream Publisher to subscribe to.

    source3

    The third upstream Publisher to subscribe to.

    source4

    The fourth upstream Publisher to subscribe to.

    source5

    The fifth upstream Publisher to subscribe to.

    source6

    The sixth upstream Publisher to subscribe to.

    returns

    a zipped Flux

  86. def zip[T1, T2, T3, T4, T5](source1: Publisher[_ <: T1], source2: Publisher[_ <: T2], source3: Publisher[_ <: T3], source4: Publisher[_ <: T4], source5: Publisher[_ <: T5]): Flux[(T1, T2, T3, T4, T5)]

    Permalink

    "Step-Merge" especially useful in Scatter-Gather scenarios.

    "Step-Merge" especially useful in Scatter-Gather scenarios. The operator will forward all combinations of the most recent items emitted by each source until any of them completes. Errors will immediately be forwarded.

    T1

    type of the value from source1

    T2

    type of the value from source2

    T3

    type of the value from source3

    T4

    type of the value from source4

    T5

    type of the value from source5

    source1

    The first upstream Publisher to subscribe to.

    source2

    The second upstream Publisher to subscribe to.

    source3

    The third upstream Publisher to subscribe to.

    source4

    The fourth upstream Publisher to subscribe to.

    source5

    The fifth upstream Publisher to subscribe to.

    returns

    a zipped Flux

  87. def zip[T1, T2, T3, T4](source1: Publisher[_ <: T1], source2: Publisher[_ <: T2], source3: Publisher[_ <: T3], source4: Publisher[_ <: T4]): Flux[(T1, T2, T3, T4)]

    Permalink

    "Step-Merge" especially useful in Scatter-Gather scenarios.

    "Step-Merge" especially useful in Scatter-Gather scenarios. The operator will forward all combinations of the most recent items emitted by each source until any of them completes. Errors will immediately be forwarded.

    T1

    type of the value from source1

    T2

    type of the value from source2

    T3

    type of the value from source3

    T4

    type of the value from source4

    source1

    The first upstream Publisher to subscribe to.

    source2

    The second upstream Publisher to subscribe to.

    source3

    The third upstream Publisher to subscribe to.

    source4

    The fourth upstream Publisher to subscribe to.

    returns

    a zipped Flux

  88. def zip[T1, T2, T3](source1: Publisher[_ <: T1], source2: Publisher[_ <: T2], source3: Publisher[_ <: T3]): Flux[(T1, T2, T3)]

    Permalink

    "Step-Merge" especially useful in Scatter-Gather scenarios.

    "Step-Merge" especially useful in Scatter-Gather scenarios. The operator will forward all combinations of the most recent items emitted by each source until any of them completes. Errors will immediately be forwarded.

    T1

    type of the value from source1

    T2

    type of the value from source2

    T3

    type of the value from source3

    source1

    The first upstream Publisher to subscribe to.

    source2

    The second upstream Publisher to subscribe to.

    source3

    The third upstream Publisher to subscribe to.

    returns

    a zipped Flux

  89. def zip[T1, T2](source1: Publisher[_ <: T1], source2: Publisher[_ <: T2]): Flux[(T1, T2)]

    Permalink

    "Step-Merge" especially useful in Scatter-Gather scenarios.

    "Step-Merge" especially useful in Scatter-Gather scenarios. The operator will forward all combinations of the most recent items emitted by each source until any of them completes. Errors will immediately be forwarded.

    T1

    type of the value from source1

    T2

    type of the value from source2

    source1

    The first upstream Publisher to subscribe to.

    source2

    The second upstream Publisher to subscribe to.

    returns

    a zipped Flux

  90. def zip[T1, T2, O](source1: Publisher[_ <: T1], source2: Publisher[_ <: T2], combinator: (T1, T2) ⇒ O): Flux[O]

    Permalink

    "Step-Merge" especially useful in Scatter-Gather scenarios.

    "Step-Merge" especially useful in Scatter-Gather scenarios. The operator will forward all combinations produced by the passed combinator function of the most recent items emitted by each source until any of them completes. Errors will immediately be forwarded.

    T1

    type of the value from source1

    T2

    type of the value from source2

    O

    The produced output after transformation by the combinator

    source1

    The first upstream Publisher to subscribe to.

    source2

    The second upstream Publisher to subscribe to.

    combinator

    The aggregate function that will receive a unique value from each upstream and return the value to signal downstream

    returns

    a zipped Flux
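    The zip semantics mirror Scala's own collection zip: pair the i-th element from each source and stop as soon as the shorter source completes. This plain-Scala analogy (no Reactor dependency) covers both the tuple variants and the combinator variant above:

    ```scala
    // Plain-Scala analogy: zip pairs elements positionally and ends when
    // the shorter source completes; the combinator variant additionally
    // maps each pair through a function.
    val letters = Seq("a", "b", "c")
    val numbers = Seq(1, 2) // shorter source ends the zip

    val tupled   = letters.zip(numbers)
    val combined = letters.zip(numbers).map { case (s, n) => s * n }
    ```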

Inherited from AnyRef

Inherited from Any

Ungrouped