Companion: class
Type members
Classlikes
Value members
Concrete methods
Aggregates elements of this stream using the provided sink for as long as the downstream operators on the stream are busy.
This operator divides the stream into two asynchronous "islands". Operators upstream of this operator run on one fiber, while downstream operators run on another. Whenever the downstream fiber is busy processing elements, the upstream fiber will feed elements into the sink until it signals completion.
Any sink can be used here, but see ZSink.foldWeightedZIO and ZSink.foldUntilZIO for sinks that cover the common usecases.
Like aggregateAsyncWithinEither, but only returns the Right results.
Aggregates elements using the provided sink until it completes, or until the delay signalled by the schedule has passed.
This operator divides the stream into two asynchronous islands. Operators upstream of this operator run on one fiber, while downstream operators run on another. Elements will be aggregated by the sink until the downstream fiber pulls the aggregated value, or until the schedule's delay has passed.
Aggregated elements will be fed into the schedule to determine the delays between pulls.
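As a sketch of how the sink and schedule interact (assuming the ZIO 2 ZStream#aggregateAsyncWithin API), this batches up to 100 elements per pull, flushing a partial batch after at most one second:

```scala
import zio._
import zio.stream._

// ZSink.collectAllN completes once 100 elements have been collected;
// Schedule.fixed(1.second) bounds how long a partial batch may wait.
val batched: ZStream[Any, Nothing, Chunk[Int]] =
  ZStream
    .fromIterable(1 to 1000)
    .aggregateAsyncWithin(ZSink.collectAllN[Int](100), Schedule.fixed(1.second))
```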
A shorter version of ZPipeline.identity, which can facilitate more compact definition of pipelines.
ZPipeline[Int] >>> ZPipeline.filter(_ % 2 != 0)
A dynamic pipeline that first collects n elements from the stream, then creates another pipeline with the function f and sends all the following elements through that.
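The idea can be sketched as follows, assuming a branchAfter(n)(f) signature where f receives the first n elements as a Chunk (the routing logic here is purely illustrative):

```scala
import zio._
import zio.stream._

// Inspect the first element to decide how to process the rest:
// if the stream starts with 0, double the remaining elements,
// otherwise pass them through unchanged.
val routed: ZPipeline[Any, Nothing, Int, Int] =
  ZPipeline.branchAfter[Any, Nothing, Int, Int](1) { header =>
    if (header.headOption.contains(0)) ZPipeline.map[Int, Int](_ * 2)
    else ZPipeline.identity[Int]
  }
```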
Creates a pipeline that exposes the chunk structure of the stream.
Creates a pipeline that collects elements with the specified partial function.
ZPipeline.collect[Option[Int], Int] { case Some(v) => v }
Delays the emission of values by holding new values for a set duration. If no new values arrive during that time, the value is emitted; however, if a new value is received during the holding period, the previous value is discarded and the process is repeated with the new value.
This operator is useful if you have a stream of "bursty" events which eventually settle down and you only need the final event of the burst.
- Example:
A search engine may only want to initiate a search after a user has paused typing so as to not prematurely recommend results.
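The search-box scenario might look like this (a sketch assuming ZStream#debounce; the keystrokes stream is hypothetical):

```scala
import zio._
import zio.stream._

// keystrokes is a hypothetical stream of the query text as the user types.
// A query is emitted only after 300ms pass with no new keystrokes.
def searchQueries(keystrokes: ZStream[Any, Nothing, String]): ZStream[Any, Nothing, String] =
  keystrokes.debounce(300.millis)
```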
Creates a pipeline that decodes a stream of bytes into a stream of characters using the given charset.
Creates a pipeline that decodes a stream of bytes into a stream of strings using the given charset.
Creates a pipeline that drops elements until the specified predicate evaluates to true.
ZPipeline.dropUntil[Int](_ > 100)
Drops incoming elements until the effectful predicate p is satisfied.
Creates a pipeline that drops elements while the specified predicate evaluates to true.
ZPipeline.dropWhile[Int](_ <= 100)
Drops incoming elements as long as the effectful predicate p is satisfied.
Creates a pipeline that converts a stream of characters into a stream of bytes using the given charset.
Creates a pipeline that converts a stream of strings into a stream of bytes using the given charset.
Accesses the environment of the pipeline in the context of a pipeline.
Creates a pipeline that filters elements according to the specified predicate.
Creates a pipeline that submerges chunks into the structure of the stream.
Creates a pipeline that converts exit results into a stream of values, with failure terminating the stream.
Creates a pipeline that submerges iterables into the structure of the stream.
Creates a pipeline that flattens a stream of streams into a single stream of values. The streams are merged in parallel up to the specified maximum concurrency and will buffer up to the specified output buffer size.
Creates a pipeline that sends all the elements through the given channel.
Creates a pipeline from a chunk processing function.
Creates a pipeline that repeatedly sends all elements through the given sink.
Creates a pipeline that groups on adjacent keys, calculated by function f.
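For example (a sketch assuming groupAdjacentBy emits key/group pairs), runs of equal adjacent elements can be collapsed by keying on the element itself:

```scala
import zio._
import zio.stream._

// Groups adjacent elements that share a key. With the element itself as
// the key, 1,1,2,2,2,1 becomes (1, <1,1>), (2, <2,2,2>), (1, <1>).
val grouped =
  ZStream(1, 1, 2, 2, 2, 1)
    .via(ZPipeline.groupAdjacentBy((n: Int) => n))
```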
Partitions the stream with the specified chunkSize, or until the specified duration has passed, whichever is satisfied first.
The identity pipeline, which does not modify streams in any way.
Creates a pipeline that converts a stream of bytes into a stream of strings using the ISO_8859_1 charset.
Creates a pipeline that converts a stream of strings into a stream of bytes using the ISO_8859_1 charset.
Creates a pipeline that maps elements with the specified function.
Creates a pipeline that statefully maps elements with the specified function.
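For instance (a sketch, assuming a mapAccum(initialState)(step) shape), a running sum keeps the accumulated total as state and emits it for every element:

```scala
import zio._
import zio.stream._

// State is the running total; each element updates it and is replaced by it,
// so 1, 2, 3 would flow through as 1, 3, 6.
val runningSum =
  ZPipeline.mapAccum(0)((sum: Int, n: Int) => (sum + n, sum + n))
```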
Creates a pipeline that statefully maps elements with the specified effect.
Creates a pipeline that maps chunks of elements with the specified function.
Creates a pipeline that maps chunks of elements with the specified effect.
Creates a pipeline that maps elements with the specified function that returns a stream.
Creates a pipeline that maps elements with the specified effectful function.
Emits the provided chunk before emitting any other value.
A pipeline that rechunks the stream into chunks of the specified size.
Creates a pipeline that scans elements with the specified function.
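As an illustration (a sketch, assuming scan mirrors ZStream#scan in emitting the seed first, then every intermediate result):

```scala
import zio._
import zio.stream._

// Like foldLeft, but every intermediate result is emitted, starting with
// the seed: 1, 2, 3 would flow through as 0, 1, 3, 6.
val sums =
  ZPipeline.scan(0)((acc: Int, n: Int) => acc + n)
```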
Accesses the specified service in the environment of the pipeline in the context of a pipeline.
Splits strings on newlines. Handles both Windows newlines (\r\n) and UNIX newlines (\n).
Splits strings on a delimiter.
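A sketch of splitting on a delimiter; note that the delimiter is dropped and splitting works across element boundaries:

```scala
import zio._
import zio.stream._

// Split incoming strings on ';'. Records spanning element boundaries are
// reassembled, so "a;b" followed by ";c" yields "a", "b", "c".
val records: ZPipeline[Any, Nothing, String, String] =
  ZPipeline.splitOn(";")
```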
Creates a pipeline that takes elements until the specified predicate evaluates to true.
Creates a pipeline that takes elements while the specified predicate evaluates to true.
Adds an effect to consumption of every element of the pipeline.
Throttles the chunks of this pipeline according to the given bandwidth parameters using the token bucket algorithm. Allows for bursts in the processing of elements by letting the token bucket accumulate tokens up to a units + burst threshold. Chunks that do not meet the bandwidth constraints are dropped. The weight of each chunk is determined by the costFn function.
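For example (a sketch, assuming a throttleEnforce(units, duration, burst)(costFn) signature), capping throughput at roughly 10 elements per second with room for a small burst:

```scala
import zio._
import zio.stream._

// Refill 10 tokens per second, let the bucket hold up to 10 + 5 tokens,
// and charge each chunk its size in elements. Chunks that exceed the
// available tokens are dropped rather than delayed.
val throttled =
  ZPipeline.throttleEnforce[Int](10, 1.second, burst = 5)(_.size.toLong)
```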
Throttles the chunks of this pipeline according to the given bandwidth parameters using the token bucket algorithm. Allows for bursts in the processing of elements by letting the token bucket accumulate tokens up to a units + burst threshold. Chunks that do not meet the bandwidth constraints are dropped. The weight of each chunk is determined by the costFn effectful function.
Delays the chunks of this pipeline according to the given bandwidth parameters using the token bucket algorithm. Allows for bursts in the processing of elements by letting the token bucket accumulate tokens up to a units + burst threshold. The weight of each chunk is determined by the costFn function.
Delays the chunks of this pipeline according to the given bandwidth parameters using the token bucket algorithm. Allows for bursts in the processing of elements by letting the token bucket accumulate tokens up to a units + burst threshold. The weight of each chunk is determined by the costFn effectful function.
Creates a pipeline that converts a stream of bytes into a stream of strings using the US ASCII charset.
Creates a pipeline that converts a stream of strings into a stream of bytes using the US ASCII charset.
Creates a pipeline that converts a stream of bytes into a stream of strings using the UTF_16BE charset.
Creates a pipeline that converts a stream of strings into a stream of bytes using the UTF_16BE charset, without adding a BOM.
Creates a pipeline that converts a stream of strings into a stream of bytes using the UTF_16BE charset, prefixing it with a BOM.
Creates a pipeline that converts a stream of bytes into a stream of strings using the UTF_16 charset.
Creates a pipeline that converts a stream of strings into a stream of bytes using the UTF_16BE charset, prefixing it with a BOM.
Creates a pipeline that converts a stream of bytes into a stream of strings using the UTF_16LE charset.
Creates a pipeline that converts a stream of strings into a stream of bytes using the UTF_16LE charset, without adding a BOM.
Creates a pipeline that converts a stream of strings into a stream of bytes using the UTF_16LE charset, prefixing it with a BOM.
Creates a pipeline that converts a stream of strings into a stream of bytes using the UTF_16 charset, prefixing it with a BOM.
Creates a pipeline that converts a stream of bytes into a stream of strings using the UTF_32BE charset.
Creates a pipeline that converts a stream of strings into a stream of bytes using the UTF_32BE charset, without adding a BOM.
Creates a pipeline that converts a stream of strings into a stream of bytes using the UTF_32BE charset, prefixing it with a BOM.
Creates a pipeline that converts a stream of bytes into a stream of strings using the UTF_32 charset.
Creates a pipeline that converts a stream of strings into a stream of bytes using the UTF_32BE charset, without adding a BOM.
Creates a pipeline that converts a stream of bytes into a stream of strings using the UTF_32LE charset.
Creates a pipeline that converts a stream of strings into a stream of bytes using the UTF_32LE charset, without adding a BOM.
Creates a pipeline that converts a stream of strings into a stream of bytes using the UTF_32LE charset, prefixing it with a BOM.
Creates a pipeline that converts a stream of strings into a stream of bytes using the UTF_32BE charset, prefixing it with a BOM.
Creates a pipeline that converts a stream of bytes into a stream of strings using the UTF_8 charset.
Creates a pipeline that converts a stream of strings into a stream of bytes using the UTF_8 charset, without adding a BOM.
Creates a pipeline that converts a stream of strings into a stream of bytes using the UTF_8 charset, prefixing it with a BOM.
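These encoders and decoders compose with the splitters above via >>>; a common combination (a sketch) decodes raw UTF-8 bytes and yields one line per element:

```scala
import zio._
import zio.stream._

// Decode UTF-8 bytes into strings, then split the result into lines.
// The composed pipeline turns a stream of bytes into a stream of lines.
val lines = ZPipeline.utf8Decode >>> ZPipeline.splitLines
```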
utfDecode determines the right decoder to use based on the Byte Order Mark (BOM). If it doesn't detect one, it defaults to utf8Decode. For UTF-16 and UTF-32 input without a BOM, utf16Decode and utf32Decode should be used instead, as each falls back to its own default decoder.
Zips this pipeline together with the index of elements.
Zips each element with the next element if present.
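For example (a sketch, shown with the equivalent ZStream#zipWithIndex), pairing each element with its 0-based position:

```scala
import zio._
import zio.stream._

// Pair each element with its position in the stream (the index is a Long):
// "a", "b", "c" becomes ("a", 0L), ("b", 1L), ("c", 2L).
val indexed = ZStream("a", "b", "c").zipWithIndex
```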