Package monix.eval

package eval


Type Members

  1. abstract class Callback[-A] extends Listener[A] with (Try[A]) ⇒ Unit

    Represents a callback that should be called asynchronously with the result of a computation. Used by Task to signal the completion of asynchronous computations on runAsync.

    The onSuccess method should be called only once, with the successful result, whereas onError should be called if the result is an error.
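
    As a quick illustration, here is a minimal sketch of a hand-rolled callback; the printer instance below is just an example, not part of the library:

    import monix.eval.Callback
    
    // Implementing the two methods described above:
    val printer = new Callback[Int] {
      def onSuccess(value: Int): Unit =
        println(s"Got value: $value")
      def onError(ex: Throwable): Unit =
        println(s"Failed with: $ex")
    }
    
    printer.onSuccess(42) //=> Got value: 42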

  2. sealed abstract class Coeval[+A] extends () ⇒ A with Serializable

    Coeval represents lazy computations that can execute synchronously.

    Word definition and origin:

    • Having the same age or date of origin; a contemporary; synchronous.
    • From the Latin "coævus": com- ("equal") in combination with aevum ("age").
    • The constructor of Coeval is the dual of an expression that evaluates to an A.

    There are three evaluation strategies:

    • now or raiseError: for describing strict values, evaluated immediately
    • evalOnce: expressions evaluated a single time
    • eval: expressions evaluated every time the value is needed

    Once and Always are both lazy strategies, while Now and Error are eager. Once and Always are distinguished from each other only by memoization: once evaluated, Once will save the value to be returned immediately if it is needed again, whereas Always will run its computation every time.

    Both Now and Error are represented by the Eager trait, a sub-type of Coeval that can be used as a replacement for Scala's own Try type.

    Coeval supports stack-safe lazy computation via the .map and .flatMap methods, which use an internal trampoline to avoid stack overflows. Computations done within .map and .flatMap are always lazy, even when applied to a Coeval.Eager instance (e.g. Coeval.Now, Coeval.Error).
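
    For example, here is a quick sketch of the trampoline at work (the sum helper below is illustrative only): a deeply recursive computation built out of flatMap calls does not overflow the stack:

    // Sums the numbers n, n-1, ..., 1 using only flatMap recursion
    def sum(n: Int, acc: Long = 0): Coeval[Long] =
      if (n == 0) Coeval.now(acc)
      else Coeval.eval(n.toLong).flatMap(x => sum(n - 1, acc + x))
    
    sum(100000).value //=> 5000050000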

    Evaluation Strategies

    The "now" and "raiseError" builders are building Coeval instances out of strict values:

    val fa = Coeval.now(1)
    fa.value //=> 1
    
    val fe = Coeval.raiseError(new DummyException("dummy"))
    fe.value //=> throws DummyException

    The "always" strategy is equivalent with a plain function:

    // For didactic purposes, don't use shared vars at home :-)
    var i = 0
    val fa = Coeval.eval { i += 1; i }
    
    fa.value //=> 1
    fa.value //=> 2
    fa.value //=> 3

    The "once" strategy is equivalent with Scala's lazy val (along with thread-safe idempotency guarantees):

    var i = 0
    val fa = Coeval.evalOnce { i += 1; i }
    
    fa.value //=> 1
    fa.value //=> 1
    fa.value //=> 1

    Versus Task

    The other option of suspending side-effects is Task. As a quick comparison:

    • Coeval's execution is always immediate / synchronous, whereas Task can describe asynchronous computations
    • Coeval is not cancelable, obviously, since execution is immediate and there's nothing to cancel

    Versus cats.Eval

    The Coeval data type is very similar to cats.Eval. As a quick comparison:

    • cats.Eval is only for controlling laziness, but it doesn't handle side effects, hence cats.Eval is a Comonad
    • Monix's Coeval can handle side effects as well and thus it implements MonadError[Coeval, Throwable] and cats.effect.Sync, providing error-handling utilities

    If you just want to delay the evaluation of a pure expression use cats.Eval, but if you need to suspend side effects or you need error handling capabilities, then use Coeval.
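
    As a sketch of the error handling mentioned above (assuming the onErrorHandle operator, which mirrors the one on Task):

    // Evaluation will throw an ArithmeticException
    val risky: Coeval[Int] = Coeval.eval(10 / 0)
    
    // Recover with a fallback value
    val recovered = risky.onErrorHandle(_ => 0)
    recovered.value //=> 0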

  3. abstract class MVar[A] extends AnyRef

    A mutable location that is either empty or contains a value of type A.

    It has 2 fundamental atomic operations:

    • put which fills the var if empty, or blocks (asynchronously) until the var is empty again
    • take which empties the var if full, returning the contained value, or blocks (asynchronously) otherwise until there is a value to pull

    The MVar is appropriate for building synchronization primitives and performing simple inter-thread communications. If it helps, it's similar to a BlockingQueue(capacity = 1), except that it doesn't block any threads, all waiting being done asynchronously by means of Task.

    Given its asynchronous, non-blocking nature, it can be used on top of JavaScript as well.

    Inspired by Control.Concurrent.MVar from Haskell and by scalaz.concurrent.MVar.
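
    A minimal hand-off sketch, assuming the MVar.empty builder together with the put and take operations described above:

    import monix.eval.{MVar, Task}
    
    val channel: MVar[Int] = MVar.empty[Int]
    
    val handOff: Task[Int] =
      for {
        _ <- channel.put(42)  // fills the empty var
        v <- channel.take     // empties it, yielding 42
      } yield v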

  4. sealed abstract class Task[+A] extends Serializable

    Task represents a specification for a possibly lazy or asynchronous computation, which when executed will produce an A as a result, along with possible side-effects.

    Compared with Future from Scala's standard library, Task does not represent a running computation or a value detached from time. Task does not execute anything when working with its builders or operators and it does not submit work to any thread-pool; execution takes place only after runAsync is called, and not before that.

    Note that Task is conservative in how it spawns logical threads. Transformations like map and flatMap will, by default, execute on the logical thread on which the asynchronous computation was started. But one shouldn't make assumptions about how things will end up executed, as ultimately it is the implementation's job to decide on the best execution model. All you are guaranteed is asynchronous execution after executing runAsync.

    Getting Started

    To build a Task from by-name parameters (thunks), we can use Task.eval or Task.apply:

    val hello = Task.eval("Hello ")
    val world = Task("World!")

    Nothing gets executed yet: Task is lazy and nothing runs until you trigger runAsync on it.

    To combine Task values we can use .map and .flatMap, which describe sequencing; because of the laziness involved, that sequencing is very real:

    val sayHello = hello
      .flatMap(h => world.map(w => h + w))
      .map(println)

    This Task reference will trigger a side effect on evaluation, but not yet. To make the above print its message:

    import monix.execution.CancelableFuture
    // A Scheduler is needed in implicit scope to execute tasks via runAsync
    import monix.execution.Scheduler.Implicits.global
    
    val f: CancelableFuture[Unit] = sayHello.runAsync
    //=> Hello World!

    The returned type is a CancelableFuture which inherits from Scala's standard Future, a value that can be completed already or might be completed at some point in the future, once the running asynchronous process finishes. Such a future value can also be canceled, see below.

    Laziness

    The fact that Task is lazy whereas Future is not has real consequences. For example with Task you can do this:

    import scala.concurrent.duration._
    
    def retryOnFailure[A](times: Int, source: Task[A]): Task[A] =
      source.onErrorHandleWith { err =>
        // No more retries left? Re-throw error:
        if (times <= 0) Task.raiseError(err) else {
          // Recursive call, yes we can!
          retryOnFailure(times - 1, source)
            // Adding 500 ms delay for good measure
            .delayExecution(500.millis)
        }
      }

    Because Future is a strict value-wannabe, the actual value gets "memoized" (i.e. cached), whereas Task is basically a function that can be repeated as many times as you want. Task can also do memoization, of course:

    task.memoize

    The difference between this and just calling runAsync() is that memoize() still returns a Task and the actual memoization happens on the first runAsync() (with idempotency guarantees of course).

    But here's something else that the Future data type cannot do:

    task.memoizeOnSuccess

    This keeps repeating the computation for as long as the result is a failure and caches it only on success. Yes we can!
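
    As a rough sketch of the difference (assuming an implicit Scheduler is in scope for runAsync):

    import monix.execution.Scheduler.Implicits.global
    
    var i = 0
    val effect = Task.eval { i += 1; i }
    
    effect.runAsync // triggers evaluation, incrementing i
    effect.runAsync // evaluates the thunk again
    
    val memoized = effect.memoize
    memoized.runAsync // evaluates once, on the first run
    memoized.runAsync // reuses the cached result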

    Parallelism

    Because of laziness, invoking Task.sequence will not work like it does for Future.sequence, the given Task values being evaluated one after another, in sequence, not in parallel. If you want parallelism, then you need to use Task.gather and thus be explicit about it.

    This is great because it gives you the possibility of fine tuning the execution. For example, say you want to execute things in parallel, but with a maximum limit of 30 tasks being executed in parallel. One way of doing that is to process your list in batches:

    // Some array of tasks, you come up with something good :-)
    val list: Seq[Task[Int]] = ???
    
    // Split our list in chunks of 30 items per chunk,
    // this being the maximum parallelism allowed
    val chunks = list.sliding(30, 30)
    
    // Specify that each batch should process stuff in parallel
    val batchedTasks = chunks.map(chunk => Task.gather(chunk))
    // Sequence the batches
    val allBatches = Task.sequence(batchedTasks)
    
    // Flatten the result, within the context of Task
    val all: Task[Seq[Int]] = allBatches.map(_.flatten)

    Note that the built Task reference is just a specification at this point, or you can view it as a function, as nothing has executed yet; you need to call .runAsync explicitly.

    Cancellation

    The logic described by a Task could be cancelable, depending on how the Task gets built.

    CancelableFuture references can also be canceled, in case the described computation can be canceled. Tasks described with Task.eval cannot be canceled, since there's nothing about a plain function that you can cancel, but we can build cancelable tasks with Task.async (alias Task.create):

    import scala.concurrent.duration._
    import scala.util.Success
    import monix.execution.Cancelable
    
    val delayedHello = Task.async[Unit] { (scheduler, callback) =>
      val task = scheduler.scheduleOnce(1.second) {
        println("Delayed Hello!")
        // Signaling successful completion
        callback(Success(()))
      }
    
      Cancelable { () =>
        println("Cancelling!")
        task.cancel()
      }
    }

    The sample above prints a message with a delay, where the delay itself is scheduled with the injected Scheduler. The Scheduler is in fact an implicit parameter to runAsync().

    This action can be canceled because it specifies cancellation logic. If we had no cancellation logic to express, it would be OK to return a Cancelable.empty reference, in which case the resulting Task would not be cancelable.

    But the Task we just described is cancelable:

    // Triggering execution
    val f: CancelableFuture[Unit] = delayedHello.runAsync
    
    // If we change our mind before the timespan has passed:
    f.cancel()

    Also, given a Task, we can specify actions that need to be triggered in case of cancellation:

    val task = Task.eval(println("Hello!"))
      .executeWithFork
      .doOnCancel(Task.eval {
        println("A cancellation attempt was made!")
      })
    
    val f: CancelableFuture[Unit] = task.runAsync
    
    // Note that in this case cancelling the resulting Future
    // will not stop the actual execution, since it doesn't know
    // how, but it will trigger our on-cancel callback:
    
    f.cancel()
    //=> A cancellation attempt was made!

    Note on the ExecutionModel

    Task is conservative in how it introduces async boundaries. Transformations like map and flatMap will, by default, execute on the current call stack on which the asynchronous computation was started. But one shouldn't make assumptions about how things will end up executed, as ultimately it is the implementation's job to decide on the best execution model. All you are guaranteed (and can assume) is asynchronous execution after executing runAsync().

    Currently the default ExecutionModel specifies batched execution, and Task respects the ExecutionModel injected by its Scheduler during evaluation. If you want a different behavior, you need to execute the Task reference with a different Scheduler.
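
    For example, here is a sketch of overriding the execution model, assuming Scheduler's withExecutionModel builder and the AlwaysAsyncExecution model (exact import paths may differ between Monix versions):

    import monix.execution.Scheduler
    import monix.execution.ExecutionModel.AlwaysAsyncExecution
    
    // Every evaluation step will now introduce an async boundary
    implicit val io: Scheduler =
      Scheduler.global.withExecutionModel(AlwaysAsyncExecution)
    
    Task.eval(1).flatMap(x => Task.eval(x + 1)).runAsync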

  5. trait TaskApp extends AnyRef

    Safe App type that runs a Task action.

    Clients should implement run, runl, or runc.

    Also available for Scala.js, but without the ability to take arguments and without the blocking in main.
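
    A minimal sketch, implementing the no-argument runc variant mentioned above (run and runl receive the command-line arguments instead):

    import monix.eval.{Task, TaskApp}
    
    object HelloApp extends TaskApp {
      override def runc: Task[Unit] =
        Task.eval(println("Hello from TaskApp!"))
    }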

  6. final class TaskCircuitBreaker extends AnyRef

    The TaskCircuitBreaker is used to provide stability and prevent cascading failures in distributed systems.

    Purpose

    As an example, say we have a web application interacting with a remote third party web service. Let's say the third party has oversold their capacity and their database melts down under load. Assume that the database fails in such a way that it takes a very long time to hand back an error to the third party web service. This in turn makes calls fail after a long period of time. Back to our web application, the users have noticed that their form submissions take much longer, seeming to hang. The users do what they know to do, which is hit the refresh button, adding more requests to their already running requests. This eventually causes the failure of the web application due to resource exhaustion, affecting all users, even those who are not using functionality dependent on this third party web service.

    Introducing circuit breakers on the web service call would cause the requests to begin to fail fast, letting the user know that something is wrong and that they need not refresh their request. This also confines the failure behavior to only those users that are using functionality dependent on the third party; other users are no longer affected, as there is no resource exhaustion. Circuit breakers can also allow savvy developers to mark portions of the site that use the functionality as unavailable, or perhaps show some cached content as appropriate while the breaker is open.

    How It Works

    The circuit breaker models a concurrent state machine that can be in any of these 3 states:

    1. Closed: During normal operations or when the TaskCircuitBreaker starts
      • Exceptions increment the failures counter
      • Successes reset the failure count to zero
      • When the failures counter reaches the maxFailures count, the breaker is tripped into Open state
    2. Open: The circuit breaker rejects all tasks with an ExecutionRejectedException
      • all tasks fail fast with ExecutionRejectedException
      • after the configured resetTimeout, the circuit breaker enters a HalfOpen state, allowing one task to go through for testing the connection
    3. HalfOpen: The circuit breaker has already allowed a task to go through, as a reset attempt, in order to test the connection
      • The first task attempted after the Open state's timeout has expired is allowed through without failing fast, just before the circuit breaker transitions into the HalfOpen state
      • All tasks attempted in HalfOpen fail-fast with an exception just as in Open state
      • If that task attempt succeeds, the breaker is reset back to the Closed state, with the resetTimeout and the failures count also reset to initial values
      • If the first call fails, the breaker is tripped again into the Open state (the resetTimeout is multiplied by the exponential backoff factor)

    Usage

    import monix.eval._
    import scala.concurrent.duration._
    
    val circuitBreaker = TaskCircuitBreaker(
      maxFailures = 5,
      resetTimeout = 10.seconds
    )
    
    //...
    val problematic = Task {
      val nr = util.Random.nextInt()
      if (nr % 2 == 0) nr else
        throw new RuntimeException("dummy")
    }
    
    val task = circuitBreaker.protect(problematic)

    When attempting to close the circuit breaker and resume normal operations, we can also apply an exponential backoff for repeated failed attempts, like so:

    val circuitBreaker = TaskCircuitBreaker(
      maxFailures = 5,
      resetTimeout = 10.seconds,
      exponentialBackoffFactor = 2,
      maxResetTimeout = 10.minutes
    )

    In this sample we attempt to reconnect after 10 seconds, then after 20, 40 and so on, a delay that keeps increasing up to a configurable maximum of 10 minutes.

    Credits

    This Monix data type was inspired by the availability of Akka's Circuit Breaker.

  7. final class TaskSemaphore extends Serializable

    The TaskSemaphore is an asynchronous semaphore implementation that limits the parallelism on task execution.

    The following example instantiates a semaphore with a maximum parallelism of 10:

    val semaphore = TaskSemaphore(maxParallelism = 10)
    
    def makeRequest(r: HttpRequest): Task[HttpResponse] = ???
    
    // For such a task no more than 10 requests
    // are allowed to be executed in parallel.
    val task = semaphore.greenLight(makeRequest(???))

Value Members

  1. object Callback extends Serializable

  2. object Coeval extends Serializable

    Coeval builders.

  3. object MVar

  4. object Task extends TaskInstances with Serializable

    Builders for Task.

  5. object TaskCircuitBreaker

  6. object TaskSemaphore extends Serializable

  7. package instances