package core


Type Members

  1. trait Activator [N] extends (N) ⇒ N with UFunc with MappingUFunc with Serializable

    The activator function with its derivative.

  2. case class Approximation (Δ: Double) extends Serializable with Product

    Approximates with precision Δ.
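The idea behind Approximation can be sketched with a central-difference gradient estimate, where Δ is the step size. This is a self-contained illustration of the technique, not the library's internal code:

```scala
// Central-difference approximation of a derivative with precision delta:
// (f(x + delta) - f(x - delta)) / (2 * delta). Illustrative only.
def approxGradient(f: Double => Double, x: Double, delta: Double): Double =
  (f(x + delta) - f(x - delta)) / (2 * delta)

// e.g. the derivative of x^2 at x = 3 is approximately 6
```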

  3. trait BaseOps extends AnyRef
  4. trait Constructor [+T <: Network[_, _]] extends AnyRef

A minimal constructor for a Network.

    Annotations
    @implicitNotFound( ... )
  5. case class Convolution (dimIn: (Int, Int, Int), field: (Int, Int), filters: Int, stride: Int, activator: Activator[Double]) extends Hidden with HasActivator[Double] with Layer with In with Product with Serializable

Convolves the input volume, where: the input dimension dimIn is (width, height, depth); the receptive field is (width, height); filters is the number of independent filters attached to the input; the filters slide over the input volume using a stride; the activator function is applied element-wise on the output.
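The output volume of such a layer can be derived from the constructor parameters with the standard convolution formula, assuming no padding (the library's exact behavior may differ). A self-contained sketch with illustrative names:

```scala
// Hypothetical helper: output dimensions of a convolutional layer with
// no padding, (width, height) shrinking by the field and stride, depth
// becoming the number of filters.
case class Shape(width: Int, height: Int, depth: Int)

def outDim(dimIn: (Int, Int, Int), field: (Int, Int), filters: Int, stride: Int): Shape = {
  val (w, h, _) = dimIn
  val (fw, fh)  = field
  Shape((w - fw) / stride + 1, (h - fh) / stride + 1, filters)
}

// e.g. a 28x28x1 input, 3x3 field, 8 filters, stride 1 yields 26x26x8
```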

  6. trait ConvolutionalNetwork extends Network[core.Network.Matrices, core.Network.Vector]
  7. case class Debuggable () extends Update with Product with Serializable
  8. case class Dense (neurons: Int, activator: Activator[Double]) extends Layer with Hidden with HasActivator[Double] with Product with Serializable

    Dense layer carrying neurons with activator function.

  9. trait DistributedConvolutionalNetwork extends Network[core.Network.Matrices, core.Network.Vector] with DistributedTraining
  10. trait DistributedFeedForwardNetwork extends Network[core.Network.Vector, core.Network.Vector] with DistributedTraining
  11. trait DistributedTraining extends AnyRef
  12. case class EarlyStopping [In, Out](xs: Seq[In], ys: Seq[Out], factor: Double) extends Regularization with Product with Serializable

Continuously computes the average error for the given test input and output xs, ys. If the error moves too far away from the best result, measured in terms of a distance factor, the training process stops early to avoid over-training.
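The stopping rule described above can be sketched as follows. This is a minimal illustration of the idea, with hypothetical names, not the library's EarlyStoppingLogic:

```scala
// Track the best (lowest) average test error seen so far and signal a
// stop once the current error drifts beyond best * factor.
class EarlyStop(factor: Double) {
  private var best = Double.PositiveInfinity

  /** Returns true if training should stop early. */
  def shouldStop(currentError: Double): Boolean = {
    if (currentError < best) best = currentError
    currentError > best * factor
  }
}
```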

  13. trait EarlyStoppingLogic extends AnyRef
  14. trait EndsWith [L <: HList, +Predicate] extends AnyRef

Type-class witnessing that the last item within HList L is Predicate.

    Annotations
    @implicitNotFound( ... )
  15. trait ErrorFuncGrapher extends AnyRef

    Since

    10.07.16

  16. case class ErrorFuncOutput (file: Option[String] = None, action: Option[(Double) ⇒ Unit] = None) extends Product with Serializable
  17. trait FeedForwardNetwork extends Network[core.Network.Vector, core.Network.Vector]
  18. case class Focus (inner: Layer with HasActivator[Double]) extends Layer with Product with Serializable

Focus is used if the desired model output is not the Out layer, but a hidden one (AutoEncoders, PCA, ...).

  19. trait HasActivator [N] extends AnyRef

    Label for neurons in the network performing a function on their synapses.

  20. sealed trait Hidden extends AnyRef
  21. trait IllusionBreaker extends AnyRef
  22. sealed trait In extends AnyRef
  23. case class Input (neurons: Int) extends Layer with In with Product with Serializable

    Dense input layer carrying neurons.

  24. trait KeepBestLogic extends AnyRef
  25. sealed trait Layer extends Serializable

    Base-label for all layers.

  26. case class Momentum (μ: Double = 0.9) extends Update with Product with Serializable
  27. trait Network [In, Out] extends (In) ⇒ Out with Logs with ErrorFuncGrapher with IllusionBreaker with Welcoming with Serializable
  28. case class Node (host: String, port: Int) extends Product with Serializable

Distributed training node.

  29. sealed trait Out extends AnyRef
  30. case class Output (neurons: Int, activator: Activator[Double]) extends Layer with HasActivator[Double] with Out with Product with Serializable

    Dense output layer carrying neurons with activator function.

  31. trait RecurrentNetwork extends Network[core.Network.Vectors, core.Network.Vectors]
  32. trait Regularization extends Serializable

Marker trait for regulators.

  33. case class Settings (verbose: Boolean = true, learningRate: Network.Learning = { case (_, _) => 1E-4 }, updateRule: Update = Vanilla, precision: Double = 1E-5, iterations: Int = 100, prettyPrint: Boolean = false, parallelism: Int = ..., coordinator: Node = Node("0.0.0.0", 2552), transport: Transport = Transport(100000, "128 MiB"), batchSize: Option[Int] = None, errorFuncOutput: Option[ErrorFuncOutput] = None, regularization: Option[Regularization] = None, waypoint: Option[Waypoint] = None, approximation: Option[Approximation] = None, partitions: Option[Set[Int]] = None, specifics: Option[Map[String, Double]] = None) extends Serializable with Product

The verbose flag indicates logging behavior. The learningRate is a function from current iteration and learning rate, producing a new learning rate. The updateRule defines the relationship between gradient, weights and learning rate during training. The network terminates either when precision is high enough or iterations is reached. If prettyPrint is true, the layout is rendered graphically. The level of parallelism controls how many threads are used for training. For distributed training, coordinator and transport specific settings may be configured. The batchSize controls how many samples are presented per weight update (1 = on-line, ..., n = full-batch). The errorFuncOutput option prints the error function graph to the specified file/closure. When regularization is provided, the respective regulator tries to avoid over-fitting. A waypoint action can be specified, e.g. saving the weights along the way. With approximation, the gradients are approximated numerically. With partitions, a sequential training sequence can be partitioned for RNNs (0-indexed). Some nets use specific parameters set in the specifics map.
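The learningRate shape described above can be sketched in isolation. Assuming Network.Learning is a partial function over (iteration, current rate), which matches the documented default `{ case (_, _) => 1E-4 }`, two schedules might look like this:

```scala
// Constant rate, in the shape of the documented default:
val constantRate: PartialFunction[(Int, Double), Double] = { case (_, _) => 1e-4 }

// Step decay (illustrative): halve the rate every 100 iterations,
// otherwise keep it unchanged.
val stepDecay: PartialFunction[(Int, Double), Double] = {
  case (iter, rate) if iter > 0 && iter % 100 == 0 => rate / 2
  case (_, rate)                                   => rate
}
```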

  34. trait StartsWith [L <: HList, +Predicate] extends AnyRef

Type-class witnessing that the first item within HList L is Predicate.

    Annotations
    @implicitNotFound( ... )
  35. case class Transport (messageGroupSize: Int, frameSize: String) extends Product with Serializable

The messageGroupSize controls how many weights per batch will be sent. The frameSize is the maximum message size for inter-node communication.

  36. trait TypeAliases extends AnyRef

    For the sake of beauty.

  37. trait Update extends (core.Network.Matrix, core.Network.Matrix, Double, Int) ⇒ Unit

    Since

    09.09.17
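An update rule in the shape of this trait (weights, gradient, learning rate, position, mutating the weights in place) can be sketched with plain arrays standing in for the library's matrix type. The per-position velocity buffer is an assumption for illustration, mirroring the classic momentum rule behind the Momentum case class:

```scala
// Hypothetical momentum update: v <- mu * v - rate * gradient,
// weights <- weights + v, keeping one velocity buffer per position.
class MomentumUpdate(mu: Double = 0.9) {
  private val velocity = scala.collection.mutable.Map.empty[Int, Array[Double]]

  def apply(weights: Array[Double], gradient: Array[Double], rate: Double, position: Int): Unit = {
    val v = velocity.getOrElseUpdate(position, Array.fill(weights.length)(0.0))
    for (i <- weights.indices) {
      v(i) = mu * v(i) - rate * gradient(i)
      weights(i) += v(i)
    }
  }
}
```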

  38. case class Waypoint (nth: Int, action: (Network.Weights) ⇒ Unit) extends Product with Serializable

    Performs action on every nth iteration during training.
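The waypoint idea can be sketched as a training loop that fires the action on every nth iteration. Names are illustrative; this is not the library's internal loop:

```scala
// Hypothetical sketch: invoke `action` on the current weights on every
// nth iteration of a training loop.
case class Waypoint[W](nth: Int, action: W => Unit)

def train[W](iterations: Int, weights: W, waypoint: Waypoint[W]): Unit =
  for (i <- 1 to iterations)
    if (i % waypoint.nth == 0) waypoint.action(weights)
```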

  39. trait WaypointLogic extends AnyRef
  40. trait WeightProvider extends (Seq[Layer]) ⇒ core.Network.Weights

A WeightProvider connects the neurons of a Layer through the weights, or synapses.

    Annotations
    @implicitNotFound( ... )
  41. trait Welcoming extends AnyRef

    Since

    09.07.16

Value Members

  1. object Activator extends Serializable

    Activator functions.

  2. object CNN extends BaseOps
  3. object Convolution extends Serializable
  4. object EarlyStoppingLogic
  5. object EndsWith
  6. object FFN extends BaseOps
  7. object IllusionBreaker
  8. object KeepBest extends Regularization with Product with Serializable

The KeepBest regularization strategy keeps the weights that led to the least error during training. In particular, this is useful for nets whose gradients oscillate heavily during training.

  9. object Network extends TypeAliases with Serializable

    Since

    03.01.16

  10. object RNN extends BaseOps
  11. object StartsWith
  12. object Vanilla extends Update with Product with Serializable
