package core
Type Members
- trait Activator [N] extends (N) ⇒ N with UFunc with MappingUFunc with Serializable
  The activator function with its derivative.
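An activator is a function bundled with its derivative, so backpropagation can evaluate both. A minimal self-contained sketch of this shape; `SimpleActivator` and `ReLU` are illustrative names, not part of the library:

```scala
// A stand-in for Activator[N]: a function paired with its derivative.
trait SimpleActivator extends (Double => Double) {
  def derivative(x: Double): Double
}

// Rectified linear unit: f(x) = max(0, x), f'(x) = 1 if x > 0 else 0.
object ReLU extends SimpleActivator {
  def apply(x: Double): Double = math.max(0.0, x)
  def derivative(x: Double): Double = if (x > 0) 1.0 else 0.0
}
```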
- case class Approximation (Δ: Double) extends Serializable with Product
  Approximates with precision Δ.
- trait BaseOps extends AnyRef
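Numerical gradient approximation with a step size Δ is typically done via a central difference, f'(x) ≈ (f(x + Δ) − f(x − Δ)) / 2Δ. A self-contained sketch of that idea; `approxDerivative` is illustrative, not a library function:

```scala
// Central-difference approximation of f'(x) with step size delta (Δ).
// Smaller delta means higher precision, up to floating-point limits.
def approxDerivative(f: Double => Double, x: Double, delta: Double): Double =
  (f(x + delta) - f(x - delta)) / (2 * delta)
```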
- trait Constructor [+T <: Network[_, _]] extends AnyRef
  A minimal constructor for a Network.
  Annotations: @implicitNotFound( ... )
- case class Convolution (dimIn: (Int, Int, Int), field: (Int, Int), filters: Int, stride: Int, activator: Activator[Double]) extends Hidden with HasActivator[Double] with Layer with In with Product with Serializable
  Convolves the input volume. The input dimension dimIn is given as (width, height, depth) and the receptive field as (width, height). filters sets how many independent filters are attached to the input; they slide over the input volume using stride. The activator function is applied element-wise on the output.
- trait ConvolutionalNetwork extends Network[core.Network.Matrices, core.Network.Vector]
- case class Debuggable () extends Update with Product with Serializable
- case class Dense (neurons: Int, activator: Activator[Double]) extends Layer with Hidden with HasActivator[Double] with Product with Serializable
  Dense layer carrying neurons with the activator function.
- trait DistributedConvolutionalNetwork extends Network[core.Network.Matrices, core.Network.Vector] with DistributedTraining
- trait DistributedFeedForwardNetwork extends Network[core.Network.Vector, core.Network.Vector] with DistributedTraining
- trait DistributedTraining extends AnyRef
- case class EarlyStopping [In, Out](xs: Seq[In], ys: Seq[Out], factor: Double) extends Regularization with Product with Serializable
  Continuously computes the average error for the given test input and output xs, ys. If the error moves too far away from the best result, measured in terms of a distance factor, the training process stops early to avoid over-training.
- trait EarlyStoppingLogic extends AnyRef
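One plausible reading of the distance factor is a multiplicative bound: training stops once the current test error exceeds the best error seen so far by more than the factor. A hedged sketch of that rule; `shouldStopEarly` is illustrative, not the library's actual logic:

```scala
// Stop when the current average test error has drifted more than `factor`
// times away from the best error observed so far (one possible reading
// of the distance factor described above).
def shouldStopEarly(currentError: Double, bestError: Double, factor: Double): Boolean =
  currentError > bestError * factor
```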
- trait EndsWith [L <: HList, +Predicate] extends AnyRef
  Type-class witnessing that the last item within HList L is Predicate.
  Annotations: @implicitNotFound( ... )
- trait ErrorFuncGrapher extends AnyRef
  Since: 10.07.16
- case class ErrorFuncOutput (file: Option[String] = None, action: Option[(Double) ⇒ Unit] = None) extends Product with Serializable
- trait FeedForwardNetwork extends Network[core.Network.Vector, core.Network.Vector]
- case class Focus (inner: Layer with HasActivator[Double]) extends Layer with Product with Serializable
  Focus is used if the desired model output is not the Out layer, but a hidden one.
- trait HasActivator [N] extends AnyRef
  Label for neurons in the network performing a function on their synapses.
- sealed trait Hidden extends AnyRef
- trait IllusionBreaker extends AnyRef
- sealed trait In extends AnyRef
- case class Input (neurons: Int) extends Layer with In with Product with Serializable
  Dense input layer carrying neurons.
- trait KeepBestLogic extends AnyRef
- sealed trait Layer extends Serializable
  Base-label for all layers.
- case class Momentum (μ: Double = 0.9) extends Update with Product with Serializable
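The momentum rule keeps an exponentially decaying velocity of past gradients, scaled by μ, and adds it to the weights. A self-contained sketch on a flat weight vector; the library's Update trait operates on matrices, and `momentumStep` is an illustrative name:

```scala
// One momentum update step: velocity accumulates past gradients with decay
// factor mu (μ), and the weights move along the velocity.
def momentumStep(weights: Array[Double], gradient: Array[Double],
                 velocity: Array[Double], mu: Double, learningRate: Double): Unit = {
  var i = 0
  while (i < weights.length) {
    velocity(i) = mu * velocity(i) - learningRate * gradient(i)
    weights(i) += velocity(i)
    i += 1
  }
}
```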
- trait Network [In, Out] extends (In) ⇒ Out with Logs with ErrorFuncGrapher with IllusionBreaker with Welcoming with Serializable
- case class Node (host: String, port: Int) extends Product with Serializable
  Distributed training node.
- sealed trait Out extends AnyRef
- case class Output (neurons: Int, activator: Activator[Double]) extends Layer with HasActivator[Double] with Out with Product with Serializable
  Dense output layer carrying neurons with the activator function.
- trait RecurrentNetwork extends Network[core.Network.Vectors, core.Network.Vectors]
- trait Regularization extends Serializable
  Marker trait for regulators.
- case class Settings (verbose: Boolean = true, learningRate: Network.Learning = { case (_, _) => 1E-4 }, updateRule: Update = Vanilla, precision: Double = 1E-5, iterations: Int = 100, prettyPrint: Boolean = false, parallelism: Int = ..., coordinator: Node = Node("0.0.0.0", 2552), transport: Transport = Transport(100000, "128 MiB"), batchSize: Option[Int] = None, errorFuncOutput: Option[ErrorFuncOutput] = None, regularization: Option[Regularization] = None, waypoint: Option[Waypoint] = None, approximation: Option[Approximation] = None, partitions: Option[Set[Int]] = None, specifics: Option[Map[String, Double]] = None) extends Serializable with Product
  The verbose flag indicates logging behavior. The learningRate is a function from current iteration and learning rate, producing a new learning rate. The updateRule defines the relationship between gradient, weights and learning rate during training. The network terminates either when precision is high enough or iterations is reached. If prettyPrint is true, the layout is rendered graphically. The level of parallelism controls how many threads are used for training. For distributed training, coordinator and transport specific settings may be configured. The batchSize controls how many samples are presented per weight update (1 = on-line, ..., n = full-batch). The errorFuncOutput option prints the error function graph to the specified file/closure. When regularization is provided, the respective regulator tries to avoid over-fitting. A waypoint action can be specified, e.g. saving the weights along the way. With approximation the gradients are approximated numerically. With partitions a sequential training sequence can be partitioned for RNNs (0 index-based). Some nets use specific parameters set in the specifics map.
- trait StartsWith [L <: HList, +Predicate] extends AnyRef
  Type-class witnessing that the first item within HList L is Predicate.
  Annotations: @implicitNotFound( ... )
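The learningRate in Settings maps the current iteration and rate to a new rate, with the constant default `{ case (_, _) => 1E-4 }` as the degenerate case. A step-decay schedule as a self-contained example; `stepDecay` is an illustrative name:

```scala
// A learning-rate schedule of the (iteration, rate) => rate shape:
// halve the rate on every 100th iteration, keep it otherwise.
val stepDecay: PartialFunction[(Int, Double), Double] = {
  case (iteration, rate) =>
    if (iteration % 100 == 0) rate * 0.5 else rate
}
```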
- case class Transport (messageGroupSize: Int, frameSize: String) extends Product with Serializable
  The messageGroupSize controls how many weights per batch will be sent. The frameSize is the maximum message size for inter-node communication.
trait
TypeAliases
extends AnyRef
For the sake of beauty.
- trait Update extends (core.Network.Matrix, core.Network.Matrix, Double, Int) ⇒ Unit
  Since: 09.09.17
- case class Waypoint (nth: Int, action: (Network.Weights) ⇒ Unit) extends Product with Serializable
  Performs action on every nth iteration during training.
- trait WaypointLogic extends AnyRef
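The every-nth-iteration behavior can be sketched in a few lines; `runWithWaypoint` is an illustrative stand-in for the training loop, not the library's internals:

```scala
// Run `iterations` steps; after every nth step, hand the step's result
// (e.g. the current weights) to `action`, as Waypoint does during training.
def runWithWaypoint[W](iterations: Int, nth: Int, step: Int => W)(action: W => Unit): Unit =
  (1 to iterations).foreach { i =>
    val weights = step(i)
    if (i % nth == 0) action(weights)
  }
```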
- trait WeightProvider extends (Seq[Layer]) ⇒ core.Network.Weights
  A WeightProvider connects the neurons of a Layer through the weights, or synapses.
  Annotations: @implicitNotFound( ... )
- trait Welcoming extends AnyRef
  Since: 09.07.16
Value Members
- object Activator extends Serializable
  Activator functions.
- object CNN extends BaseOps
- object Convolution extends Serializable
- object EarlyStoppingLogic
- object EndsWith
- object FFN extends BaseOps
- object IllusionBreaker
- object KeepBest extends Regularization with Product with Serializable
  The KeepBest regularization strategy takes the weights which led to the least error during training. In particular, this is useful for nets whose gradients oscillate heavily during training.
- object Network extends TypeAliases with Serializable
  Since: 03.01.16
- object RNN extends BaseOps
- object StartsWith
- object Vanilla extends Update with Product with Serializable