Class/Object

io.github.mandar2812.dynaml.optimization

BackPropagation

Related Docs: object BackPropagation | package optimization


class BackPropagation extends RegularizedOptimizer[FFNeuralGraph, DenseVector[Double], DenseVector[Double], Stream[(DenseVector[Double], DenseVector[Double])]]

Implementation of the standard back-propagation algorithm with momentum, using the "generalized delta rule".
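
A minimal usage sketch (not part of the generated documentation): the training data, the variable names, and the import path assumed for FFNeuralGraph are illustrative; only the BackPropagation members listed below are taken from this page.

    import breeze.linalg.DenseVector
    import io.github.mandar2812.dynaml.optimization.BackPropagation
    // Assumed import path for FFNeuralGraph; adjust if it differs in your DynaML version.
    import io.github.mandar2812.dynaml.models.neuralnets.FFNeuralGraph

    // Toy training data: a Stream of (input, target) pairs, as expected by optimize().
    val data: Stream[(DenseVector[Double], DenseVector[Double])] =
      Stream.fill(100)((DenseVector.rand(4), DenseVector.rand(1)))

    // A feed-forward network graph built elsewhere; its construction is not covered on this page.
    val net: FFNeuralGraph = ???

    val backprop = new BackPropagation()
      .setStepSize(0.05)          // initial step size; default 1.0, decays as stepSize/sqrt(t)
      .setMomentum(0.5)           // momentum term of the generalized delta rule
      .setNumIterations(200)      // number of iterations; default 100
      .setRegParam(0.001)         // regularization parameter; default 0.0
      .setMiniBatchFraction(1.0)  // default 1.0 corresponds to deterministic/classical gradient descent

    // optimize(nPoints, data, initial network) returns the trained FFNeuralGraph.
    val trainedNet: FFNeuralGraph = backprop.optimize(data.length.toLong, data, net)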

Linear Supertypes
RegularizedOptimizer[FFNeuralGraph, DenseVector[Double], DenseVector[Double], Stream[(DenseVector[Double], DenseVector[Double])]], Optimizer[FFNeuralGraph, DenseVector[Double], DenseVector[Double], Stream[(DenseVector[Double], DenseVector[Double])]], Serializable, Serializable, AnyRef, Any

Instance Constructors

  1. new BackPropagation()


Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  5. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  6. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  7. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  8. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  9. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  10. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  11. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  12. var miniBatchFraction: Double

    Attributes
    protected
    Definition Classes
    RegularizedOptimizer
  13. var momentum: Double

    Attributes
    protected
  14. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  15. final def notify(): Unit

    Definition Classes
    AnyRef
  16. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  17. var numIterations: Int

    Attributes
    protected
    Definition Classes
    RegularizedOptimizer
  18. def optimize(nPoints: Long, ParamOutEdges: Stream[(DenseVector[Double], DenseVector[Double])], initialP: FFNeuralGraph): FFNeuralGraph

    Solve the convex optimization problem.

    Definition Classes
    BackPropagation → Optimizer
  19. var regParam: Double

    Attributes
    protected
    Definition Classes
    RegularizedOptimizer
  20. def setMiniBatchFraction(fraction: Double): BackPropagation.this.type

    Set the fraction of data to be used for each SGD iteration. Default 1.0 (corresponding to deterministic/classical gradient descent).

    Definition Classes
    RegularizedOptimizer
  21. def setMomentum(m: Double): BackPropagation.this.type

    Set the momentum term used in the generalized delta rule (see the update-rule sketch after the member list).
  22. def setNumIterations(iters: Int): BackPropagation.this.type

    Set the number of iterations for SGD. Default 100.

    Definition Classes
    RegularizedOptimizer
  23. def setRegParam(regParam: Double): BackPropagation.this.type

    Set the regularization parameter. Default 0.0.

    Definition Classes
    RegularizedOptimizer
  24. def setSparsityWeight(s: Double): BackPropagation.this.type

  25. def setStepSize(step: Double): BackPropagation.this.type

    Set the initial step size of SGD for the first step. Default 1.0. In subsequent steps, the step size decreases as stepSize/sqrt(t).

    Definition Classes
    RegularizedOptimizer
  26. var sparsityWeight: Double

    Attributes
    protected
  27. var stepSize: Double

    Attributes
    protected
    Definition Classes
    RegularizedOptimizer
  28. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  29. def toString(): String

    Definition Classes
    AnyRef → Any
  30. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  31. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  32. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
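
The stepSize and momentum parameters govern the "generalized delta rule" update named in the class description: at each iteration the change applied to a weight is the negative gradient scaled by the step size, plus the momentum coefficient times the previous change. The sketch below is a textbook illustration of that rule for a single weight, not DynaML's internal implementation; the regParam and sparsityWeight contributions are omitted.

    // Illustrative generalized delta rule with momentum for one weight.
    // A sketch of what setStepSize and setMomentum control, not library code.
    def deltaRuleStep(
      w: Double,          // current weight value
      gradient: Double,   // dE/dw at the current iteration
      prevDelta: Double,  // weight change applied in the previous iteration
      stepSize: Double,   // learning rate, cf. setStepSize
      momentum: Double    // momentum coefficient, cf. setMomentum
    ): (Double, Double) = {
      val delta = -stepSize * gradient + momentum * prevDelta
      (w + delta, delta)  // updated weight, and the change to feed into the next step
    }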

Inherited from RegularizedOptimizer[FFNeuralGraph, DenseVector[Double], DenseVector[Double], Stream[(DenseVector[Double], DenseVector[Double])]]

Inherited from Optimizer[FFNeuralGraph, DenseVector[Double], DenseVector[Double], Stream[(DenseVector[Double], DenseVector[Double])]]

Inherited from Serializable

Inherited from Serializable

Inherited from AnyRef

Inherited from Any
