io.github.mandar2812.dynaml.optimization

class GradientDescent extends RegularizedOptimizer[DenseVector[Double], DenseVector[Double], Double, Stream[(DenseVector[Double], Double)]]

Related Docs: object GradientDescent | package optimization

Implements Gradient Descent on the graph generated by the model to calculate approximate optimal values of the model parameters.
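The iteration this class performs can be sketched in plain Scala. This is a minimal illustration of the procedure, not DynaML's implementation: `Array[Double]` stands in for breeze's `DenseVector`, the `Updater` is reduced to a bare parameter step with no regularization, and `miniBatchFraction` is fixed at 1.0 (classical deterministic gradient descent). The step-size decay `stepSize / sqrt(t)` matches the behaviour documented for `setStepSize` below.

```scala
object GDSketch {
  // Deterministic gradient descent on the squared-error loss of a
  // linear model y = w . x, with step size decaying as stepSize / sqrt(t).
  def run(data: Seq[(Array[Double], Double)],
          initialP: Array[Double],
          stepSize: Double = 1.0,
          numIterations: Int = 100): Array[Double] = {
    var w = initialP.clone()
    for (t <- 1 to numIterations) {
      // Average gradient of the loss over all points
      // (miniBatchFraction = 1.0, i.e. no stochastic sampling).
      val grad = Array.fill(w.length)(0.0)
      for ((x, y) <- data) {
        val err = w.zip(x).map { case (wi, xi) => wi * xi }.sum - y
        for (i <- w.indices) grad(i) += err * x(i) / data.size
      }
      // Decaying step, as documented for setStepSize: stepSize / sqrt(t)
      val eta = stepSize / math.sqrt(t)
      for (i <- w.indices) w(i) -= eta * grad(i)
    }
    w
  }

  def main(args: Array[String]): Unit = {
    // Three points generated by y = 2 * x; the fit should recover w ~ 2.0
    val data = Seq((Array(1.0), 2.0), (Array(2.0), 4.0), (Array(3.0), 6.0))
    val w = run(data, Array(0.0), stepSize = 0.2, numIterations = 200)
    println(w(0)) // close to 2.0
  }
}
```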

Linear Supertypes
RegularizedOptimizer[DenseVector[Double], DenseVector[Double], Double, Stream[(DenseVector[Double], Double)]], Optimizer[DenseVector[Double], DenseVector[Double], Double, Stream[(DenseVector[Double], Double)]], Serializable, Serializable, AnyRef, Any

Instance Constructors

  1. new GradientDescent(gradient: Gradient, updater: Updater)


Value Members

  1. final def !=(arg0: Any): Boolean
     Definition Classes: AnyRef → Any
  2. final def ##(): Int
     Definition Classes: AnyRef → Any
  3. final def ==(arg0: Any): Boolean
     Definition Classes: AnyRef → Any
  4. final def asInstanceOf[T0]: T0
     Definition Classes: Any
  5. def clone(): AnyRef
     Attributes: protected[java.lang]
     Definition Classes: AnyRef
     Annotations: @throws( ... )
  6. final def eq(arg0: AnyRef): Boolean
     Definition Classes: AnyRef
  7. def equals(arg0: Any): Boolean
     Definition Classes: AnyRef → Any
  8. def finalize(): Unit
     Attributes: protected[java.lang]
     Definition Classes: AnyRef
     Annotations: @throws( classOf[java.lang.Throwable] )
  9. final def getClass(): Class[_]
     Definition Classes: AnyRef → Any
  10. def hashCode(): Int
      Definition Classes: AnyRef → Any
  11. final def isInstanceOf[T0]: Boolean
      Definition Classes: Any
  12. var miniBatchFraction: Double
      Attributes: protected
      Definition Classes: RegularizedOptimizer
  13. final def ne(arg0: AnyRef): Boolean
      Definition Classes: AnyRef
  14. final def notify(): Unit
      Definition Classes: AnyRef
  15. final def notifyAll(): Unit
      Definition Classes: AnyRef
  16. var numIterations: Int
      Attributes: protected
      Definition Classes: RegularizedOptimizer
  17. def optimize(nPoints: Long, ParamOutEdges: Stream[(DenseVector[Double], Double)], initialP: DenseVector[Double]): DenseVector[Double]

      Find the optimum value of the parameters using Gradient Descent.

      nPoints: The number of data points.
      ParamOutEdges: A java.lang.Iterable object containing all of the out edges of the parameter node.
      initialP: The initial value of the parameters as a DenseVector.
      returns: The optimized value of the parameters as a DenseVector.

      Definition Classes: GradientDescent → Optimizer
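A sketch of invoking `optimize` on a small synthetic data stream. This assumes breeze and DynaML are on the classpath; `myGradient` and `myUpdater` are hypothetical placeholders for concrete Gradient and Updater instances, not identifiers from the library:

```scala
import breeze.linalg.DenseVector
import io.github.mandar2812.dynaml.optimization._

// Placeholders: substitute Gradient/Updater implementations
// appropriate for the model's loss and regularization.
val myGradient: Gradient = ???
val myUpdater: Updater = ???

// Synthetic data: (feature vector, target) pairs
val data: Stream[(DenseVector[Double], Double)] = Stream(
  (DenseVector(1.0, 0.0), 2.0),
  (DenseVector(0.0, 1.0), -1.0),
  (DenseVector(1.0, 1.0), 1.0))

val gd = new GradientDescent(myGradient, myUpdater)
val wOpt: DenseVector[Double] =
  gd.optimize(
    nPoints = data.length.toLong,
    ParamOutEdges = data,
    initialP = DenseVector.zeros[Double](2))
```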
  18. var regParam: Double
      Attributes: protected
      Definition Classes: RegularizedOptimizer
  19. def setGradient(gradient: Gradient): GradientDescent.this.type

      Set the gradient function (of the loss function of one single data example) to be used for SGD.

  20. def setMiniBatchFraction(fraction: Double): GradientDescent.this.type

      Set the fraction of data to be used for each SGD iteration. Default 1.0 (corresponding to deterministic/classical gradient descent).

      Definition Classes: RegularizedOptimizer

  21. def setNumIterations(iters: Int): GradientDescent.this.type

      Set the number of iterations for SGD. Default 100.

      Definition Classes: RegularizedOptimizer

  22. def setRegParam(regParam: Double): GradientDescent.this.type

      Set the regularization parameter. Default 0.0.

      Definition Classes: RegularizedOptimizer

  23. def setStepSize(step: Double): GradientDescent.this.type

      Set the initial step size of SGD for the first step. Default 1.0. In subsequent steps, the step size decreases as stepSize/sqrt(t).

      Definition Classes: RegularizedOptimizer

  24. def setUpdater(updater: Updater): GradientDescent.this.type

      Set the updater function that actually performs a gradient step in a given direction. The updater is also responsible for performing the update from the regularization term, and therefore determines what kind of regularization is used, if any.
  25. var stepSize: Double
      Attributes: protected
      Definition Classes: RegularizedOptimizer
  26. final def synchronized[T0](arg0: ⇒ T0): T0
      Definition Classes: AnyRef
  27. def toString(): String
      Definition Classes: AnyRef → Any
  28. final def wait(): Unit
      Definition Classes: AnyRef
      Annotations: @throws( ... )
  29. final def wait(arg0: Long, arg1: Int): Unit
      Definition Classes: AnyRef
      Annotations: @throws( ... )
  30. final def wait(arg0: Long): Unit
      Definition Classes: AnyRef
      Annotations: @throws( ... )

Inherited from RegularizedOptimizer[DenseVector[Double], DenseVector[Double], Double, Stream[(DenseVector[Double], Double)]]

Inherited from Optimizer[DenseVector[Double], DenseVector[Double], Double, Stream[(DenseVector[Double], Double)]]

Inherited from Serializable

Inherited from Serializable

Inherited from AnyRef

Inherited from Any
