Class breeze.optimize.FirstOrderMinimizer.OptParams

Related Doc: package FirstOrderMinimizer

case class OptParams(batchSize: Int = 512, regularization: Double = 0.0, alpha: Double = 0.5, maxIterations: Int = 1000, useL1: Boolean = false, tolerance: Double = 1E-5, useStochastic: Boolean = false, randomSeed: Int = 0) extends Product with Serializable

OptParams is a Configuration-compatible case class that can be used to select optimization routines at runtime.

Configurations:

  1. useStochastic=false, useL1=false: LBFGS with L2 regularization
  2. useStochastic=false, useL1=true: OWLQN with L1 regularization
  3. useStochastic=true, useL1=false: AdaptiveGradientDescent with L2 regularization
  4. useStochastic=true, useL1=true: AdaptiveGradientDescent with L1 regularization
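
For illustration (a sketch, not part of the generated docs), each configuration maps directly onto a case-class instantiation:

import breeze.optimize.FirstOrderMinimizer.OptParams

// Each flag combination selects a different optimizer at runtime.
val lbfgsL2 = OptParams(useStochastic = false, useL1 = false) // 1) LBFGS, L2
val owlqnL1 = OptParams(useStochastic = false, useL1 = true)  // 2) OWLQN, L1
val sgdL2   = OptParams(useStochastic = true, useL1 = false)  // 3) AdaptiveGradientDescent, L2
val sgdL1   = OptParams(useStochastic = true, useL1 = true)   // 4) AdaptiveGradientDescent, L1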

batchSize

size of the batches to use when useStochastic is set and the objective is a BatchDiffFunction.

regularization

regularization constant to use.

alpha

learning rate to use; applies only to SGD.

useL1

if true, use L1 regularization. Otherwise, use L2.

tolerance

convergence tolerance, looking at both average improvement and the norm of the gradient.

useStochastic

if false, use LBFGS or OWLQN. If true, use some variant of Stochastic Gradient Descent.
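
Since every field has a default, OptParams can be built with no arguments and adjusted by name or via the case-class copy method; a minimal sketch:

import breeze.optimize.FirstOrderMinimizer.OptParams

// All defaults: the LBFGS route (useStochastic = false, useL1 = false).
val defaults = OptParams()

// Override selected fields by name; the rest keep their defaults.
val tuned = OptParams(regularization = 1.0, maxIterations = 500, tolerance = 1e-6)

// copy produces a modified variant: switching on useL1 selects OWLQN.
val l1Variant = tuned.copy(useL1 = true)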

Linear Supertypes

Serializable, Serializable, Product, Equals, AnyRef, Any

Instance Constructors

  1. new OptParams(batchSize: Int = 512, regularization: Double = 0.0, alpha: Double = 0.5, maxIterations: Int = 1000, useL1: Boolean = false, tolerance: Double = 1E-5, useStochastic: Boolean = false, randomSeed: Int = 0)

    batchSize

size of the batches to use when useStochastic is set and the objective is a BatchDiffFunction.

    regularization

    regularization constant to use.

    alpha

learning rate to use; applies only to SGD.

    useL1

    if true, use L1 regularization. Otherwise, use L2.

    tolerance

    convergence tolerance, looking at both average improvement and the norm of the gradient.

    useStochastic

    if false, use LBFGS or OWLQN. If true, use some variant of Stochastic Gradient Descent.

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. val alpha: Double

    learning rate to use; applies only to SGD.

  5. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  6. val batchSize: Int

    size of the batches to use when useStochastic is set and the objective is a BatchDiffFunction.

  7. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  8. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  9. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  10. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  11. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  12. val maxIterations: Int

    maximum number of iterations to run.

  13. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  14. final def notify(): Unit

    Definition Classes
    AnyRef
  15. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  16. val randomSeed: Int

    seed for the random number generator used when sampling stochastic batches.

  17. val regularization: Double

    regularization constant to use.

  18. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  19. val tolerance: Double

    convergence tolerance, looking at both average improvement and the norm of the gradient.

  20. val useL1: Boolean

    if true, use L1 regularization. Otherwise, use L2.

  21. val useStochastic: Boolean

    if false, use LBFGS or OWLQN. If true, use some variant of Stochastic Gradient Descent.

  22. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  23. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  24. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )

Deprecated Value Members

  1. def iterations[T, K](f: DiffFunction[T], init: T)(implicit space: MutableEnumeratedCoordinateField[T, K, Double]): Iterator[State[T, Info, ApproximateInverseHessian[T]] forSome {val _948: LBFGS[T]}]

    Annotations
    @deprecated
    Deprecated

    (Since version 0.10) Use breeze.optimize.iterations(f, init, params) instead.

  2. def iterations[T](f: StochasticDiffFunction[T], init: T)(implicit space: MutableFiniteCoordinateField[T, _, Double]): Iterator[State[T, Info, FirstOrderMinimizer._949.type.History] forSome {val _949: FirstOrderMinimizer[T, StochasticDiffFunction[T]]}]

    Annotations
    @deprecated
    Deprecated

    (Since version 0.10) Use breeze.optimize.iterations(f, init, params) instead.

  3. def iterations[T](f: BatchDiffFunction[T], init: T)(implicit space: MutableFiniteCoordinateField[T, _, Double]): Iterator[State[T, Info, FirstOrderMinimizer._950.type.History] forSome {val _950: FirstOrderMinimizer[T, BatchDiffFunction[T]]}]

    Annotations
    @deprecated
    Deprecated

    (Since version 0.10) Use breeze.optimize.iterations(f, init, params) instead.

  4. def minimize[T](f: DiffFunction[T], init: T)(implicit space: MutableEnumeratedCoordinateField[T, _, Double]): T

    Annotations
    @deprecated
    Deprecated

    (Since version 0.10) Use breeze.optimize.minimize(f, init, params) instead.

  5. def minimize[T](f: BatchDiffFunction[T], init: T)(implicit space: MutableFiniteCoordinateField[T, _, Double]): T

    Annotations
    @deprecated
    Deprecated

    (Since version 0.10) Use breeze.optimize.minimize(f, init, params) instead.
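
The deprecation notices above all point at the package-level entry points. A sketch of the recommended replacement, assuming the implicit conversion from OptParams to OptimizationOption provided by breeze.optimize is in scope (the quadratic objective f below is illustrative only, not part of this class):

import breeze.linalg.DenseVector
import breeze.optimize.{minimize, DiffFunction}
import breeze.optimize.FirstOrderMinimizer.OptParams

// Illustrative objective: f(x) = ||x - 3||^2, with gradient 2(x - 3).
val f = new DiffFunction[DenseVector[Double]] {
  def calculate(x: DenseVector[Double]): (Double, DenseVector[Double]) = {
    val residual = x - 3.0
    (residual dot residual, residual * 2.0)
  }
}

val params = OptParams(maxIterations = 200, tolerance = 1e-6)

// Replacement for the deprecated OptParams.minimize:
val xmin = minimize(f, DenseVector.zeros[Double](5), params)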
