breeze.optimize.FirstOrderMinimizer

OptParams

case class OptParams(batchSize: Int = 512, regularization: Double = 0.0, alpha: Double = 0.5, maxIterations: Int = 1000, useL1: Boolean = false, tolerance: Double = 1.0E-5, useStochastic: Boolean = false, randomSeed: Int = 0) extends Product with Serializable

OptParams is a breeze.config.Configuration-compatible case class that can be used to select optimization routines at runtime.

Configurations:

  1. useStochastic=false, useL1=false: LBFGS with L2 regularization
  2. useStochastic=false, useL1=true: OWLQN with L1 regularization
  3. useStochastic=true, useL1=false: AdaptiveGradientDescent with L2 regularization
  4. useStochastic=true, useL1=true: AdaptiveGradientDescent with L1 regularization
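
For example, the default flags (useStochastic=false, useL1=false) select LBFGS with L2 regularization. A minimal sketch against a toy quadratic objective; the dimension, regularization strength, and iteration cap are illustrative choices, not library defaults:

    import breeze.linalg.DenseVector
    import breeze.optimize._

    // Toy objective: f(x) = ||x - 3||^2, with gradient 2 * (x - 3)
    val f = new DiffFunction[DenseVector[Double]] {
      def calculate(x: DenseVector[Double]): (Double, DenseVector[Double]) = {
        val residual = x - 3.0
        (residual dot residual, residual * 2.0)
      }
    }

    // useStochastic = false, useL1 = false: LBFGS with L2 regularization
    val params = FirstOrderMinimizer.OptParams(regularization = 0.1, maxIterations = 100)
    val xMin = minimize(f, DenseVector.zeros[Double](5), params)

Flipping useL1 = true in the same call would route to OWLQN instead, per configuration 2 above.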

batchSize

size of the minibatches to use when useStochastic is true and a BatchDiffFunction is supplied

regularization

regularization constant to use.

alpha

step size (learning rate); applies only to stochastic gradient descent.

useL1

if true, use L1 regularization. Otherwise, use L2.

tolerance

convergence tolerance, applied to both the average improvement and the norm of the gradient.

useStochastic

if false, use LBFGS or OWLQN; if true, use a variant of stochastic gradient descent.
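
Taken together with batchSize and alpha, the stochastic path looks like the sketch below. It assumes a toy least-squares objective written as a BatchDiffFunction (the dataset, batch size, and step size are made up for illustration) and uses the breeze.optimize.iterations entry point named in the deprecation notes further down to watch the tolerance check terminate the run:

    import breeze.linalg.DenseVector
    import breeze.optimize._

    // Toy dataset: fit a single scalar x(0) to 1000 random targets.
    val targets = DenseVector.rand[Double](1000)

    val batchF = new BatchDiffFunction[DenseVector[Double]] {
      def fullRange: IndexedSeq[Int] = 0 until targets.length
      def calculate(x: DenseVector[Double], batch: IndexedSeq[Int]): (Double, DenseVector[Double]) = {
        val residuals = batch.map(i => x(0) - targets(i))
        // loss = sum of squared residuals over the batch; gradient = 2 * sum of residuals
        (residuals.map(r => r * r).sum, DenseVector(2.0 * residuals.sum))
      }
    }

    // useStochastic = true, useL1 = true: AdaptiveGradientDescent with L1
    val sgdParams = FirstOrderMinimizer.OptParams(
      batchSize = 256,
      regularization = 1e-3,
      alpha = 0.1,
      useStochastic = true,
      useL1 = true)

    // Each State carries the iterate, its value, and the gradient; iteration
    // stops once the tolerance check fires or maxIterations is reached.
    iterations(batchF, DenseVector(0.0), sgdParams).foreach { s =>
      println(s"iter ${s.iter}: value = ${s.value}")
    }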

Linear Supertypes

Serializable, Serializable, Product, Equals, AnyRef, Any (Serializable appears twice: once for scala.Serializable and once for java.io.Serializable)

Instance Constructors

  1. new OptParams(batchSize: Int = 512, regularization: Double = 0.0, alpha: Double = 0.5, maxIterations: Int = 1000, useL1: Boolean = false, tolerance: Double = 1.0E-5, useStochastic: Boolean = false, randomSeed: Int = 0)

    batchSize

    size of the minibatches to use when useStochastic is true and a BatchDiffFunction is supplied

    regularization

    regularization constant to use.

    alpha

    step size (learning rate); applies only to stochastic gradient descent.

    useL1

    if true, use L1 regularization. Otherwise, use L2.

    tolerance

    convergence tolerance, applied to both the average improvement and the norm of the gradient.

    useStochastic

    if false, use LBFGS or OWLQN; if true, use a variant of stochastic gradient descent.

Value Members

  1. final def !=(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  2. final def !=(arg0: Any): Boolean

    Definition Classes
    Any
  3. final def ##(): Int

    Definition Classes
    AnyRef → Any
  4. final def ==(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  5. final def ==(arg0: Any): Boolean

    Definition Classes
    Any
  6. val alpha: Double

    rate of change to use, only applies to SGD.

  7. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  8. val batchSize: Int

    size of batches to use if useStochastic and you give a BatchDiffFunction

  9. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  10. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  11. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  12. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  13. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  14. val maxIterations: Int

    maximum number of iterations to run.

  15. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  16. final def notify(): Unit

    Definition Classes
    AnyRef
  17. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  18. val randomSeed: Int

    seed for the random number generator used by the stochastic optimizers.

  19. val regularization: Double

    regularization constant to use.

  20. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  21. val tolerance: Double

    convergence tolerance, looking at both average improvement and the norm of the gradient.

  22. val useL1: Boolean

    if true, use L1 regularization. Otherwise, use L2.

  23. val useStochastic: Boolean

    if false, use LBFGS or OWLQN; if true, use a variant of stochastic gradient descent.

  24. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  25. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  26. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )

Deprecated Value Members

  1. def iterations[T, K](f: DiffFunction[T], init: T)(implicit space: MutableEnumeratedCoordinateField[T, K, Double]): Iterator[State[T, Info, ApproximateInverseHessian[T]] forSome {val _948: LBFGS[T]}]

    Annotations
    @deprecated
    Deprecated

    (Since version 0.10) Use breeze.optimize.iterations(f, init, params) instead.

  2. def iterations[T](f: StochasticDiffFunction[T], init: T)(implicit space: MutableFiniteCoordinateField[T, _, Double]): Iterator[State[T, Info, FirstOrderMinimizer._949.type.History] forSome {val _949: FirstOrderMinimizer[T, StochasticDiffFunction[T]]}]

    Annotations
    @deprecated
    Deprecated

    (Since version 0.10) Use breeze.optimize.iterations(f, init, params) instead.

  3. def iterations[T](f: BatchDiffFunction[T], init: T)(implicit space: MutableFiniteCoordinateField[T, _, Double]): Iterator[State[T, Info, FirstOrderMinimizer._950.type.History] forSome {val _950: FirstOrderMinimizer[T, BatchDiffFunction[T]]}]

    Annotations
    @deprecated
    Deprecated

    (Since version 0.10) Use breeze.optimize.iterations(f, init, params) instead.

  4. def minimize[T](f: DiffFunction[T], init: T)(implicit space: MutableEnumeratedCoordinateField[T, _, Double]): T

    Annotations
    @deprecated
    Deprecated

    (Since version 0.10) Use breeze.optimize.minimize(f, init, params) instead.

  5. def minimize[T](f: BatchDiffFunction[T], init: T)(implicit space: MutableFiniteCoordinateField[T, _, Double]): T

    Annotations
    @deprecated
    Deprecated

    (Since version 0.10) Use breeze.optimize.minimize(f, init, params) instead.
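
All five deprecated members point to the same replacements: package-level functions that take the OptParams explicitly. A minimal before/after sketch, reusing f and params from the earlier example:

    val init = DenseVector.zeros[Double](5)

    // Deprecated since 0.10:
    // val xMin   = params.minimize(f, init)
    // val states = params.iterations(f, init)

    // Preferred package-level calls:
    val xMin   = breeze.optimize.minimize(f, init, params)
    val states = breeze.optimize.iterations(f, init, params)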
