Class ml.combust.mleap.core.ann.SoftmaxLayerModelWithCrossEntropyLoss


class SoftmaxLayerModelWithCrossEntropyLoss extends LayerModel with LossFunction

Linear Supertypes
LossFunction, LayerModel, Serializable, Serializable, AnyRef, Any
Ordering
  1. Alphabetic
  2. By Inheritance
Inherited
  1. SoftmaxLayerModelWithCrossEntropyLoss
  2. LossFunction
  3. LayerModel
  4. Serializable
  5. Serializable
  6. AnyRef
  7. Any
  1. Hide All
  2. Show All
Visibility
  1. Public
  2. All

Instance Constructors

  1. new SoftmaxLayerModelWithCrossEntropyLoss()


Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  5. def clone(): AnyRef
    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  6. def computePrevDelta(nextDelta: DenseMatrix[Double], input: DenseMatrix[Double], delta: DenseMatrix[Double]): Unit

    Computes the delta for back propagation. delta is allocated based on the size provided by the LayerModel implementation and the stack (batch) size; the implementation is responsible for checking the size of delta when writing to it. See the usage sketch following the member list.

    delta

    delta of this layer (written in place)

    Definition Classes
    SoftmaxLayerModelWithCrossEntropyLoss → LayerModel
  7. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  8. def equals(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  9. def eval(data: DenseMatrix[Double], output: DenseMatrix[Double]): Unit

    Evaluates the data (processes it through the layer). output is allocated based on the size provided by the LayerModel implementation and the stack (batch) size; the implementation is responsible for checking the size of output when writing to it. See the usage sketch following the member list.

    data

    input data

    output

    output (modified in place)

    Definition Classes
    SoftmaxLayerModelWithCrossEntropyLoss → LayerModel
  10. def finalize(): Unit
    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  11. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
  12. def grad(delta: DenseMatrix[Double], input: DenseMatrix[Double], cumGrad: DenseVector[Double]): Unit

    Computes the gradient. cumGrad is a view onto this layer's slice of the cumulative gradient vector; its size is based on the weightSize provided by the LayerModel implementation. See the usage sketch following the member list.

    delta

    delta for this layer

    input

    input data

    cumGrad

    cumulative gradient (modified in place)

    Definition Classes
    SoftmaxLayerModelWithCrossEntropyLoss → LayerModel
  13. def hashCode(): Int
    Definition Classes
    AnyRef → Any
  14. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  15. def loss(output: DenseMatrix[Double], target: DenseMatrix[Double], delta: DenseMatrix[Double]): Double

    Returns the value of the loss function. The loss is computed from target and output, and the error is written to delta in place; delta is allocated based on the outputSize of the model implementation. See the usage sketch following the member list.

    output

    actual output

    target

    target output

    delta

    delta (updated in place)

    returns

    loss

    Definition Classes
    SoftmaxLayerModelWithCrossEntropyLoss → LossFunction
  16. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  17. final def notify(): Unit
    Definition Classes
    AnyRef
  18. final def notifyAll(): Unit
    Definition Classes
    AnyRef
  19. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  20. def toString(): String
    Definition Classes
    AnyRef → Any
  21. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  22. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  23. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  24. val weights: DenseVector[Double]
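
Example Usage

The eval and loss members above make up the forward pass of this layer and its cross-entropy loss. The sketch below is a minimal illustration, not taken from the library's own documentation: it assumes that DenseMatrix and DenseVector are Breeze types (as elsewhere in this package) and that a batch is laid out with one column per example; the object name, sizes, and values are hypothetical.

    import breeze.linalg.DenseMatrix
    import ml.combust.mleap.core.ann.SoftmaxLayerModelWithCrossEntropyLoss

    object ForwardSketch extends App {
      val layer = new SoftmaxLayerModelWithCrossEntropyLoss()

      // Hypothetical batch: 3 classes, 2 examples (assumed one column per example).
      val data   = new DenseMatrix(3, 2, Array(1.0, 2.0, 0.5, 0.1, 0.3, 2.2))
      val output = DenseMatrix.zeros[Double](3, 2) // pre-allocated; eval writes into it

      layer.eval(data, output)

      // One-hot targets; loss writes the error into delta in place
      // and returns the loss value.
      val target = new DenseMatrix(3, 2, Array(0.0, 1.0, 0.0, 0.0, 0.0, 1.0))
      val delta  = DenseMatrix.zeros[Double](3, 2)
      val lossValue: Double = layer.loss(output, target, delta)

      println(s"loss = $lossValue")
    }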
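
The computePrevDelta and grad members form the corresponding backward pass. The following sketch, under the same assumptions, only illustrates the call shapes: buffer sizes follow the allocation rules described in the member documentation, and cumGrad is sized from this layer's weights member.

    import breeze.linalg.{DenseMatrix, DenseVector}
    import ml.combust.mleap.core.ann.SoftmaxLayerModelWithCrossEntropyLoss

    object BackwardSketch extends App {
      val layer = new SoftmaxLayerModelWithCrossEntropyLoss()

      // Buffers for a hypothetical 3-class batch of 2 examples.
      val input     = new DenseMatrix(3, 2, Array(0.1, 2.0, -0.3, 1.5, 0.2, 0.7))
      val nextDelta = DenseMatrix.zeros[Double](3, 2) // error arriving from the layer above
      val delta     = DenseMatrix.zeros[Double](3, 2) // this layer's delta, written in place

      layer.computePrevDelta(nextDelta, input, delta)

      // Accumulate this layer's gradient; cumGrad is sized to match the layer's
      // weight vector, which may be empty for a parameter-free softmax layer.
      val cumGrad = DenseVector.zeros[Double](layer.weights.length)
      layer.grad(delta, input, cumGrad)
    }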

Inherited from LossFunction

Inherited from LayerModel

Inherited from Serializable

Inherited from Serializable

Inherited from AnyRef

Inherited from Any
