Package org.nd4j.linalg.lossfunctions
Interface ILossFunction
All Superinterfaces:
    Serializable

All Known Implementing Classes:
    LossBinaryXENT, LossCosineProximity, LossFMeasure, LossHinge, LossKLD, LossL1, LossL2, LossMAE, LossMAPE, LossMCXENT, LossMixtureDensity, LossMSE, LossMSLE, LossMultiLabel, LossNegativeLogLikelihood, LossPoisson, LossSparseMCXENT, LossSquaredHinge, LossWasserstein, SameDiffLoss
public interface ILossFunction extends Serializable
Method Summary

INDArray computeGradient(INDArray labels, INDArray preOutput, IActivation activationFn, INDArray mask)
    Compute the gradient of the loss function with respect to the inputs: dL/dOutput

Pair<Double,INDArray> computeGradientAndScore(INDArray labels, INDArray preOutput, IActivation activationFn, INDArray mask, boolean average)
    Compute both the score (loss function value) and gradient.

double computeScore(INDArray labels, INDArray preOutput, IActivation activationFn, INDArray mask, boolean average)
    Compute the score (loss function value) for the given inputs.

INDArray computeScoreArray(INDArray labels, INDArray preOutput, IActivation activationFn, INDArray mask)
    Compute the score (loss function value) for each example individually.

String name()
    The opName of this function
Method Detail

computeScore

double computeScore(INDArray labels, INDArray preOutput, IActivation activationFn, INDArray mask, boolean average)

Compute the score (loss function value) for the given inputs.

Parameters:
    labels - Labels/expected output
    preOutput - Output of the model (neural network)
    activationFn - Activation function that should be applied to preOutput
    mask - Mask array; may be null
    average - Whether the score should be averaged (divided by number of rows in labels/preOutput) or not
Returns:
    Loss function value
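To make the score and averaging semantics concrete, here is a minimal plain-Java sketch of what a computeScore implementation does for a squared-error loss. The class and method shapes are illustrative only, not ND4J code: the real interface operates on INDArrays and applies activationFn to preOutput before comparing against labels.

```java
/** Hypothetical sketch of computeScore semantics for a squared-error loss. */
public class ScoreSketch {

    // Sums (label - output)^2 over all examples and outputs; when average is
    // true, divides by the number of rows (examples), as the Javadoc describes.
    static double computeScore(double[][] labels, double[][] output, boolean average) {
        double sum = 0.0;
        for (int i = 0; i < labels.length; i++) {
            for (int j = 0; j < labels[i].length; j++) {
                double d = labels[i][j] - output[i][j];
                sum += d * d;
            }
        }
        return average ? sum / labels.length : sum;
    }

    public static void main(String[] args) {
        double[][] labels = {{1.0, 0.0}, {0.0, 1.0}};
        double[][] output = {{0.5, 0.5}, {0.5, 0.5}};
        // Each example contributes 0.5; averaged over 2 rows -> 0.5
        System.out.println(computeScore(labels, output, true)); // 0.5
    }
}
```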
computeScoreArray

INDArray computeScoreArray(INDArray labels, INDArray preOutput, IActivation activationFn, INDArray mask)

Compute the score (loss function value) for each example individually. For input of shape [numExamples, nOut], returns the scores as a column vector of shape [numExamples, 1].

Parameters:
    labels - Labels/expected output
    preOutput - Output of the model (neural network)
    activationFn - Activation function that should be applied to preOutput
    mask - Mask array; may be null
Returns:
    Loss function value for each example; column vector
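A hypothetical sketch of the per-example variant, again for squared error on plain arrays: row i of the inputs yields element i of the returned score vector (the real method returns a [numExamples, 1] INDArray rather than a double[]).

```java
/** Hypothetical sketch of computeScoreArray semantics for a squared-error loss. */
public class ScoreArraySketch {

    // One score per example: row i of labels/output produces scores[i].
    static double[] computeScoreArray(double[][] labels, double[][] output) {
        double[] scores = new double[labels.length];
        for (int i = 0; i < labels.length; i++) {
            double s = 0.0;
            for (int j = 0; j < labels[i].length; j++) {
                double d = labels[i][j] - output[i][j];
                s += d * d; // squared error summed over this example's outputs
            }
            scores[i] = s;
        }
        return scores;
    }

    public static void main(String[] args) {
        double[] s = computeScoreArray(new double[][]{{1.0, 0.0}},
                                       new double[][]{{0.0, 0.0}});
        System.out.println(s[0]); // 1.0
    }
}
```

Note that averaging this per-example vector over its rows reproduces the value computeScore would return with average = true.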
computeGradient

INDArray computeGradient(INDArray labels, INDArray preOutput, IActivation activationFn, INDArray mask)

Compute the gradient of the loss function with respect to the inputs: dL/dOutput

Parameters:
    labels - Labels/expected output
    preOutput - Output of the model (neural network), before the activation function is applied
    activationFn - Activation function that should be applied to preOutput
    mask - Mask array; may be null
Returns:
    Gradient dL/dPreOut
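For squared error with an identity activation, dL/dPreOut reduces to 2 * (preOutput - labels). A minimal sketch under that assumption (names illustrative; a real implementation would also chain the loss gradient through the activation function's derivative when the activation is not the identity):

```java
/** Hypothetical sketch of computeGradient for squared error, identity activation. */
public class GradientSketch {

    // For L = sum (preOutput - label)^2 with identity activation:
    // dL/dPreOut = 2 * (preOutput - label), element-wise.
    static double[][] computeGradient(double[][] labels, double[][] preOutput) {
        double[][] grad = new double[labels.length][];
        for (int i = 0; i < labels.length; i++) {
            grad[i] = new double[labels[i].length];
            for (int j = 0; j < labels[i].length; j++) {
                grad[i][j] = 2.0 * (preOutput[i][j] - labels[i][j]);
            }
        }
        return grad;
    }

    public static void main(String[] args) {
        double[][] g = computeGradient(new double[][]{{1.0}},
                                       new double[][]{{0.25}});
        System.out.println(g[0][0]); // -1.5
    }
}
```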
computeGradientAndScore

Pair<Double,INDArray> computeGradientAndScore(INDArray labels, INDArray preOutput, IActivation activationFn, INDArray mask, boolean average)

Compute both the score (loss function value) and gradient. This is equivalent to calling computeScore(INDArray, INDArray, IActivation, INDArray, boolean) and computeGradient(INDArray, INDArray, IActivation, INDArray) individually.

Parameters:
    labels - Labels/expected output
    preOutput - Output of the model (neural network)
    activationFn - Activation function that should be applied to preOutput
    mask - Mask array; may be null
    average - Whether the score should be averaged (divided by number of rows in labels/output) or not
Returns:
    The score (loss function value) and gradient
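Computing both quantities in one call lets an implementation traverse the data once instead of twice. A sketch of that single-pass idea for the same squared-error example, using java.util's Map.Entry as a stand-in for ND4J's Pair (all names here are illustrative, not the library's):

```java
import java.util.AbstractMap.SimpleEntry;
import java.util.Map;

/** Hypothetical single-pass sketch of computeGradientAndScore semantics. */
public class GradientAndScoreSketch {

    // Accumulates the squared-error score and its gradient together,
    // assuming an identity activation, then averages the score over rows
    // when requested, per the Javadoc above.
    static Map.Entry<Double, double[][]> computeGradientAndScore(
            double[][] labels, double[][] preOutput, boolean average) {
        double score = 0.0;
        double[][] grad = new double[labels.length][];
        for (int i = 0; i < labels.length; i++) {
            grad[i] = new double[labels[i].length];
            for (int j = 0; j < labels[i].length; j++) {
                double d = preOutput[i][j] - labels[i][j];
                score += d * d;       // accumulate the score
                grad[i][j] = 2.0 * d; // gradient of the same term
            }
        }
        if (average) {
            score /= labels.length;
        }
        return new SimpleEntry<>(score, grad);
    }

    public static void main(String[] args) {
        Map.Entry<Double, double[][]> r = computeGradientAndScore(
                new double[][]{{1.0, 0.0}}, new double[][]{{0.5, 0.5}}, true);
        System.out.println(r.getKey()); // 0.5
    }
}
```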
name

String name()

The opName of this function.