Package org.nd4j.linalg.lossfunctions
Class SameDiffLoss
- java.lang.Object
  - org.nd4j.linalg.lossfunctions.SameDiffLoss
- All Implemented Interfaces:
  Serializable, ILossFunction

public abstract class SameDiffLoss extends Object implements ILossFunction

- See Also:
  Serialized Form
Field Summary

- protected SDVariable scorePerExampleVariable
- protected SameDiff sd
Constructor Summary

- protected SameDiffLoss()
Method Summary

- INDArray computeGradient(INDArray labels, INDArray preOutput, IActivation activationFn, INDArray mask)
  Compute the gradient of the loss function with respect to the inputs: dL/dOutput
- Pair<Double,INDArray> computeGradientAndScore(INDArray labels, INDArray preOutput, IActivation activationFn, INDArray mask, boolean average)
  Compute both the score (loss function value) and gradient.
- double computeScore(INDArray labels, INDArray preOutput, IActivation activationFn, INDArray mask, boolean average)
  Compute the score (loss function value) for the given inputs.
- INDArray computeScoreArray(INDArray labels, INDArray preOutput, IActivation activationFn, INDArray mask)
  Compute the score (loss function value) for each example individually.
- protected void createSameDiffInstance(DataType dataType)
- abstract SDVariable defineLoss(SameDiff sd, SDVariable layerInput, SDVariable labels)
  Define the loss function.
  NOTE: The score is on a *per example* basis - it should return an SDVariable with shape [minibatch], where out[i] is the score for the ith example.
- String name()
  The opName of this function
Field Detail

sd
protected transient SameDiff sd

scorePerExampleVariable
protected transient SDVariable scorePerExampleVariable
Method Detail

defineLoss
public abstract SDVariable defineLoss(SameDiff sd, SDVariable layerInput, SDVariable labels)
Define the loss function.
NOTE: The score is on a *per example* basis - it should return an SDVariable with shape [minibatch], where out[i] is the score for the ith example.
- Parameters:
  sd - SameDiff instance to define the loss on
  layerInput - Input to the SameDiff loss function
  labels - Labels placeholder
- Returns:
  The score on a per example basis (SDVariable with shape [minibatch])
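As a sketch of how this contract is typically satisfied (the class name and loss choice below are illustrative, not taken from this page), a concrete subclass only needs to implement defineLoss using SameDiff ops. Here a squared-error loss is defined, reducing over the feature dimension so the result has the required [minibatch] shape:

```java
import org.nd4j.autodiff.samediff.SDVariable;
import org.nd4j.autodiff.samediff.SameDiff;
import org.nd4j.linalg.lossfunctions.SameDiffLoss;

// Illustrative (hypothetical) subclass: a squared-error loss defined via SameDiff ops.
public class CustomSquaredLoss extends SameDiffLoss {

    @Override
    public SDVariable defineLoss(SameDiff sd, SDVariable layerInput, SDVariable labels) {
        // Per-example score: mean over the feature dimension (dimension 1) of
        // (labels - layerInput)^2, yielding an SDVariable of shape [minibatch].
        SDVariable diff = labels.sub(layerInput);
        return diff.mul(diff).mean(1);
    }
}
```

The base class then derives computeScore, computeScoreArray, and computeGradient from this single graph definition, so no manual gradient code is required.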
createSameDiffInstance
protected void createSameDiffInstance(DataType dataType)
computeScore
public double computeScore(INDArray labels, INDArray preOutput, IActivation activationFn, INDArray mask, boolean average)
Compute the score (loss function value) for the given inputs.
- Specified by:
  computeScore in interface ILossFunction
- Parameters:
  labels - Labels/expected output
  preOutput - Output of the model (neural network)
  activationFn - Activation function that should be applied to preOutput
  mask - Mask array; may be null
  average - Whether the score should be averaged (divided by number of rows in labels/preOutput) or not
- Returns:
  Loss function value
computeScoreArray
public INDArray computeScoreArray(INDArray labels, INDArray preOutput, IActivation activationFn, INDArray mask)
Compute the score (loss function value) for each example individually. For input [numExamples,nOut] returns scores as a column vector: [numExamples,1]
- Specified by:
  computeScoreArray in interface ILossFunction
- Parameters:
  labels - Labels/expected output
  preOutput - Output of the model (neural network)
  activationFn - Activation function that should be applied to preOutput
  mask - Mask array; may be null
- Returns:
  Loss function value for each example; column vector
computeGradient
public INDArray computeGradient(INDArray labels, INDArray preOutput, IActivation activationFn, INDArray mask)
Compute the gradient of the loss function with respect to the inputs: dL/dOutput
- Specified by:
  computeGradient in interface ILossFunction
- Parameters:
  labels - Label/expected output
  preOutput - Output of the model (neural network), before the activation function is applied
  activationFn - Activation function that should be applied to preOutput
  mask - Mask array; may be null
- Returns:
  Gradient dL/dPreOut
computeGradientAndScore
public Pair<Double,INDArray> computeGradientAndScore(INDArray labels, INDArray preOutput, IActivation activationFn, INDArray mask, boolean average)
Compute both the score (loss function value) and gradient. This is equivalent to calling computeScore(INDArray, INDArray, IActivation, INDArray, boolean) and computeGradient(INDArray, INDArray, IActivation, INDArray) individually.
- Specified by:
  computeGradientAndScore in interface ILossFunction
- Parameters:
  labels - Label/expected output
  preOutput - Output of the model (neural network)
  activationFn - Activation function that should be applied to preOutput
  mask - Mask array; may be null
  average - Whether the score should be averaged (divided by number of rows in labels/output) or not
- Returns:
  The score (loss function value) and gradient
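A hedged usage sketch of this method follows. It assumes some concrete SameDiffLoss subclass, here called CustomSquaredLoss (a hypothetical name, not from this page); note also that the Pair import path has moved between ND4J versions (org.nd4j.linalg.primitives.Pair in older releases):

```java
import org.nd4j.common.primitives.Pair;
import org.nd4j.linalg.activations.impl.ActivationIdentity;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

public class LossDemo {
    public static void main(String[] args) {
        INDArray labels = Nd4j.rand(8, 4);      // 8 examples, 4 outputs each
        INDArray preOutput = Nd4j.rand(8, 4);   // raw network output, pre-activation

        // CustomSquaredLoss is a placeholder for any SameDiffLoss subclass.
        CustomSquaredLoss loss = new CustomSquaredLoss();

        // One call computes both the (averaged) score and the gradient dL/dPreOut:
        Pair<Double, INDArray> result = loss.computeGradientAndScore(
                labels, preOutput, new ActivationIdentity(), /*mask*/ null, /*average*/ true);

        double score = result.getFirst();
        INDArray gradient = result.getSecond();  // same shape as preOutput: [8, 4]
        System.out.println("score = " + score
                + ", grad shape = " + java.util.Arrays.toString(gradient.shape()));
    }
}
```

When score and gradient are both needed (as in a training step), this single call avoids evaluating the loss graph twice.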
name
public String name()
Description copied from interface: ILossFunction
The opName of this function
- Specified by:
  name in interface ILossFunction