Class LossBinaryXENT
- java.lang.Object
- org.nd4j.linalg.lossfunctions.impl.LossBinaryXENT
- All Implemented Interfaces: Serializable, ILossFunction
public class LossBinaryXENT extends Object implements ILossFunction
- See Also: Serialized Form
Field Summary
- static double DEFAULT_CLIPPING_EPSILON
-
Constructor Summary
- LossBinaryXENT()
- LossBinaryXENT(double clipEps)
  Binary cross entropy where each output is (optionally) weighted/scaled by a fixed scalar value.
- LossBinaryXENT(double clipEps, INDArray weights)
  Binary cross entropy where each output is (optionally) weighted/scaled by a fixed scalar value.
- LossBinaryXENT(INDArray weights)
  Binary cross entropy where each output is (optionally) weighted/scaled by a fixed scalar value.
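All of the constructors above configure the same underlying loss. As a rough illustration of the quantity involved, the sketch below assumes the standard weighted binary cross-entropy formula with the probability clipping described for clipEps; it is not the library's internal code, and the class and method names here are hypothetical.

```java
// Illustrative sketch only: the per-unit weighted binary cross entropy this
// class is assumed to compute, with probability clipping as described for
// clipEps. Not the library's internal implementation.
public class BinaryXentElement {

    /** Clip a probability into [eps, 1 - eps] so the logs below stay finite. */
    public static double clip(double p, double eps) {
        return Math.max(eps, Math.min(1.0 - eps, p));
    }

    /** Weighted binary cross entropy for one (label, probability) pair. */
    public static double loss(double label, double prob, double weight, double eps) {
        double p = clip(prob, eps);
        return -weight * (label * Math.log(p) + (1.0 - label) * Math.log(1.0 - p));
    }

    public static void main(String[] args) {
        // A weight of 1.0 reproduces the unweighted loss, matching the note
        // that a weight vector of 1s is equivalent to no weight vector.
        System.out.println(loss(1.0, 0.9, 1.0, 1e-5));
        System.out.println(loss(1.0, 0.9, 2.0, 1e-5));
    }
}
```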
-
Method Summary
- INDArray computeGradient(INDArray labels, INDArray preOutput, IActivation activationFn, INDArray mask)
  Compute the gradient of the loss function with respect to the inputs: dL/dOutput
- Pair<Double,INDArray> computeGradientAndScore(INDArray labels, INDArray preOutput, IActivation activationFn, INDArray mask, boolean average)
  Compute both the score (loss function value) and gradient.
- double computeScore(INDArray labels, INDArray preOutput, IActivation activationFn, INDArray mask, boolean average)
  Compute the score (loss function value) for the given inputs.
- INDArray computeScoreArray(INDArray labels, INDArray preOutput, IActivation activationFn, INDArray mask)
  Compute the score (loss function value) for each example individually.
- String name()
  The opName of this function
- String toString()
-
-
-
Field Detail
-
DEFAULT_CLIPPING_EPSILON
public static final double DEFAULT_CLIPPING_EPSILON
- See Also: Constant Field Values
Constructor Detail
-
LossBinaryXENT
public LossBinaryXENT()
-
LossBinaryXENT
public LossBinaryXENT(INDArray weights)
Binary cross entropy where each output is (optionally) weighted/scaled by a fixed scalar value. Note that the weights array must be a row vector, of length equal to the labels/output dimension 1 size. A weight vector of 1s should give identical results to no weight vector.
- Parameters:
  - weights - Weights array (row vector). May be null.
-
LossBinaryXENT
public LossBinaryXENT(double clipEps)
Binary cross entropy where each output is (optionally) weighted/scaled by a fixed scalar value. Note that the weights array must be a row vector, of length equal to the labels/output dimension 1 size. A weight vector of 1s should give identical results to no weight vector.
- Parameters:
  - clipEps - Epsilon value for clipping. Probabilities are clipped into the range [eps, 1-eps]. Default eps: 1e-5
-
LossBinaryXENT
public LossBinaryXENT(double clipEps, INDArray weights)
Binary cross entropy where each output is (optionally) weighted/scaled by a fixed scalar value. Note that the weights array must be a row vector, of length equal to the labels/output dimension 1 size. A weight vector of 1s should give identical results to no weight vector.
- Parameters:
  - clipEps - Epsilon value for clipping. Probabilities are clipped into the range [eps, 1-eps]. Default eps: 1e-5
  - weights - Weights array (row vector). May be null.
-
-
Method Detail
-
computeScore
public double computeScore(INDArray labels, INDArray preOutput, IActivation activationFn, INDArray mask, boolean average)
Description copied from interface: ILossFunction
Compute the score (loss function value) for the given inputs.
- Specified by: computeScore in interface ILossFunction
- Parameters:
  - labels - Label/expected output
  - preOutput - Output of the model (neural network)
  - activationFn - Activation function that should be applied to preOutput
  - mask - Mask array; may be null
  - average - Whether the score should be averaged (divided by number of rows in labels/preOutput) or not
- Returns:
  - Loss function value
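A minimal plain-Java sketch of this contract (assumed semantics, not the ND4J implementation): per-example losses are summed over output units, optionally scaled by a per-example mask, and either summed or averaged over examples. It assumes probs already holds activated outputs, whereas the real method applies activationFn to preOutput first. All names here are hypothetical.

```java
// Sketch of the computeScore contract on plain arrays; illustrative only.
public class ScoreSketch {

    static final double EPS = 1e-5; // clipping epsilon, as in clipEps

    /** Per-unit binary cross entropy with probability clipping. */
    public static double elementLoss(double y, double p) {
        double q = Math.max(EPS, Math.min(1.0 - EPS, p));
        return -(y * Math.log(q) + (1.0 - y) * Math.log(1.0 - q));
    }

    /**
     * labels, probs: [numExamples][nOut]; mask: per-example 0/1 weights or null.
     * Returns the total loss, or the mean over examples if average is true.
     */
    public static double computeScore(double[][] labels, double[][] probs,
                                      double[] mask, boolean average) {
        double total = 0.0;
        for (int i = 0; i < labels.length; i++) {
            double example = 0.0;
            for (int j = 0; j < labels[i].length; j++) {
                example += elementLoss(labels[i][j], probs[i][j]);
            }
            total += (mask == null ? example : example * mask[i]);
        }
        return average ? total / labels.length : total;
    }

    public static void main(String[] args) {
        double[][] y = {{1.0}, {0.0}};
        double[][] p = {{0.5}, {0.5}};
        System.out.println(computeScore(y, p, null, false)); // summed
        System.out.println(computeScore(y, p, null, true));  // averaged
    }
}
```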
-
computeScoreArray
public INDArray computeScoreArray(INDArray labels, INDArray preOutput, IActivation activationFn, INDArray mask)
Description copied from interface: ILossFunction
Compute the score (loss function value) for each example individually. For input [numExamples,nOut] returns scores as a column vector: [numExamples,1]
- Specified by: computeScoreArray in interface ILossFunction
- Parameters:
  - labels - Labels/expected output
  - preOutput - Output of the model (neural network)
  - activationFn - Activation function that should be applied to preOutput
  - mask - Mask array; may be null
- Returns:
  - Loss function value for each example; column vector
-
computeGradient
public INDArray computeGradient(INDArray labels, INDArray preOutput, IActivation activationFn, INDArray mask)
Description copied from interface: ILossFunction
Compute the gradient of the loss function with respect to the inputs: dL/dOutput
- Specified by: computeGradient in interface ILossFunction
- Parameters:
  - labels - Label/expected output
  - preOutput - Output of the model (neural network), before the activation function is applied
  - activationFn - Activation function that should be applied to preOutput
  - mask - Mask array; may be null
- Returns:
  - Gradient dL/dPreOut
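Note that the returned gradient is with respect to the pre-activation output. For a sigmoid activation, chaining binary cross entropy through the activation gives the well-known simplification dL/dz = p - y, where p = sigmoid(z). The sketch below is an illustration of that identity (not the library's code), checked against a finite difference.

```java
// Illustration only: for sigmoid + binary cross entropy, the gradient with
// respect to the pre-activation output z collapses to sigmoid(z) - y.
public class GradientSketch {

    public static double sigmoid(double z) {
        return 1.0 / (1.0 + Math.exp(-z));
    }

    /** Loss for one unit as a function of the pre-activation z. */
    public static double loss(double y, double z) {
        double p = sigmoid(z);
        return -(y * Math.log(p) + (1.0 - y) * Math.log(1.0 - p));
    }

    /** Analytic dL/dz: sigmoid(z) - y. */
    public static double gradient(double y, double z) {
        return sigmoid(z) - y;
    }

    public static void main(String[] args) {
        // Sanity-check the analytic form against a central finite difference.
        double y = 1.0, z = 0.3, h = 1e-6;
        double numeric = (loss(y, z + h) - loss(y, z - h)) / (2 * h);
        System.out.println(gradient(y, z) + " vs " + numeric);
    }
}
```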
-
computeGradientAndScore
public Pair<Double,INDArray> computeGradientAndScore(INDArray labels, INDArray preOutput, IActivation activationFn, INDArray mask, boolean average)
Description copied from interface: ILossFunction
Compute both the score (loss function value) and gradient. This is equivalent to calling ILossFunction.computeScore(INDArray, INDArray, IActivation, INDArray, boolean) and ILossFunction.computeGradient(INDArray, INDArray, IActivation, INDArray) individually.
- Specified by: computeGradientAndScore in interface ILossFunction
- Parameters:
  - labels - Label/expected output
  - preOutput - Output of the model (neural network)
  - activationFn - Activation function that should be applied to preOutput
  - mask - Mask array; may be null
  - average - Whether the score should be averaged (divided by number of rows in labels/output) or not
- Returns:
  - The score (loss function value) and gradient
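A tiny sketch of this combined contract for a single sigmoid unit (illustrative only, with java.util Map.Entry standing in for ND4J's Pair type): one call yields both the loss value and dL/dPreOut, equivalent to computing them with the two methods above separately.

```java
import java.util.AbstractMap.SimpleEntry;
import java.util.Map;

// Sketch only: score and gradient for one sigmoid unit in a single call.
public class GradientAndScoreSketch {

    public static Map.Entry<Double, Double> gradientAndScore(double y, double z) {
        double p = 1.0 / (1.0 + Math.exp(-z));                       // sigmoid
        double score = -(y * Math.log(p) + (1.0 - y) * Math.log(1.0 - p));
        double grad = p - y;        // dL/dz for sigmoid + binary cross entropy
        return new SimpleEntry<>(score, grad);
    }

    public static void main(String[] args) {
        Map.Entry<Double, Double> r = gradientAndScore(1.0, 0.0);
        System.out.println("score=" + r.getKey() + " grad=" + r.getValue());
    }
}
```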
-
name
public String name()
The opName of this function
- Specified by: name in interface ILossFunction
- Returns:
  - The opName of this function