public class CenterLossOutputLayer extends BaseOutputLayer<CenterLossOutputLayer>

Nested classes/interfaces inherited from interface Layer:
Layer.TrainingMode, Layer.Type

Fields inherited from class BaseOutputLayer:
inputMaskArray, inputMaskArrayState, labels

Fields inherited from class BaseLayer:
gradient, gradientsFlattened, gradientViews, optimizer, params, paramsFlattened, score, weightNoiseParams

Fields inherited from class AbstractLayer:
cacheMode, conf, dropoutApplied, dropoutMask, epochCount, index, input, iterationCount, maskArray, maskState, preOutput, trainingListeners
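The score computed by this layer combines the ordinary classification loss with a center-loss penalty, (lambda / 2) * sum_i ||x_i - c_{y_i}||^2, that pulls each example's feature vector x_i toward a running center c_{y_i} for its class. A minimal, library-free sketch of that penalty term (all names here are illustrative and not part of the DL4J API):

```java
// Library-free sketch of the center-loss penalty that a center-loss output
// layer adds on top of the usual classification loss:
//   L = L_classification + (lambda / 2) * sum_i || x_i - c_{y_i} ||^2
// Class and method names here are illustrative, not DL4J API.
public class CenterLossSketch {

    /** Squared Euclidean distance between a feature vector and its class center. */
    static double squaredDistance(double[] x, double[] center) {
        double sum = 0.0;
        for (int j = 0; j < x.length; j++) {
            double d = x[j] - center[j];
            sum += d * d;
        }
        return sum;
    }

    /**
     * Center-loss penalty for a mini-batch: (lambda / 2) * sum_i ||x_i - c_{y_i}||^2.
     * features[i] is the embedding of example i, labels[i] its class index,
     * centers[k] the running center of class k.
     */
    static double penalty(double[][] features, int[] labels, double[][] centers, double lambda) {
        double sum = 0.0;
        for (int i = 0; i < features.length; i++) {
            sum += squaredDistance(features[i], centers[labels[i]]);
        }
        return 0.5 * lambda * sum;
    }

    public static void main(String[] args) {
        double[][] features = {{1.0, 0.0}, {0.0, 2.0}};
        int[] labels = {0, 1};
        double[][] centers = {{0.0, 0.0}, {0.0, 0.0}};
        // Example 0 contributes ||(1,0)||^2 = 1, example 1 contributes ||(0,2)||^2 = 4,
        // so with lambda = 0.1 the penalty is 0.5 * 0.1 * 5 = 0.25.
        System.out.println(penalty(features, labels, centers, 0.1)); // prints 0.25
    }
}
```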
| Constructor and Description |
|---|
| CenterLossOutputLayer(NeuralNetConfiguration conf) |
| CenterLossOutputLayer(NeuralNetConfiguration conf, org.nd4j.linalg.api.ndarray.INDArray input) |
| Modifier and Type | Method and Description |
|---|---|
| org.nd4j.linalg.primitives.Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> | backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon, LayerWorkspaceMgr workspaceMgr) — Calculate the gradient relative to the error in the next layer |
| void | computeGradientAndScore(LayerWorkspaceMgr workspaceMgr) — Update the score |
| double | computeScore(double fullNetworkL1, double fullNetworkL2, boolean training, LayerWorkspaceMgr workspaceMgr) — Compute the score after labels and input have been set |
| org.nd4j.linalg.api.ndarray.INDArray | computeScoreForExamples(double fullNetworkL1, double fullNetworkL2, LayerWorkspaceMgr workspaceMgr) — Compute the score for each example individually, after labels and input have been set |
| protected org.nd4j.linalg.api.ndarray.INDArray | getLabels2d(LayerWorkspaceMgr workspaceMgr, ArrayType arrayType) |
| Gradient | gradient() — Gets the gradient from one training iteration |
| org.nd4j.linalg.primitives.Pair<Gradient,Double> | gradientAndScore() — Get the gradient and score |
| protected void | setScoreWithZ(org.nd4j.linalg.api.ndarray.INDArray z) |
Methods inherited from class BaseOutputLayer:
activate, applyMask, clear, f1Score, f1Score, fit, fit, fit, fit, fit, getLabels, hasBias, isPretrainLayer, labelProbabilities, needsLabels, numLabels, predict, predict, preOutput2d, setLabels

Methods inherited from class BaseLayer:
accumulateScore, activate, calcL1, calcL2, clearNoiseWeightParams, clone, fit, getGradientsViewArray, getOptimizer, getParam, getParamWithNoise, initParams, layerConf, numParams, params, paramTable, paramTable, preOutput, score, setBackpropGradientsViewArray, setParam, setParams, setParams, setParamsViewArray, setParamTable, toString, transpose, update, update

Methods inherited from class AbstractLayer:
addListeners, applyConstraints, applyDropOutIfNecessary, assertInputSet, batchSize, conf, feedForwardMaskArray, getIndex, getInput, getInputMiniBatchSize, getListeners, getMaskArray, init, input, layerId, numParams, setCacheMode, setConf, setIndex, setInput, setInputMiniBatchSize, setListeners, setListeners, setMaskArray, type, validateInput

Methods inherited from class java.lang.Object:
equals, finalize, getClass, hashCode, notify, notifyAll, wait, wait, wait

Methods inherited from interface Layer:
activate, calcL1, calcL2, clearNoiseWeightParams, clone, feedForwardMaskArray, getEpochCount, getIndex, getInputMiniBatchSize, getIterationCount, getListeners, getMaskArray, setCacheMode, setEpochCount, setIndex, setInput, setInputMiniBatchSize, setIterationCount, setListeners, setListeners, setMaskArray, transpose, type

Methods inherited from interface Model:
accumulateScore, addListeners, applyConstraints, batchSize, conf, fit, getGradientsViewArray, getOptimizer, getParam, init, initParams, input, numParams, numParams, params, paramTable, paramTable, score, setBackpropGradientsViewArray, setConf, setParam, setParams, setParamsViewArray, setParamTable, update, update, validateInput
public CenterLossOutputLayer(NeuralNetConfiguration conf)
public CenterLossOutputLayer(NeuralNetConfiguration conf, org.nd4j.linalg.api.ndarray.INDArray input)
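In typical DL4J usage this layer class is not instantiated directly from a `NeuralNetConfiguration`; instead the configuration-side counterpart `org.deeplearning4j.nn.conf.layers.CenterLossOutputLayer` and its `Builder` are placed in a network configuration, with `alpha` controlling the class-center update rate and `lambda` weighting the center-loss penalty. The sketch below follows the builder API as used in the DL4J center-loss MNIST example; it is a hedged configuration fragment from memory of the 1.0.0-beta-era API, not guaranteed verbatim, and the nIn/nOut values are illustrative.

```java
import org.deeplearning4j.nn.conf.layers.CenterLossOutputLayer;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction;

// Configuration-side sketch (hedged, not verified verbatim against the API):
CenterLossOutputLayer outputLayer = new CenterLossOutputLayer.Builder()
        .lossFunction(LossFunction.NEGATIVELOGLIKELIHOOD)
        .nIn(256)          // illustrative embedding size
        .nOut(10)          // illustrative number of classes
        .activation(Activation.SOFTMAX)
        .alpha(0.1)        // step size for the running class-center updates
        .lambda(2e-4)      // weight of the center-loss penalty term
        .build();
```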
public double computeScore(double fullNetworkL1, double fullNetworkL2, boolean training, LayerWorkspaceMgr workspaceMgr)

Compute the score after labels and input have been set.

Specified by:
computeScore in interface IOutputLayer
Overrides:
computeScore in class BaseOutputLayer<CenterLossOutputLayer>
Parameters:
fullNetworkL1 - L1 regularization term for the entire network
fullNetworkL2 - L2 regularization term for the entire network
training - whether the score should be calculated at train or test time (this affects things like the application of dropout)
Returns:
the score (loss function value)

public org.nd4j.linalg.api.ndarray.INDArray computeScoreForExamples(double fullNetworkL1, double fullNetworkL2, LayerWorkspaceMgr workspaceMgr)
Compute the score for each example individually, after labels and input have been set.

Specified by:
computeScoreForExamples in interface IOutputLayer
Overrides:
computeScoreForExamples in class BaseOutputLayer<CenterLossOutputLayer>
Parameters:
fullNetworkL1 - L1 regularization term for the entire network (or 0.0 to not include regularization)
fullNetworkL2 - L2 regularization term for the entire network (or 0.0 to not include regularization)
Returns:
the score for each example individually

public void computeGradientAndScore(LayerWorkspaceMgr workspaceMgr)
Description copied from interface: Model
Update the score.

Specified by:
computeGradientAndScore in interface Model
Overrides:
computeGradientAndScore in class BaseOutputLayer<CenterLossOutputLayer>
protected void setScoreWithZ(org.nd4j.linalg.api.ndarray.INDArray z)

Overrides:
setScoreWithZ in class BaseOutputLayer<CenterLossOutputLayer>
public org.nd4j.linalg.primitives.Pair<Gradient,Double> gradientAndScore()

Description copied from interface: Model
Get the gradient and score.

Specified by:
gradientAndScore in interface Model
Overrides:
gradientAndScore in class BaseOutputLayer<CenterLossOutputLayer>
public org.nd4j.linalg.primitives.Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon, LayerWorkspaceMgr workspaceMgr)

Description copied from interface: Layer
Calculate the gradient relative to the error in the next layer.

Specified by:
backpropGradient in interface Layer
Overrides:
backpropGradient in class BaseOutputLayer<CenterLossOutputLayer>
Parameters:
epsilon - w^(L+1)*delta^(L+1). Or equivalently dC/da, i.e., (dC/dz)*(dz/da) = dC/da, where C is the cost function and a = sigma(z) is the activation.
workspaceMgr - Workspace manager
Returns:
the gradient for this layer, and the epsilon (dC/da) to pass to the layer below; the returned array should be placed in the ArrayType.ACTIVATION_GRAD workspace via the workspace manager

public Gradient gradient()
Gets the gradient from one training iteration.

Specified by:
gradient in interface Model
Overrides:
gradient in class BaseOutputLayer<CenterLossOutputLayer>
protected org.nd4j.linalg.api.ndarray.INDArray getLabels2d(LayerWorkspaceMgr workspaceMgr, ArrayType arrayType)

Overrides:
getLabels2d in class BaseOutputLayer<CenterLossOutputLayer>
Copyright © 2018. All rights reserved.