public class LossLayer extends BaseLayer<LossLayer> implements Serializable, IOutputLayer
Nested classes/interfaces inherited from interface Layer: Layer.TrainingMode, Layer.Type

| Modifier and Type | Field and Description |
|---|---|
| protected org.nd4j.linalg.api.ndarray.INDArray | labels |

Fields inherited from classes BaseLayer and AbstractLayer: gradient, gradientsFlattened, gradientViews, optimizer, params, paramsFlattened, score, weightNoiseParams, cacheMode, conf, dropoutApplied, dropoutMask, epochCount, index, input, iterationCount, maskArray, maskState, preOutput, trainingListeners

| Constructor and Description |
|---|
| LossLayer(NeuralNetConfiguration conf) |
| LossLayer(NeuralNetConfiguration conf, org.nd4j.linalg.api.ndarray.INDArray input) |
| Modifier and Type | Method and Description |
|---|---|
| org.nd4j.linalg.api.ndarray.INDArray | activate(boolean training, LayerWorkspaceMgr workspaceMgr): Perform a forward pass and return the activations array, using the last set input |
| org.nd4j.linalg.api.ndarray.INDArray | activate(org.nd4j.linalg.api.ndarray.INDArray input, boolean training, LayerWorkspaceMgr workspaceMgr): Perform a forward pass and return the activations array, using the specified input |
| org.nd4j.linalg.primitives.Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> | backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon, LayerWorkspaceMgr workspaceMgr): Calculate the gradient relative to the error in the next layer |
| double | calcL1(boolean backpropParamsOnly): Calculate the L1 regularization term; 0.0 if regularization is not used |
| double | calcL2(boolean backpropParamsOnly): Calculate the L2 regularization term; 0.0 if regularization is not used |
| void | clear(): Clear the input |
| void | computeGradientAndScore(LayerWorkspaceMgr workspaceMgr): Update the score |
| double | computeScore(double fullNetworkL1, double fullNetworkL2, boolean training, LayerWorkspaceMgr workspaceMgr): Compute the score after labels and input have been set |
| org.nd4j.linalg.api.ndarray.INDArray | computeScoreForExamples(double fullNetworkL1, double fullNetworkL2, LayerWorkspaceMgr workspaceMgr): Compute the score for each example individually, after labels and input have been set |
| double | f1Score(org.nd4j.linalg.dataset.api.DataSet data): Sets the input and labels and returns a score for the prediction with respect to the true labels |
| double | f1Score(org.nd4j.linalg.api.ndarray.INDArray examples, org.nd4j.linalg.api.ndarray.INDArray labels): Returns the F1 score for the given examples |
| void | fit(org.nd4j.linalg.dataset.api.DataSet data): Fit the model |
| void | fit(org.nd4j.linalg.dataset.api.iterator.DataSetIterator iter): Train the model based on the DataSetIterator |
| void | fit(org.nd4j.linalg.api.ndarray.INDArray input, org.nd4j.linalg.api.ndarray.INDArray labels): Fit the model |
| void | fit(org.nd4j.linalg.api.ndarray.INDArray examples, int[] labels): Fit the model |
| void | fit(org.nd4j.linalg.api.ndarray.INDArray input, LayerWorkspaceMgr workspaceMgr): Fit the model to the given data |
| org.nd4j.linalg.api.ndarray.INDArray | getLabels(): Get the labels array previously set with IOutputLayer.setLabels(INDArray) |
| protected org.nd4j.linalg.api.ndarray.INDArray | getLabels2d() |
| Gradient | gradient(): Gets the gradient from one training iteration |
| org.nd4j.linalg.primitives.Pair<Gradient,Double> | gradientAndScore(): Get the gradient and score |
| boolean | isPretrainLayer(): Returns true if the layer can be trained in an unsupervised/pretrain manner (AE, VAE, etc.) |
| org.nd4j.linalg.api.ndarray.INDArray | labelProbabilities(org.nd4j.linalg.api.ndarray.INDArray examples): Returns the probabilities for each label, for each example, row-wise |
| boolean | needsLabels(): Returns true if labels are required for this output layer |
| int | numLabels(): Returns the number of possible labels |
| org.nd4j.linalg.api.ndarray.INDArray | params(): Returns the parameters of the neural network as a flattened row vector |
| List<String> | predict(org.nd4j.linalg.dataset.api.DataSet dataSet): Return predicted label names |
| int[] | predict(org.nd4j.linalg.api.ndarray.INDArray input): Returns the predictions for each example in the dataset |
| void | setLabels(org.nd4j.linalg.api.ndarray.INDArray labels): Set the labels array for this output layer |
| protected void | setScoreWithZ(org.nd4j.linalg.api.ndarray.INDArray z) |
| Layer | transpose(): Return a transposed copy of the weights/bias (this means reverse the number of inputs and outputs on the weights) |
| Layer.Type | type(): Returns the layer type |
Methods inherited from class BaseLayer: accumulateScore, clearNoiseWeightParams, clone, fit, getGradientsViewArray, getOptimizer, getParam, getParamWithNoise, hasBias, initParams, layerConf, numParams, paramTable, paramTable, preOutput, score, setBackpropGradientsViewArray, setParam, setParams, setParams, setParamsViewArray, setParamTable, toString, update, update

Methods inherited from class AbstractLayer: addListeners, applyConstraints, applyDropOutIfNecessary, applyMask, assertInputSet, batchSize, conf, feedForwardMaskArray, getIndex, getInput, getInputMiniBatchSize, getListeners, getMaskArray, init, input, layerId, numParams, setCacheMode, setConf, setIndex, setInput, setInputMiniBatchSize, setListeners, setListeners, setMaskArray, validateInput

Methods inherited from class java.lang.Object: equals, finalize, getClass, hashCode, notify, notifyAll, wait, wait, wait

Methods inherited from interface Layer: clearNoiseWeightParams, clone, feedForwardMaskArray, getEpochCount, getIndex, getInputMiniBatchSize, getIterationCount, getListeners, getMaskArray, setCacheMode, setEpochCount, setIndex, setInput, setInputMiniBatchSize, setIterationCount, setListeners, setListeners, setMaskArray

Methods inherited from interface Model: accumulateScore, addListeners, applyConstraints, batchSize, conf, fit, getGradientsViewArray, getOptimizer, getParam, init, initParams, input, numParams, numParams, paramTable, paramTable, score, setBackpropGradientsViewArray, setConf, setParam, setParams, setParamsViewArray, setParamTable, update, update, validateInput

public LossLayer(NeuralNetConfiguration conf)

public LossLayer(NeuralNetConfiguration conf, org.nd4j.linalg.api.ndarray.INDArray input)
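The constructors above are normally invoked by the framework rather than by user code. As a minimal, hedged sketch of the usual path (assuming the 1.0.0-beta-era builder API, including the LossLayer.Builder(LossFunction) constructor of the configuration class org.deeplearning4j.nn.conf.layers.LossLayer), a network containing a LossLayer can be configured like this; the framework then creates the implementation class documented on this page internally:

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.LossLayer;   // configuration class, not the impl class on this page
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class LossLayerConfigSketch {
    public static void main(String[] args) {
        // Illustrative sizes: 784 inputs, 10 classes. The DenseLayer produces softmax
        // class probabilities; the parameter-free LossLayer then applies the loss function.
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .list()
                .layer(0, new DenseLayer.Builder().nIn(784).nOut(10)
                        .activation(Activation.SOFTMAX).build())
                .layer(1, new LossLayer.Builder(LossFunctions.LossFunction.MCXENT).build())  // assumed builder constructor
                .build();

        MultiLayerNetwork net = new MultiLayerNetwork(conf);
        net.init();
        // After init(), the network's last layer is an instance of the implementation
        // class documented on this page (org.deeplearning4j.nn.layers.LossLayer).
    }
}
```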
public double computeScore(double fullNetworkL1, double fullNetworkL2, boolean training, LayerWorkspaceMgr workspaceMgr)

Compute the score after labels and input have been set.

Specified by: computeScore in interface IOutputLayer
Parameters:
fullNetworkL1 - L1 regularization term for the entire network
fullNetworkL2 - L2 regularization term for the entire network
training - whether the score should be calculated at train or test time (this affects things like application of dropout, etc.)

public org.nd4j.linalg.api.ndarray.INDArray computeScoreForExamples(double fullNetworkL1, double fullNetworkL2, LayerWorkspaceMgr workspaceMgr)

Compute the score for each example individually, after labels and input have been set.

Specified by: computeScoreForExamples in interface IOutputLayer
Parameters:
fullNetworkL1 - L1 regularization term for the entire network (or 0.0 to not include regularization)
fullNetworkL2 - L2 regularization term for the entire network (or 0.0 to not include regularization)

public void computeGradientAndScore(LayerWorkspaceMgr workspaceMgr)

Description copied from interface: Model
Update the score.

Specified by: computeGradientAndScore in interface Model
Overrides: computeGradientAndScore in class BaseLayer<LossLayer>

protected void setScoreWithZ(org.nd4j.linalg.api.ndarray.INDArray z)

Overrides: setScoreWithZ in class BaseLayer<LossLayer>

public org.nd4j.linalg.primitives.Pair<Gradient,Double> gradientAndScore()

Description copied from interface: Model
Get the gradient and score.

Specified by: gradientAndScore in interface Model
Overrides: gradientAndScore in class AbstractLayer<LossLayer>
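To illustrate how the scoring methods above interact, here is a minimal sketch, assuming lossLayer is the last layer of an initialized network, that features and labels are mini-batch-aligned INDArrays, and that LayerWorkspaceMgr.noWorkspaces() is available as a stand-in workspace manager for one-off calls:

```java
import org.deeplearning4j.nn.gradient.Gradient;
import org.deeplearning4j.nn.layers.LossLayer;
import org.deeplearning4j.nn.workspace.LayerWorkspaceMgr;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.primitives.Pair;

class LossLayerScoringSketch {
    // Assumes: lossLayer comes from an initialized network, and features/labels are
    // INDArrays with matching mini-batch sizes (labels as a binary outcome matrix).
    static double scoreBatch(LossLayer lossLayer, INDArray features, INDArray labels) {
        LayerWorkspaceMgr mgr = LayerWorkspaceMgr.noWorkspaces();  // assumed helper for one-off calls

        lossLayer.setLabels(labels);               // labels must be set before scoring
        lossLayer.activate(features, false, mgr);  // forward pass with the specified input

        // Scalar loss for the whole mini-batch; 0.0 here means no full-network L1/L2 term
        double score = lossLayer.computeScore(0.0, 0.0, false, mgr);

        // One score per example
        INDArray scorePerExample = lossLayer.computeScoreForExamples(0.0, 0.0, mgr);

        // Gradient and score in one call, then retrieve them
        lossLayer.computeGradientAndScore(mgr);
        Pair<Gradient, Double> gradAndScore = lossLayer.gradientAndScore();

        return score;
    }
}
```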
public org.nd4j.linalg.primitives.Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon, LayerWorkspaceMgr workspaceMgr)

Description copied from interface: Layer
Calculate the gradient relative to the error in the next layer.

Specified by: backpropGradient in interface Layer
Overrides: backpropGradient in class BaseLayer<LossLayer>
Parameters:
epsilon - w^(L+1)*delta^(L+1). Or, equivalently, dC/da, i.e., (dC/dz)*(dz/da) = dC/da, where C is the cost function and a = sigma(z) is the activation.
workspaceMgr - Workspace manager
Returns: the gradient and the epsilon array; the epsilon array is placed in the ArrayType.ACTIVATION_GRAD workspace via the workspace manager

public Gradient gradient()

Gets the gradient from one training iteration.

public double calcL2(boolean backpropParamsOnly)

Description copied from interface: Layer
Calculate the L2 regularization term; 0.0 if regularization is not used.

Specified by: calcL2 in interface Layer
Overrides: calcL2 in class BaseLayer<LossLayer>
Parameters:
backpropParamsOnly - If true: calculate L2 based on backprop params only. If false: calculate based on all params (including pretrain params, if any)

public double calcL1(boolean backpropParamsOnly)

Description copied from interface: Layer
Calculate the L1 regularization term; 0.0 if regularization is not used.

Specified by: calcL1 in interface Layer
Overrides: calcL1 in class BaseLayer<LossLayer>
Parameters:
backpropParamsOnly - If true: calculate L1 based on backprop params only. If false: calculate based on all params (including pretrain params, if any)
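Continuing the scoring sketch above, the regularization terms can be queried directly; since a LossLayer has no trainable parameters of its own (an assumption about this layer type, not something enforced by the method), both values are expected to be 0.0:

```java
// Continuing the earlier sketch (same lossLayer instance):
double l1 = lossLayer.calcL1(true);   // true: consider backprop parameters only
double l2 = lossLayer.calcL2(true);   // both expected to be 0.0 for a parameter-free LossLayer
```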
public Layer.Type type()

Description copied from interface: Layer
Returns the layer type.

Specified by: type in interface Layer
Overrides: type in class AbstractLayer<LossLayer>

public void fit(org.nd4j.linalg.api.ndarray.INDArray input, LayerWorkspaceMgr workspaceMgr)

Description copied from interface: Model
Fit the model to the given data.

public org.nd4j.linalg.api.ndarray.INDArray activate(boolean training, LayerWorkspaceMgr workspaceMgr)

Description copied from interface: Layer
Perform a forward pass and return the activations array, using the last set input.

Specified by: activate in interface Layer
Overrides: activate in class BaseLayer<LossLayer>
Parameters:
training - training or test mode
workspaceMgr - Workspace manager
Returns: the activations array, placed in the ArrayType.ACTIVATIONS workspace via the workspace manager

public org.nd4j.linalg.api.ndarray.INDArray activate(org.nd4j.linalg.api.ndarray.INDArray input, boolean training, LayerWorkspaceMgr workspaceMgr)

Description copied from interface: Layer
Perform a forward pass and return the activations array, using the specified input.

Specified by: activate in interface Layer
Overrides: activate in class AbstractLayer<LossLayer>
Parameters:
input - the input to use
training - train or test mode
workspaceMgr - Workspace manager
Returns: the activations array, placed in the ArrayType.ACTIVATIONS workspace via the workspace manager
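A short sketch of the two activate overloads, under the same assumptions as in the earlier sketches (lossLayer taken from an initialized network, LayerWorkspaceMgr.noWorkspaces() as an assumed workspace manager):

```java
import org.deeplearning4j.nn.layers.LossLayer;
import org.deeplearning4j.nn.workspace.LayerWorkspaceMgr;
import org.nd4j.linalg.api.ndarray.INDArray;

class LossLayerActivateSketch {
    static INDArray forward(LossLayer lossLayer, INDArray features) {
        LayerWorkspaceMgr mgr = LayerWorkspaceMgr.noWorkspaces();   // assumed helper

        // Overload with explicit input: performs the forward pass on 'features' (training = false)
        INDArray out = lossLayer.activate(features, false, mgr);

        // Overload without input: reuses whatever input was last set on the layer
        INDArray outAgain = lossLayer.activate(false, mgr);

        return out;
    }
}
```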
public Layer transpose()

Description copied from interface: Layer
Return a transposed copy of the weights/bias (this means reverse the number of inputs and outputs on the weights).

public boolean isPretrainLayer()

Description copied from interface: Layer
Returns true if the layer can be trained in an unsupervised/pretrain manner (AE, VAE, etc.).

Specified by: isPretrainLayer in interface Layer

public org.nd4j.linalg.api.ndarray.INDArray params()

Description copied from class: BaseLayer
Returns the parameters of the neural network as a flattened row vector.

public double f1Score(org.nd4j.linalg.dataset.api.DataSet data)

Sets the input and labels and returns a score for the prediction with respect to the true labels.

Specified by: f1Score in interface Classifier
Parameters:
data - the data to score

public double f1Score(org.nd4j.linalg.api.ndarray.INDArray examples, org.nd4j.linalg.api.ndarray.INDArray labels)

Returns the F1 score for the given examples.

Specified by: f1Score in interface Classifier
Parameters:
examples - the examples to classify (one example in each row)
labels - the true labels

public int numLabels()

Returns the number of possible labels.

Specified by: numLabels in interface Classifier

public void fit(org.nd4j.linalg.dataset.api.iterator.DataSetIterator iter)

Description copied from interface: Classifier
Train the model based on the DataSetIterator.

Specified by: fit in interface Classifier
Parameters:
iter - the iterator to train on

public int[] predict(org.nd4j.linalg.api.ndarray.INDArray input)

Returns the predictions for each example in the dataset.

Specified by: predict in interface Classifier
Parameters:
input - the matrix to predict

public List<String> predict(org.nd4j.linalg.dataset.api.DataSet dataSet)

Return predicted label names.

Specified by: predict in interface Classifier
Parameters:
dataSet - the dataset to predict

public org.nd4j.linalg.api.ndarray.INDArray labelProbabilities(org.nd4j.linalg.api.ndarray.INDArray examples)

Returns the probabilities for each label, for each example, row-wise.

Specified by: labelProbabilities in interface Classifier
Parameters:
examples - the examples to classify (one example in each row)

public void fit(org.nd4j.linalg.api.ndarray.INDArray input, org.nd4j.linalg.api.ndarray.INDArray labels)

Fit the model.

Specified by: fit in interface Classifier
Parameters:
input - the examples to classify (one example in each row)
labels - the example labels (a binary outcome matrix)

public void fit(org.nd4j.linalg.dataset.api.DataSet data)

Fit the model.

Specified by: fit in interface Classifier
Parameters:
data - the data to train on

public void fit(org.nd4j.linalg.api.ndarray.INDArray examples, int[] labels)

Fit the model.

Specified by: fit in interface Classifier
Parameters:
examples - the examples to classify (one example in each row)
labels - the labels for each example (the number of labels must match the number of examples)
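The Classifier-style convenience methods above can be sketched as follows. In practice these are usually called on the enclosing MultiLayerNetwork rather than on an individual layer, and whether every overload is supported for a standalone LossLayer may depend on the version; the calls below simply exercise the documented signatures:

```java
import java.util.List;

import org.deeplearning4j.nn.layers.LossLayer;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.dataset.DataSet;

class LossLayerClassifierSketch {
    // Assumes features/labels are mini-batch-aligned, with labels as a binary outcome matrix.
    static void exercise(LossLayer lossLayer, INDArray features, INDArray labels) {
        // Fit on a single batch (may not be meaningful for a standalone, parameter-free layer)
        lossLayer.fit(features, labels);

        // F1 score of the layer's predictions against the true labels
        double f1 = lossLayer.f1Score(features, labels);

        // Predicted class index for each example (row) of the input
        int[] predictedClasses = lossLayer.predict(features);

        // Per-label probabilities, one row per example
        INDArray probabilities = lossLayer.labelProbabilities(features);

        // DataSet-based variants
        DataSet ds = new DataSet(features, labels);
        double f1FromDataSet = lossLayer.f1Score(ds);
        List<String> predictedNames = lossLayer.predict(ds);
    }
}
```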
public void clear()

Description copied from interface: Model
Clear the input.

public org.nd4j.linalg.api.ndarray.INDArray getLabels()

Description copied from interface: IOutputLayer
Get the labels array previously set with IOutputLayer.setLabels(INDArray).

Specified by: getLabels in interface IOutputLayer

public boolean needsLabels()

Description copied from interface: IOutputLayer
Returns true if labels are required for this output layer.

Specified by: needsLabels in interface IOutputLayer

public void setLabels(org.nd4j.linalg.api.ndarray.INDArray labels)

Description copied from interface: IOutputLayer
Set the labels array for this output layer.

Specified by: setLabels in interface IOutputLayer
Parameters:
labels - Labels array to set

protected org.nd4j.linalg.api.ndarray.INDArray getLabels2d()