public class CnnLossLayer extends BaseLayer&lt;CnnLossLayer&gt; implements IOutputLayer

Nested classes/interfaces inherited from interface Layer: Layer.TrainingMode, Layer.Type
Modifier and Type | Field and Description |
---|---|
protected INDArray | labels |

Fields inherited from class BaseLayer: gradient, gradientsFlattened, gradientViews, optimizer, params, paramsFlattened, score, solver, weightNoiseParams

Fields inherited from class AbstractLayer: cacheMode, conf, dataType, dropoutApplied, epochCount, index, input, inputModificationAllowed, iterationCount, maskArray, maskState, preOutput, trainingListeners
Constructor and Description |
---|
CnnLossLayer(NeuralNetConfiguration conf, DataType dataType) |
Modifier and Type | Method and Description |
---|---|
INDArray | activate(boolean training, LayerWorkspaceMgr workspaceMgr): Perform forward pass and return the activations array with the last set input |
Pair&lt;Gradient,INDArray&gt; | backpropGradient(INDArray epsilon, LayerWorkspaceMgr workspaceMgr): Calculate the gradient relative to the error in the next layer |
double | calcRegularizationScore(boolean backpropParamsOnly): Calculate the regularization component of the score for the parameters in this layer, e.g. the L1, L2 and/or weight decay components of the loss function |
double | computeScore(double fullNetRegTerm, boolean training, LayerWorkspaceMgr workspaceMgr): Compute the score after labels and input have been set |
INDArray | computeScoreForExamples(double fullNetRegTerm, LayerWorkspaceMgr workspaceMgr): Compute the score for each example individually, after labels and input have been set |
double | f1Score(DataSet data): Sets the input and labels and returns a score for the prediction with respect to the true labels |
double | f1Score(INDArray examples, INDArray labels): Returns the F1 score for the given examples |
Pair&lt;INDArray,MaskState&gt; | feedForwardMaskArray(INDArray maskArray, MaskState currentMaskState, int minibatchSize): Feed forward the input mask array, setting it in the layer as appropriate |
void | fit(DataSet data): Fit the model |
void | fit(DataSetIterator iter): Train the model based on the DataSetIterator |
void | fit(INDArray examples, INDArray labels): Fit the model |
void | fit(INDArray examples, int[] labels): Fit the model |
boolean | isPretrainLayer(): Returns true if the layer can be trained in an unsupervised/pretrain manner (AE, VAE, etc.) |
boolean | needsLabels(): Returns true if labels are required for this output layer |
int | numLabels(): Returns the number of possible labels |
List&lt;String&gt; | predict(DataSet dataSet): Takes in a DataSet of examples; for each row, returns a label |
int[] | predict(INDArray examples): Takes in a list of examples; for each row, returns a label |
void | setMaskArray(INDArray maskArray): Set the mask array |
Layer.Type | type(): Returns the layer type |
Methods inherited from class BaseLayer: clear, clearNoiseWeightParams, clone, computeGradientAndScore, fit, fit, getGradientsViewArray, getOptimizer, getParam, getParamWithNoise, gradient, hasBias, hasLayerNorm, layerConf, numParams, params, paramTable, paramTable, preOutput, preOutputWithPreNorm, score, setBackpropGradientsViewArray, setParam, setParams, setParams, setParamsViewArray, setParamTable, setScoreWithZ, toString, update, update

Methods inherited from class AbstractLayer: activate, addListeners, allowInputModification, applyConstraints, applyDropOutIfNecessary, applyMask, assertInputSet, backpropDropOutIfPresent, batchSize, close, conf, getConfig, getEpochCount, getHelper, getIndex, getInput, getInputMiniBatchSize, getListeners, getMaskArray, gradientAndScore, init, input, layerId, numParams, setCacheMode, setConf, setEpochCount, setIndex, setInput, setInputMiniBatchSize, setListeners, setListeners, updaterDivideByMinibatch

Methods inherited from class java.lang.Object: equals, finalize, getClass, hashCode, notify, notifyAll, wait, wait, wait

Methods inherited from interface IOutputLayer: getLabels, setLabels

Methods inherited from interface Layer: activate, allowInputModification, clearNoiseWeightParams, getEpochCount, getHelper, getIndex, getInputMiniBatchSize, getIterationCount, getListeners, getMaskArray, setCacheMode, setEpochCount, setIndex, setInput, setInputMiniBatchSize, setIterationCount, setListeners, setListeners

Methods inherited from interface Trainable: getConfig, getGradientsViewArray, numParams, params, paramTable, updaterDivideByMinibatch

Methods inherited from interface Model: addListeners, applyConstraints, batchSize, clear, close, computeGradientAndScore, conf, fit, fit, getGradientsViewArray, getOptimizer, getParam, gradient, gradientAndScore, init, input, numParams, numParams, params, paramTable, paramTable, score, setBackpropGradientsViewArray, setConf, setParam, setParams, setParamsViewArray, setParamTable, update, update
protected INDArray labels
public CnnLossLayer(NeuralNetConfiguration conf, DataType dataType)
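This implementation-layer class is normally created from a layer configuration rather than by calling this constructor directly. As a hedged sketch (assuming the companion `org.deeplearning4j.nn.conf.layers.CnnLossLayer` builder from the DL4J configuration package), a per-pixel classification network might be configured like this:

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.CnnLossLayer;
import org.deeplearning4j.nn.conf.layers.ConvolutionLayer;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class CnnLossLayerConfigSketch {
    public static void main(String[] args) {
        // Sketch only: a CnnLossLayer applies the loss at every spatial
        // position, e.g. for segmentation-style per-pixel outputs.
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .list()
                .layer(new ConvolutionLayer.Builder(3, 3)
                        .nIn(3).nOut(8)
                        .activation(Activation.RELU).build())
                .layer(new CnnLossLayer.Builder(LossFunctions.LossFunction.XENT)
                        .activation(Activation.SIGMOID).build())
                .build();
    }
}
```

The builder then constructs instances of this class (passing the NeuralNetConfiguration and DataType) internally when the network is initialized.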
public Pair&lt;Gradient,INDArray&gt; backpropGradient(INDArray epsilon, LayerWorkspaceMgr workspaceMgr)

Description copied from interface: Layer
Calculate the gradient relative to the error in the next layer.

Specified by: backpropGradient in interface Layer
Overrides: backpropGradient in class BaseLayer&lt;CnnLossLayer&gt;

Parameters:
epsilon - w^(L+1)*delta^(L+1). Or equivalently dC/da, i.e., (dC/dz)*(dz/da) = dC/da, where C is the cost function and a = sigma(z) is the activation.
workspaceMgr - Workspace manager

Returns: the gradient and epsilon arrays, placed in the ArrayType.ACTIVATION_GRAD workspace via the workspace manager

public double calcRegularizationScore(boolean backpropParamsOnly)
Description copied from interface: Layer
Calculate the regularization component of the score, for the parameters in this layer. For example, the L1, L2 and/or weight decay components of the loss function.

Specified by: calcRegularizationScore in interface Layer
Overrides: calcRegularizationScore in class BaseLayer&lt;CnnLossLayer&gt;

Parameters:
backpropParamsOnly - If true: calculate the regularization score based on backprop params only. If false: calculate based on all params (including pretrain params, if any)

public double f1Score(DataSet data)
Description copied from interface: Classifier
Sets the input and labels and returns a score for the prediction with respect to the true labels.

Specified by: f1Score in interface Classifier

Parameters:
data - the data to score

public double f1Score(INDArray examples, INDArray labels)
Returns the F1 score for the given examples.

Specified by: f1Score in interface Classifier

Parameters:
examples - the examples to classify (one example in each row)
labels - the true labels

public int numLabels()
Description copied from interface: Classifier
Returns the number of possible labels.

Specified by: numLabels in interface Classifier
public void fit(DataSetIterator iter)
Description copied from interface: Classifier
Train the model based on the DataSetIterator.

Specified by: fit in interface Classifier

Parameters:
iter - the iterator to train on

public int[] predict(INDArray examples)
Description copied from interface: Classifier
Takes in a list of examples; for each row, returns a label.

Specified by: predict in interface Classifier

Parameters:
examples - the examples to classify (one example in each row)

public List&lt;String&gt; predict(DataSet dataSet)
Description copied from interface: Classifier
Takes in a DataSet of examples; for each row, returns a label.

Specified by: predict in interface Classifier

Parameters:
dataSet - the examples to classify

public void fit(INDArray examples, INDArray labels)
Description copied from interface: Classifier
Fit the model.

Specified by: fit in interface Classifier

Parameters:
examples - the examples to classify (one example in each row)
labels - the example labels (a binary outcome matrix)

public void fit(DataSet data)
Description copied from interface: Classifier
Fit the model.

Specified by: fit in interface Classifier

Parameters:
data - the data to train on

public void fit(INDArray examples, int[] labels)
Description copied from interface: Classifier
Fit the model.

Specified by: fit in interface Classifier

Parameters:
examples - the examples to classify (one example in each row)
labels - the labels for each example (the number of labels must match the number of rows in the examples array)

public Layer.Type type()
Description copied from interface: Layer
Returns the layer type.

Specified by: type in interface Layer
Overrides: type in class AbstractLayer&lt;CnnLossLayer&gt;
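The predict methods above return one label per example; conceptually, each label is the index of the highest-scoring class for that row. A minimal, dependency-free sketch of that argmax step (the method and array names here are illustrative, not part of the DL4J API):

```java
import java.util.Arrays;

public class ArgmaxSketch {
    /** For each row of a [numExamples][numClasses] score matrix,
     *  return the index of the largest entry (the predicted label). */
    static int[] predictLabels(double[][] scores) {
        int[] labels = new int[scores.length];
        for (int i = 0; i < scores.length; i++) {
            int best = 0;
            for (int c = 1; c < scores[i].length; c++) {
                if (scores[i][c] > scores[i][best]) best = c;
            }
            labels[i] = best;
        }
        return labels;
    }

    public static void main(String[] args) {
        double[][] scores = {{0.1, 0.7, 0.2}, {0.8, 0.1, 0.1}};
        // Row 0 peaks at class 1, row 1 at class 0
        System.out.println(Arrays.toString(predictLabels(scores))); // prints [1, 0]
    }
}
```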
public INDArray activate(boolean training, LayerWorkspaceMgr workspaceMgr)

Description copied from interface: Layer
Perform forward pass and return the activations array with the last set input.

Specified by: activate in interface Layer
Overrides: activate in class BaseLayer&lt;CnnLossLayer&gt;

Parameters:
training - training or test mode
workspaceMgr - Workspace manager

Returns: the activations array, placed in the ArrayType.ACTIVATIONS workspace via the workspace manager

public void setMaskArray(INDArray maskArray)
Description copied from interface: Layer
Set the mask array. Note: in general, Layer.feedForwardMaskArray(INDArray, MaskState, int) should be used in preference to this.

Specified by: setMaskArray in interface Layer
Overrides: setMaskArray in class AbstractLayer&lt;CnnLossLayer&gt;

Parameters:
maskArray - Mask array to set

public boolean isPretrainLayer()
Description copied from interface: Layer
Returns true if the layer can be trained in an unsupervised/pretrain manner (AE, VAE, etc.).

Specified by: isPretrainLayer in interface Layer
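setMaskArray above and feedForwardMaskArray below carry per-example (or per-position) mask information through the network, so that padded or absent entries do not contribute to the loss. A dependency-free sketch of that idea (names are illustrative, not the DL4J API):

```java
public class MaskSketch {
    /** Zero out per-example scores wherever the binary mask is 0,
     *  so masked (e.g. padded) examples contribute nothing to the loss. */
    static double[] applyMask(double[] perExampleScore, double[] mask) {
        double[] out = new double[perExampleScore.length];
        for (int i = 0; i < perExampleScore.length; i++) {
            out[i] = perExampleScore[i] * mask[i];
        }
        return out;
    }

    public static void main(String[] args) {
        double[] scores = {0.5, 1.5, 2.0};
        double[] mask = {1, 0, 1}; // the second example is masked out
        double[] masked = applyMask(scores, mask); // {0.5, 0.0, 2.0}
    }
}
```

In the real layer the mask is an INDArray and may also be applied per spatial position, but the multiply-by-zero principle is the same.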
public Pair&lt;INDArray,MaskState&gt; feedForwardMaskArray(INDArray maskArray, MaskState currentMaskState, int minibatchSize)

Description copied from interface: Layer
Feed forward the input mask array, setting it in the layer as appropriate.

Specified by: feedForwardMaskArray in interface Layer
Overrides: feedForwardMaskArray in class AbstractLayer&lt;CnnLossLayer&gt;

Parameters:
maskArray - Mask array to set
currentMaskState - Current state of the mask; see MaskState
minibatchSize - Current minibatch size. Needs to be known, as it cannot always be inferred from the activations array due to reshaping (such as a DenseLayer within a recurrent neural network)

public boolean needsLabels()
Description copied from interface: IOutputLayer
Returns true if labels are required for this output layer.

Specified by: needsLabels in interface IOutputLayer
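computeScore and computeScoreForExamples below both take a fullNetRegTerm argument: the network-wide regularization penalty is added on top of the data-fit loss. A dependency-free sketch of that combination, under the assumption that the total score averages per-example losses (names are illustrative, not the DL4J API):

```java
public class ScoreSketch {
    /** Total score: mean per-example loss plus the full-network
     *  regularization term. */
    static double computeScore(double[] perExampleLoss, double fullNetRegTerm) {
        double sum = 0.0;
        for (double l : perExampleLoss) sum += l;
        return sum / perExampleLoss.length + fullNetRegTerm;
    }

    /** Per-example score: each example's loss plus the full-network
     *  regularization term (pass 0.0 to exclude regularization). */
    static double[] computeScoreForExamples(double[] perExampleLoss, double fullNetRegTerm) {
        double[] out = new double[perExampleLoss.length];
        for (int i = 0; i < perExampleLoss.length; i++) {
            out[i] = perExampleLoss[i] + fullNetRegTerm;
        }
        return out;
    }
}
```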
public double computeScore(double fullNetRegTerm, boolean training, LayerWorkspaceMgr workspaceMgr)

Description copied from interface: IOutputLayer
Compute the score after labels and input have been set.

Specified by: computeScore in interface IOutputLayer

Parameters:
fullNetRegTerm - Regularization score (l1/l2/weight decay) for the entire network
training - whether the score should be calculated at train or test time (this affects things like application of dropout, etc.)

public INDArray computeScoreForExamples(double fullNetRegTerm, LayerWorkspaceMgr workspaceMgr)
Compute the score for each example individually, after labels and input have been set.

Specified by: computeScoreForExamples in interface IOutputLayer

Parameters:
fullNetRegTerm - Regularization score term for the entire network (or 0.0 to not include regularization)

Copyright © 2021. All rights reserved.