public class RnnOutputLayer extends BaseOutputLayer<RnnOutputLayer>
See Also: Serialized Form

Nested classes/interfaces inherited from interface Layer: Layer.TrainingMode, Layer.Type
Fields inherited from class BaseOutputLayer: inputMaskArray, inputMaskArrayState, labels

Fields inherited from class BaseLayer: gradient, gradientsFlattened, gradientViews, optimizer, params, paramsFlattened, score, weightNoiseParams

Fields inherited from class AbstractLayer: cacheMode, conf, dataType, dropoutApplied, epochCount, index, input, inputModificationAllowed, iterationCount, maskArray, maskState, preOutput, trainingListeners
| Constructor and Description |
|---|
| `RnnOutputLayer(NeuralNetConfiguration conf, org.nd4j.linalg.api.buffer.DataType dataType)` |
| Modifier and Type | Method and Description |
|---|---|
| `INDArray` | `activate(boolean training, LayerWorkspaceMgr workspaceMgr)` Perform forward pass and return the activations array with the last set input |
| `Pair<Gradient,INDArray>` | `backpropGradient(INDArray epsilon, LayerWorkspaceMgr workspaceMgr)` Calculate the gradient relative to the error in the next layer |
| `INDArray` | `computeScoreForExamples(double fullNetRegTerm, LayerWorkspaceMgr workspaceMgr)` Compute the score for each example individually, after labels and input have been set. |
| `double` | `f1Score(INDArray examples, INDArray labels)` Returns the f1 score for the given examples. |
| `Pair<INDArray,MaskState>` | `feedForwardMaskArray(INDArray maskArray, MaskState currentMaskState, int minibatchSize)` Feed forward the input mask array, setting in the layer as appropriate. |
| `INDArray` | `getInput()` |
| `protected INDArray` | `getLabels2d(LayerWorkspaceMgr workspaceMgr, ArrayType arrayType)` |
| `protected INDArray` | `preOutput2d(boolean training, LayerWorkspaceMgr workspaceMgr)` |
| `void` | `setMaskArray(INDArray maskArray)` Set the mask array. |
| `Layer.Type` | `type()` Returns the layer type |
Methods inherited from class BaseOutputLayer: activate, applyMask, clear, computeGradientAndScore, computeScore, f1Score, fit, fit, fit, fit, fit, getLabels, gradient, gradientAndScore, hasBias, isPretrainLayer, needsLabels, numLabels, predict, predict, setLabels, setScoreWithZ

Methods inherited from class BaseLayer: calcRegularizationScore, clearNoiseWeightParams, clone, fit, getGradientsViewArray, getOptimizer, getParam, getParamWithNoise, hasLayerNorm, layerConf, numParams, params, paramTable, paramTable, preOutput, preOutputWithPreNorm, score, setBackpropGradientsViewArray, setParam, setParams, setParams, setParamsViewArray, setParamTable, toString, update, update

Methods inherited from class AbstractLayer: addListeners, allowInputModification, applyConstraints, applyDropOutIfNecessary, assertInputSet, backpropDropOutIfPresent, batchSize, conf, getConfig, getEpochCount, getHelper, getIndex, getInputMiniBatchSize, getListeners, getMaskArray, init, input, layerId, numParams, setCacheMode, setConf, setEpochCount, setIndex, setInput, setInputMiniBatchSize, setListeners, setListeners, updaterDivideByMinibatch

Methods inherited from class java.lang.Object: equals, finalize, getClass, hashCode, notify, notifyAll, wait, wait, wait

Methods inherited from interface Layer: allowInputModification, calcRegularizationScore, clearNoiseWeightParams, getEpochCount, getHelper, getIndex, getInputMiniBatchSize, getIterationCount, getListeners, getMaskArray, setCacheMode, setEpochCount, setIndex, setInput, setInputMiniBatchSize, setIterationCount, setListeners, setListeners

Methods inherited from interface Trainable: getConfig, getGradientsViewArray, numParams, params, paramTable, updaterDivideByMinibatch

Methods inherited from interface Model: addListeners, applyConstraints, batchSize, conf, fit, getGradientsViewArray, getOptimizer, getParam, init, input, numParams, numParams, params, paramTable, paramTable, score, setBackpropGradientsViewArray, setConf, setParam, setParams, setParamsViewArray, setParamTable, update, update
public RnnOutputLayer(NeuralNetConfiguration conf, org.nd4j.linalg.api.buffer.DataType dataType)
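In practice this class is rarely constructed directly; the framework instantiates it from the configuration-side `org.deeplearning4j.nn.conf.layers.RnnOutputLayer` builder when the network is initialised. A minimal sketch of that typical usage (layer sizes, activation and loss function are illustrative assumptions, not taken from this page):

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.LSTM;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

// Build a small recurrent network whose last layer is an RnnOutputLayer.
int nIn = 10, nHidden = 20, nOut = 5;   // illustrative sizes
MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
        .list()
        .layer(new LSTM.Builder().nIn(nIn).nOut(nHidden)
                .activation(Activation.TANH).build())
        .layer(new org.deeplearning4j.nn.conf.layers.RnnOutputLayer.Builder(
                        LossFunctions.LossFunction.MCXENT)
                .activation(Activation.SOFTMAX).nIn(nHidden).nOut(nOut).build())
        .build();
MultiLayerNetwork net = new MultiLayerNetwork(conf);
net.init();   // instantiates the implementation layers, including this RnnOutputLayer
```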
public Pair<Gradient,INDArray> backpropGradient(INDArray epsilon, LayerWorkspaceMgr workspaceMgr)
Description copied from interface: Layer
Calculate the gradient relative to the error in the next layer.
Specified by: backpropGradient in interface Layer
Overrides: backpropGradient in class BaseOutputLayer<RnnOutputLayer>
Parameters:
epsilon - w^(L+1)*delta^(L+1), or equivalently dC/da, i.e., (dC/dz)*(dz/da) = dC/da, where C is the cost function and a = sigma(z) is the activation
workspaceMgr - Workspace manager
Returns: the gradient for this layer paired with the epsilon (activation gradient) for the layer below; the returned array should be placed in the ArrayType.ACTIVATION_GRAD workspace via the workspace manager
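For illustration only, a sketch of driving the layer directly; normally MultiLayerNetwork calls this internally during fit. The `layer`, `input` and `labels` variables, the array shapes, and the use of `LayerWorkspaceMgr.noWorkspaces()` are assumptions for the sketch, not part of this page:

```java
// Hypothetical direct use; 'layer' is assumed to be an initialised RnnOutputLayer,
// 'input' and 'labels' rank-3 arrays of shape [miniBatch, nIn, timeSeriesLength]
// and [miniBatch, nOut, timeSeriesLength] respectively.
LayerWorkspaceMgr mgr = LayerWorkspaceMgr.noWorkspaces();  // no workspace scoping, for simplicity
layer.setInput(input, mgr);
layer.setLabels(labels);
// For an output layer the error signal is derived from the labels, so epsilon is passed as null here.
Pair<Gradient, INDArray> result = layer.backpropGradient(null, mgr);
Gradient layerGradients = result.getFirst();   // parameter gradients for this layer
INDArray epsilonBelow   = result.getSecond();  // dC/da to propagate to the layer below
```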
public double f1Score(INDArray examples, INDArray labels)
Returns the f1 score for the given examples.
Specified by: f1Score in interface Classifier
Overrides: f1Score in class BaseOutputLayer<RnnOutputLayer>
Parameters:
examples - the examples to classify (one example in each row)
labels - the true labels
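For time series output, evaluation is more commonly done at the network level with the Evaluation class than by calling f1Score on the layer directly. A sketch under the assumption that `net`, `features` and `labels` exist, with rank-3 [miniBatch, nOut, timeSeriesLength] label/prediction arrays:

```java
import org.deeplearning4j.eval.Evaluation;
import org.nd4j.linalg.api.ndarray.INDArray;

// Evaluate RNN predictions time step by time step (hypothetical 'net', 'features', 'labels').
Evaluation eval = new Evaluation();
INDArray predicted = net.output(features);   // [miniBatch, nOut, timeSeriesLength]
eval.evalTimeSeries(labels, predicted);      // compares predictions at each time step
double f1 = eval.f1();                       // F1 score over all evaluated time steps
System.out.println(eval.stats());
```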
public INDArray getInput()
Overrides: getInput in class AbstractLayer<RnnOutputLayer>
public Layer.Type type()
Description copied from interface: Layer
Returns the layer type.
Specified by: type in interface Layer
Overrides: type in class AbstractLayer<RnnOutputLayer>
protected INDArray preOutput2d(boolean training, LayerWorkspaceMgr workspaceMgr)
Specified by: preOutput2d in class BaseOutputLayer<RnnOutputLayer>
protected INDArray getLabels2d(LayerWorkspaceMgr workspaceMgr, ArrayType arrayType)
Specified by: getLabels2d in class BaseOutputLayer<RnnOutputLayer>
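getLabels2d and preOutput2d exist because the standard output-layer math operates on rank-2 arrays, while recurrent layers carry rank-3 time series arrays. A conceptual sketch of the flattening involved (not the library's internal code; the `labels3d` variable is hypothetical):

```java
// [miniBatch, nOut, timeSeriesLength] -> [miniBatch * timeSeriesLength, nOut]
long miniBatch = labels3d.size(0);
long nOut      = labels3d.size(1);
long tsLength  = labels3d.size(2);
INDArray labels2d = labels3d.permute(0, 2, 1)   // -> [miniBatch, timeSeriesLength, nOut]
        .dup('c')                               // make contiguous before reshaping
        .reshape(miniBatch * tsLength, nOut);
```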
public INDArray activate(boolean training, LayerWorkspaceMgr workspaceMgr)
Description copied from interface: Layer
Perform forward pass and return the activations array with the last set input.
Specified by: activate in interface Layer
Overrides: activate in class BaseLayer<RnnOutputLayer>
Parameters:
training - training or test mode
workspaceMgr - Workspace manager
Returns: the activations array; the returned array should be placed in the ArrayType.ACTIVATIONS workspace via the workspace manager
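At the network level the same forward pass is exposed through MultiLayerNetwork.output; a sketch assuming a hypothetical `net` ending in this layer and a rank-3 `features` array:

```java
// 'features' is [miniBatch, nIn, timeSeriesLength]; the output keeps the rank-3 layout.
INDArray out = net.output(features);   // activations, shape [miniBatch, nOut, timeSeriesLength]
```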
public void setMaskArray(INDArray maskArray)
Description copied from interface: Layer
Set the mask array. Note: Layer.feedForwardMaskArray(INDArray, MaskState, int) should be used in preference to this.
Specified by: setMaskArray in interface Layer
Overrides: setMaskArray in class AbstractLayer<RnnOutputLayer>
Parameters:
maskArray - Mask array to set

public Pair<INDArray,MaskState> feedForwardMaskArray(INDArray maskArray, MaskState currentMaskState, int minibatchSize)
Description copied from interface: Layer
Feed forward the input mask array, setting in the layer as appropriate.
Specified by: feedForwardMaskArray in interface Layer
Overrides: feedForwardMaskArray in class AbstractLayer<RnnOutputLayer>
Parameters:
maskArray - Mask array to set
currentMaskState - Current state of the mask - see MaskState
minibatchSize - Current minibatch size. Needs to be known as it cannot always be inferred from the activations array due to reshaping (such as a DenseLayer within a recurrent neural network)
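Mask arrays are normally supplied at the network level for variable-length (padded) sequences and propagated to each layer through this mechanism. A sketch assuming hypothetical per-time-step masks of shape [miniBatch, timeSeriesLength], with 1.0 for real steps and 0.0 for padding:

```java
import org.nd4j.linalg.dataset.DataSet;

// featuresMask / labelsMask mark which time steps are real data vs. padding (hypothetical arrays).
DataSet padded = new DataSet(features, labels, featuresMask, labelsMask);
net.fit(padded);   // masks are fed forward to each layer, including this output layer
```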
public INDArray computeScoreForExamples(double fullNetRegTerm, LayerWorkspaceMgr workspaceMgr)
Compute the score for each example individually, after labels and input have been set.
Specified by: computeScoreForExamples in interface IOutputLayer
Overrides: computeScoreForExamples in class BaseOutputLayer<RnnOutputLayer>
Parameters:
fullNetRegTerm - Regularization score term for the entire network (or, 0.0 to not include regularization)
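A related per-example score is available at the network level via MultiLayerNetwork.scoreExamples, whose boolean flag plays the role of fullNetRegTerm here; a sketch with hypothetical `net` and `dataSet` variables:

```java
// Per-example scores, one row per example (hypothetical 'net' and 'dataSet').
INDArray exampleScores = net.scoreExamples(dataSet, true);   // true: include regularization terms
```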