public class RnnOutputLayer extends BaseOutputLayer<RnnOutputLayer>
See Also: BaseOutputLayer, OutputLayer, Serialized Form

Nested classes/interfaces inherited from interface Layer:
Layer.TrainingMode, Layer.Type

Fields inherited from class BaseOutputLayer:
inputMaskArray, inputMaskArrayState, labels

Fields inherited from class BaseLayer:
gradient, gradientsFlattened, gradientViews, optimizer, params, paramsFlattened, score, weightNoiseParams

Fields inherited from class AbstractLayer:
cacheMode, conf, dropoutApplied, dropoutMask, epochCount, index, input, iterationCount, maskArray, maskState, preOutput, trainingListeners

| Constructor and Description |
|---|
| RnnOutputLayer(NeuralNetConfiguration conf) |
| RnnOutputLayer(NeuralNetConfiguration conf, org.nd4j.linalg.api.ndarray.INDArray input) |
| Modifier and Type | Method and Description |
|---|---|
| org.nd4j.linalg.api.ndarray.INDArray | activate(boolean training, LayerWorkspaceMgr workspaceMgr): Perform forward pass and return the activations array with the last set input |
| org.nd4j.linalg.primitives.Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> | backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon, LayerWorkspaceMgr workspaceMgr): Calculate the gradient relative to the error in the next layer |
| org.nd4j.linalg.api.ndarray.INDArray | computeScoreForExamples(double fullNetworkL1, double fullNetworkL2, LayerWorkspaceMgr workspaceMgr): Compute the score for each example individually, after labels and input have been set |
| double | f1Score(org.nd4j.linalg.api.ndarray.INDArray examples, org.nd4j.linalg.api.ndarray.INDArray labels): Returns the f1 score for the given examples |
| org.nd4j.linalg.primitives.Pair<org.nd4j.linalg.api.ndarray.INDArray,MaskState> | feedForwardMaskArray(org.nd4j.linalg.api.ndarray.INDArray maskArray, MaskState currentMaskState, int minibatchSize): Feed forward the input mask array, setting it in the layer as appropriate |
| org.nd4j.linalg.api.ndarray.INDArray | getInput() |
| protected org.nd4j.linalg.api.ndarray.INDArray | getLabels2d(LayerWorkspaceMgr workspaceMgr, ArrayType arrayType) |
| protected org.nd4j.linalg.api.ndarray.INDArray | preOutput2d(boolean training, LayerWorkspaceMgr workspaceMgr) |
| void | setMaskArray(org.nd4j.linalg.api.ndarray.INDArray maskArray): Set the mask array |
| Layer.Type | type(): Returns the layer type |
Methods inherited from class BaseOutputLayer:
activate, applyMask, clear, computeGradientAndScore, computeScore, f1Score, fit, fit, fit, fit, fit, getLabels, gradient, gradientAndScore, hasBias, isPretrainLayer, labelProbabilities, needsLabels, numLabels, predict, predict, setLabels, setScoreWithZ

Methods inherited from class BaseLayer:
accumulateScore, calcL1, calcL2, clearNoiseWeightParams, clone, fit, getGradientsViewArray, getOptimizer, getParam, getParamWithNoise, initParams, layerConf, numParams, params, paramTable, paramTable, preOutput, score, setBackpropGradientsViewArray, setParam, setParams, setParams, setParamsViewArray, setParamTable, toString, transpose, update, update

Methods inherited from class AbstractLayer:
addListeners, applyConstraints, applyDropOutIfNecessary, assertInputSet, batchSize, conf, getIndex, getInputMiniBatchSize, getListeners, getMaskArray, init, input, layerId, numParams, setCacheMode, setConf, setIndex, setInput, setInputMiniBatchSize, setListeners, setListeners, validateInput

Methods inherited from class java.lang.Object:
equals, finalize, getClass, hashCode, notify, notifyAll, wait, wait, wait

Methods inherited from interface Layer:
calcL1, calcL2, clearNoiseWeightParams, clone, getEpochCount, getIndex, getInputMiniBatchSize, getIterationCount, getListeners, getMaskArray, setCacheMode, setEpochCount, setIndex, setInput, setInputMiniBatchSize, setIterationCount, setListeners, setListeners, transpose

Methods inherited from interface Model:
accumulateScore, addListeners, applyConstraints, batchSize, conf, fit, getGradientsViewArray, getOptimizer, getParam, init, initParams, input, numParams, numParams, params, paramTable, paramTable, score, setBackpropGradientsViewArray, setConf, setParam, setParams, setParamsViewArray, setParamTable, update, update, validateInput

public RnnOutputLayer(NeuralNetConfiguration conf)
public RnnOutputLayer(NeuralNetConfiguration conf, org.nd4j.linalg.api.ndarray.INDArray input)
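These constructors are normally invoked by the framework during network initialization rather than called directly. As a minimal sketch (using the standard DL4J configuration API; the layer sizes are illustrative assumptions), an RnnOutputLayer is typically added to a network through its configuration counterpart:

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.LSTM;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

int nIn = 10, nHidden = 20, nOut = 5;   // illustrative sizes only

MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
        .list()
        .layer(0, new LSTM.Builder()
                .nIn(nIn).nOut(nHidden)
                .activation(Activation.TANH)
                .build())
        // org.deeplearning4j.nn.conf.layers.RnnOutputLayer is the configuration class;
        // the implementation class documented here is created when init() is called
        .layer(1, new org.deeplearning4j.nn.conf.layers.RnnOutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                .activation(Activation.SOFTMAX)
                .nIn(nHidden).nOut(nOut)
                .build())
        .build();

MultiLayerNetwork net = new MultiLayerNetwork(conf);
net.init();
```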
public org.nd4j.linalg.primitives.Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon, LayerWorkspaceMgr workspaceMgr)
Description copied from interface: Layer
Calculate the gradient relative to the error in the next layer.
Specified by: backpropGradient in interface Layer
Overrides: backpropGradient in class BaseOutputLayer<RnnOutputLayer>
Parameters:
epsilon - w^(L+1)*delta^(L+1). Or, equivalently, dC/da, i.e., (dC/dz)*(dz/da) = dC/da, where C is the cost function and a = sigma(z) is the activation.
workspaceMgr - Workspace manager
Returns:
The gradient for this layer and the epsilon (activation gradient) to pass to the layer below; the returned array is defined in the ArrayType.ACTIVATION_GRAD workspace via the workspace manager.

public double f1Score(org.nd4j.linalg.api.ndarray.INDArray examples, org.nd4j.linalg.api.ndarray.INDArray labels)
Specified by: f1Score in interface Classifier
Overrides: f1Score in class BaseOutputLayer<RnnOutputLayer>
Parameters:
examples - the examples to classify (one example in each row)
labels - the true labels
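In practice, the F1 score for a recurrent network is usually computed at the network level with the Evaluation class rather than by calling this method on the layer directly. A hedged sketch, assuming an initialized MultiLayerNetwork `net` and a test DataSetIterator `testIter` (both hypothetical):

```java
import org.deeplearning4j.eval.Evaluation;

// Evaluate the whole network on a test set; Evaluation aggregates the
// per-time-step predictions produced by the RnnOutputLayer
Evaluation eval = net.evaluate(testIter);
double f1 = eval.f1();              // F1 score over the evaluated examples
System.out.println(eval.stats());   // precision, recall, F1, confusion matrix
```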
public org.nd4j.linalg.api.ndarray.INDArray getInput()
Overrides: getInput in class AbstractLayer<RnnOutputLayer>

public Layer.Type type()
Description copied from interface: Layer
Returns the layer type.
Specified by: type in interface Layer
Overrides: type in class AbstractLayer<RnnOutputLayer>

protected org.nd4j.linalg.api.ndarray.INDArray preOutput2d(boolean training, LayerWorkspaceMgr workspaceMgr)
Overrides: preOutput2d in class BaseOutputLayer<RnnOutputLayer>

protected org.nd4j.linalg.api.ndarray.INDArray getLabels2d(LayerWorkspaceMgr workspaceMgr, ArrayType arrayType)
Overrides: getLabels2d in class BaseOutputLayer<RnnOutputLayer>
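Both preOutput2d and getLabels2d reduce the rank-3 time-series arrays used by recurrent layers to the 2D shape expected by the loss calculation. The sketch below illustrates that shape convention only; it is not the library's internal implementation, and the array sizes are made-up:

```java
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

int miniBatch = 4, nOut = 5, tsLength = 10;

// RNN labels / pre-output activations: [miniBatchSize, nOut, timeSeriesLength]
INDArray labels3d = Nd4j.rand(new int[]{miniBatch, nOut, tsLength});

// Equivalent 2D layout used for the loss: [miniBatchSize * timeSeriesLength, nOut]
INDArray labels2d = labels3d.permute(0, 2, 1)   // -> [miniBatch, tsLength, nOut]
        .dup('c')                               // contiguous copy before reshaping
        .reshape(miniBatch * tsLength, nOut);
```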
public org.nd4j.linalg.api.ndarray.INDArray activate(boolean training, LayerWorkspaceMgr workspaceMgr)
Description copied from interface: Layer
Perform forward pass and return the activations array with the last set input.
Specified by: activate in interface Layer
Overrides: activate in class BaseLayer<RnnOutputLayer>
Parameters:
training - training or test mode
workspaceMgr - Workspace manager
Returns:
The activations (layer output), defined in the ArrayType.ACTIVATIONS workspace via the workspace manager.
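For a recurrent output layer, both the input and the output activations are rank-3 time-series arrays. A minimal sketch of the expected shapes, assuming the hypothetical `net` and sizes from the constructor sketch above and using the network-level output API rather than calling activate on the layer directly:

```java
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

// Features for a recurrent network: [miniBatchSize, nIn, timeSeriesLength]
INDArray features = Nd4j.rand(new int[]{4, nIn, 15});

// Output of the RnnOutputLayer (via the network): [miniBatchSize, nOut, timeSeriesLength]
INDArray out = net.output(features);
System.out.println(java.util.Arrays.toString(out.shape()));   // e.g. [4, 5, 15]
```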
public void setMaskArray(org.nd4j.linalg.api.ndarray.INDArray maskArray)
Description copied from interface: Layer
Set the mask array. Note: in general, Layer.feedForwardMaskArray(INDArray, MaskState, int) should be used in preference to this.
Specified by: setMaskArray in interface Layer
Overrides: setMaskArray in class AbstractLayer<RnnOutputLayer>
Parameters:
maskArray - Mask array to set

public org.nd4j.linalg.primitives.Pair<org.nd4j.linalg.api.ndarray.INDArray,MaskState> feedForwardMaskArray(org.nd4j.linalg.api.ndarray.INDArray maskArray, MaskState currentMaskState, int minibatchSize)
Description copied from interface: Layer
Feed forward the input mask array, setting it in the layer as appropriate.
Specified by: feedForwardMaskArray in interface Layer
Overrides: feedForwardMaskArray in class AbstractLayer<RnnOutputLayer>
Parameters:
maskArray - Mask array to set
currentMaskState - Current state of the mask - see MaskState
minibatchSize - Current minibatch size. Needs to be known as it cannot always be inferred from the activations array due to reshaping (such as a DenseLayer within a recurrent neural network)
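Mask arrays are normally supplied at the network level when working with variable-length sequences. A hedged sketch, assuming `net` and `features` from the earlier sketches and the standard DL4J mask shape of [miniBatchSize, timeSeriesLength]:

```java
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

// 1.0 = real time step, 0.0 = padding
INDArray featuresMask = Nd4j.ones(4, 15);
INDArray labelsMask   = Nd4j.ones(4, 15);
featuresMask.putScalar(new int[]{0, 14}, 0.0);   // last step of example 0 is padding
labelsMask.putScalar(new int[]{0, 14}, 0.0);

// The network propagates the feature mask to each layer via feedForwardMaskArray(...);
// the label mask excludes padded steps from the output layer's score
net.setLayerMaskArrays(featuresMask, labelsMask);
INDArray maskedOutput = net.output(features);
net.clearLayerMaskArrays();
```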
public org.nd4j.linalg.api.ndarray.INDArray computeScoreForExamples(double fullNetworkL1, double fullNetworkL2, LayerWorkspaceMgr workspaceMgr)
Compute the score for each example individually, after labels and input have been set.
Specified by: computeScoreForExamples in interface IOutputLayer
Overrides: computeScoreForExamples in class BaseOutputLayer<RnnOutputLayer>
Parameters:
fullNetworkL1 - L1 regularization term for the entire network (or 0.0 to not include regularization)
fullNetworkL2 - L2 regularization term for the entire network (or 0.0 to not include regularization)
Returns:
An array with one score (loss value) per example.
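At the network level, the same per-example scores are usually obtained through MultiLayerNetwork.scoreExamples. A hedged sketch, assuming `net` from the earlier sketches and a DataSet `ds` containing 3D features and labels (hypothetical data):

```java
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.dataset.DataSet;

// true = include the network's L1/L2 regularization terms in each example's score
INDArray exampleScores = net.scoreExamples(ds, true);

// exampleScores is a column vector: one loss value per example in the DataSet
System.out.println(exampleScores);
```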