public class LastTimeStepLayer extends BaseWrapperLayer
Nested classes/interfaces inherited from interface Layer: Layer.TrainingMode, Layer.Type

Fields inherited from class BaseWrapperLayer: underlying

| Constructor and Description |
|---|
| LastTimeStepLayer(Layer underlying) |
| Modifier and Type | Method and Description |
|---|---|
| org.nd4j.linalg.api.ndarray.INDArray | activate(boolean training, LayerWorkspaceMgr workspaceMgr) - Perform forward pass and return the activations array with the last set input |
| org.nd4j.linalg.api.ndarray.INDArray | activate(org.nd4j.linalg.api.ndarray.INDArray input, boolean training, LayerWorkspaceMgr workspaceMgr) - Perform forward pass and return the activations array with the specified input |
| org.nd4j.linalg.primitives.Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> | backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon, LayerWorkspaceMgr workspaceMgr) - Calculate the gradient relative to the error in the next layer |
| org.nd4j.linalg.primitives.Pair<org.nd4j.linalg.api.ndarray.INDArray,MaskState> | feedForwardMaskArray(org.nd4j.linalg.api.ndarray.INDArray maskArray, MaskState currentMaskState, int minibatchSize) - Feed forward the input mask array, setting it in the layer as appropriate. |
| Layer.Type | type() - Returns the layer type |
Methods inherited from class BaseWrapperLayer: accumulateScore, addListeners, applyConstraints, batchSize, calcL1, calcL2, clear, clearNoiseWeightParams, clone, computeGradientAndScore, conf, fit, fit, getEpochCount, getGradientsViewArray, getIndex, getInputMiniBatchSize, getIterationCount, getListeners, getMaskArray, getOptimizer, getParam, gradient, gradientAndScore, init, initParams, input, isPretrainLayer, numParams, numParams, params, paramTable, paramTable, score, setBackpropGradientsViewArray, setCacheMode, setConf, setEpochCount, setIndex, setInput, setInputMiniBatchSize, setIterationCount, setListeners, setListeners, setMaskArray, setParam, setParams, setParamsViewArray, setParamTable, transpose, update, update, validateInput

public LastTimeStepLayer(@NonNull Layer underlying)
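The constructor wraps an existing implementation-level Layer so that only the last time step of its 3d time-series activations is returned. In practice this layer is usually created indirectly through a configuration-level wrapper rather than by calling the constructor by hand. Below is a minimal sketch under that assumption; the LastTimeStep configuration class and the LSTM/OutputLayer builders are standard Deeplearning4j classes that are not documented on this page, so treat the exact class names, packages, and sizes as assumptions.

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.LSTM;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.conf.layers.recurrent.LastTimeStep;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class LastTimeStepSketch {
    public static void main(String[] args) {
        // Wrap an LSTM so that only its final time step is propagated onwards.
        // The LastTimeStep config wrapper is assumed to build a LastTimeStepLayer
        // around the underlying LSTM when the network is constructed.
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .list()
                .layer(0, new LastTimeStep(new LSTM.Builder().nIn(10).nOut(16).build()))
                // The wrapped layer now emits 2d activations [minibatch, 16],
                // so a standard (non-recurrent) output layer can follow directly.
                .layer(1, new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .activation(Activation.SOFTMAX).nIn(16).nOut(3).build())
                .build();

        MultiLayerNetwork net = new MultiLayerNetwork(conf);
        net.init();
    }
}
```

With a configuration along these lines, the network still consumes 3d time-series input of shape [minibatchSize, 10, timeSeriesLength], while the layer after the wrapper sees ordinary 2d [minibatchSize, 16] activations.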
public Layer.Type type()

Returns the layer type.

Specified by: type in interface Layer
Overrides: type in class BaseWrapperLayer

public org.nd4j.linalg.primitives.Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon, LayerWorkspaceMgr workspaceMgr)

Calculate the gradient relative to the error in the next layer.

Specified by: backpropGradient in interface Layer
Overrides: backpropGradient in class BaseWrapperLayer
Parameters:
epsilon - w^(L+1)*delta^(L+1). Or, equivalently: dC/da, i.e., (dC/dz)*(dz/da) = dC/da, where C is the cost function and a = sigma(z) is the activation.
workspaceMgr - Workspace manager
Returns: Pair of Gradient and epsilon INDArray, with arrays placed in the ArrayType.ACTIVATION_GRAD workspace via the workspace manager
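The epsilon argument is the error signal propagated back from the layer above, i.e. the derivative of the cost with respect to this layer's activations. A short restatement of the parameter description in standard notation; the layer superscripts and the definition of delta are added here for readability and are not part of the original text, and transpose/ordering conventions are glossed over:

```latex
% Restatement of the epsilon parameter description above.
% C is the cost function, z a pre-activation, a^{(L)} = \sigma(z^{(L)}) this layer's activation.
\epsilon^{(L)} \;=\; w^{(L+1)}\,\delta^{(L+1)}
            \;=\; \frac{\partial C}{\partial a^{(L)}}
            \;=\; \frac{\partial C}{\partial z^{(L+1)}}\cdot\frac{\partial z^{(L+1)}}{\partial a^{(L)}},
\qquad
\delta^{(L+1)} \;=\; \frac{\partial C}{\partial z^{(L+1)}}.
```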
public org.nd4j.linalg.api.ndarray.INDArray activate(boolean training, LayerWorkspaceMgr workspaceMgr)

Perform forward pass and return the activations array with the last set input.

Specified by: activate in interface Layer
Overrides: activate in class BaseWrapperLayer
Parameters:
training - training or test mode
workspaceMgr - Workspace manager
Returns: the activations array, in the ArrayType.ACTIVATIONS workspace via the workspace manager

public org.nd4j.linalg.api.ndarray.INDArray activate(org.nd4j.linalg.api.ndarray.INDArray input, boolean training, LayerWorkspaceMgr workspaceMgr)

Perform forward pass and return the activations array with the specified input.

Specified by: activate in interface Layer
Overrides: activate in class BaseWrapperLayer
Parameters:
input - the input to use
training - train or test mode
workspaceMgr - Workspace manager
Returns: the activations array, in the ArrayType.ACTIVATIONS workspace via the workspace manager
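The practical effect of the two activate overloads is easiest to see from array shapes: a 3d time-series input produces a 2d output containing only the last time step. A minimal, hypothetical sketch, assuming the LastTimeStep-wrapped network from the constructor example above (nIn = 10, wrapped LSTM nOut = 16, 3 output classes); the sizes and the helper method are illustrative assumptions, not taken from this page:

```java
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

public class LastTimeStepShapes {
    // 'net' is assumed to be the LastTimeStep-wrapped network from the sketch above.
    static void printOutputShape(MultiLayerNetwork net) {
        int minibatch = 4;
        int timeSeriesLength = 20;

        // DL4J recurrent input format: [minibatchSize, featureCount, timeSeriesLength]
        INDArray features = Nd4j.rand(new int[]{minibatch, 10, timeSeriesLength});

        // Internally the wrapped LSTM produces [4, 16, 20]; the LastTimeStepLayer keeps
        // only the final (or last unmasked) step, giving [4, 16], so the network output is 2d.
        INDArray out = net.output(features);
        System.out.println(java.util.Arrays.toString(out.shape()));  // expected: [4, 3]
    }
}
```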
public org.nd4j.linalg.primitives.Pair<org.nd4j.linalg.api.ndarray.INDArray,MaskState> feedForwardMaskArray(org.nd4j.linalg.api.ndarray.INDArray maskArray, MaskState currentMaskState, int minibatchSize)

Feed forward the input mask array, setting it in the layer as appropriate.

Specified by: feedForwardMaskArray in interface Layer
Overrides: feedForwardMaskArray in class BaseWrapperLayer
Parameters:
maskArray - Mask array to set
currentMaskState - Current state of the mask - see MaskState
minibatchSize - Current minibatch size. Needs to be known as it cannot always be inferred from the activations array due to reshaping (such as a DenseLayer within a recurrent neural network)
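For context, the mask array fed into this method is a per-example time-step mask of shape [minibatchSize, timeSeriesLength], and the last unmasked step of each example is the one the wrapper is expected to keep. A hypothetical sketch; the mask values and the fit(...) call are illustrative and not taken from this page:

```java
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

public class MaskSketch {
    public static void main(String[] args) {
        // Hypothetical time-series feature mask for minibatch = 2, timeSeriesLength = 5:
        // 1 marks a real time step, 0 marks padding. The second example is only
        // 3 steps long, so the wrapper should treat its step at index 2 as the "last" one.
        INDArray featuresMask = Nd4j.create(new double[][]{
                {1, 1, 1, 1, 1},
                {1, 1, 1, 0, 0}
        });
        System.out.println(java.util.Arrays.toString(featuresMask.shape()));  // [2, 5]
        // The mask is typically supplied to the network rather than to this method directly,
        // e.g. net.fit(features, labels, featuresMask, null) on a MultiLayerNetwork.
    }
}
```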