Package | Description
---|---
org.deeplearning4j.nn.layers.recurrent |
Modifier and Type | Class and Description
---|---
class | GravesBidirectionalLSTM: Bidirectional LSTM layer implementation. READ THIS FIRST: RNN tutorial at https://deeplearning4j.org/docs/latest/deeplearning4j-nn-recurrent
class | GravesLSTM: Deprecated. Will eventually be removed. Use LSTM instead, which has similar prediction accuracy but supports CuDNN for faster network training on CUDA (Nvidia) GPUs.
class | LSTM: LSTM layer implementation.
class | SimpleRnn: Simple RNN, a.k.a. "vanilla" RNN, the simplest type of recurrent neural network layer.
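For context, a minimal sketch of how the implementation classes above are typically reached in practice: users configure the corresponding classes in org.deeplearning4j.nn.conf.layers (LSTM, SimpleRnn, etc.), and Deeplearning4j creates the implementation classes in this package when the network is initialized. The layer sizes and settings below are placeholder assumptions, not values taken from this page.

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.LSTM;
import org.deeplearning4j.nn.conf.layers.RnnOutputLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class LstmConfigSketch {
    public static void main(String[] args) {
        int nIn = 10;      // assumed number of input features per time step
        int nHidden = 50;  // assumed LSTM layer size
        int nOut = 3;      // assumed number of output classes

        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .list()
                // Configuration class org.deeplearning4j.nn.conf.layers.LSTM; at init()
                // it is backed by the org.deeplearning4j.nn.layers.recurrent.LSTM
                // implementation listed above (preferred over the deprecated GravesLSTM).
                .layer(0, new LSTM.Builder()
                        .nIn(nIn).nOut(nHidden)
                        .activation(Activation.TANH)
                        .build())
                .layer(1, new RnnOutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .activation(Activation.SOFTMAX)
                        .nIn(nHidden).nOut(nOut)
                        .build())
                .build();

        MultiLayerNetwork net = new MultiLayerNetwork(conf);
        net.init();
    }
}
```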
Modifier and Type | Method and Description
---|---
static FwdPassReturn | LSTMHelpers.activateHelper(BaseRecurrentLayer layer, NeuralNetConfiguration conf, IActivation gateActivationFn, INDArray input, INDArray recurrentWeights, INDArray originalInputWeights, INDArray biases, boolean training, INDArray originalPrevOutputActivations, INDArray originalPrevMemCellState, boolean forBackprop, boolean forwards, String inputWeightKey, INDArray maskArray, boolean hasPeepholeConnections, LSTMHelper helper, CacheMode cacheMode, LayerWorkspaceMgr workspaceMgr, boolean isHelperAllowFallback): Returns a FwdPassReturn object with activations/INDArrays.
static Pair<Gradient,INDArray> | LSTMHelpers.backpropGradientHelper(BaseRecurrentLayer layer, NeuralNetConfiguration conf, IActivation gateActivationFn, INDArray input, INDArray recurrentWeights, INDArray inputWeights, INDArray epsilon, boolean truncatedBPTT, int tbpttBackwardLength, FwdPassReturn fwdPass, boolean forwards, String inputWeightKey, String recurrentWeightKey, String biasWeightKey, Map<String,INDArray> gradientViews, INDArray maskArray, boolean hasPeepholeConnections, LSTMHelper helper, LayerWorkspaceMgr workspaceMgr, boolean isHelperAllowFallback)
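The two LSTMHelpers methods above are internal static routines and are not usually called directly. As a hedged sketch, the snippet below shows the public entry points whose forward and backward passes invoke them; the network is assumed to be one containing an LSTM layer (e.g. the configuration sketch earlier), and the shapes and labels are placeholder assumptions.

```java
import java.util.Arrays;

import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

public class LstmForwardBackwardSketch {
    // 'net' is assumed to be an initialized network containing an LSTM layer,
    // e.g. the one built in the configuration sketch above.
    static void runOnce(MultiLayerNetwork net) {
        // Assumed RNN input shape: [miniBatch = 8, nIn = 10, timeSeriesLength = 20]
        INDArray features = Nd4j.rand(new int[]{8, 10, 20});
        // Placeholder labels, shape [miniBatch, nOut, timeSeriesLength]
        INDArray labels = Nd4j.zeros(8, 3, 20);

        // Forward pass: the LSTM layer's activation is computed internally by
        // LSTMHelpers.activateHelper(...), which returns a FwdPassReturn.
        INDArray output = net.output(features);
        System.out.println("Output shape: " + Arrays.toString(output.shape()));

        // Fitting runs backpropagation (through time); gradients for the LSTM layer
        // are computed by LSTMHelpers.backpropGradientHelper(...), which returns a
        // Pair<Gradient,INDArray> (the gradient plus the epsilon for the layer below).
        net.fit(features, labels);
    }
}
```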