Package | Description |
---|---|
org.deeplearning4j.nn.layers.recurrent |
Modifier and Type | Class and Description
---|---
class | BaseRecurrentLayer<LayerConfT extends BaseLayer>
class | BidirectionalLayer: a "wrapper" layer that wraps any uni-directional RNN layer to make it bidirectional. Multiple modes are supported, specifying how the activations of the forward and backward RNN networks are combined (a usage sketch follows this table).
class | GravesBidirectionalLSTM: Bidirectional LSTM layer implementation. See the RNN tutorial first: http://deeplearning4j.org/usingrnns.html
class | GravesLSTM: Deprecated. Will eventually be removed; use LSTM instead, which has similar prediction accuracy but supports CuDNN for faster network training on CUDA (Nvidia) GPUs.
class | LSTM: LSTM layer implementation.
class | SimpleRnn: a simple ("vanilla") RNN, the simplest type of recurrent neural network layer.
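As referenced in the BidirectionalLayer row above, the sketch below shows how a bidirectional recurrent layer is typically configured on the network-configuration side, using the non-deprecated LSTM in place of GravesLSTM. It is a minimal sketch assuming the configuration classes org.deeplearning4j.nn.conf.layers.LSTM and org.deeplearning4j.nn.conf.layers.recurrent.Bidirectional from the same Deeplearning4j release; the layer sizes and the CONCAT combination mode are illustrative assumptions, not values taken from this page.

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.LSTM;
import org.deeplearning4j.nn.conf.layers.RnnOutputLayer;
import org.deeplearning4j.nn.conf.layers.recurrent.Bidirectional;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class BidirectionalLstmSketch {

    public static void main(String[] args) {
        // Hypothetical sizes for illustration only
        int nIn = 10;      // input features per time step
        int nHidden = 20;  // hidden units in the wrapped LSTM
        int nOut = 3;      // output classes per time step

        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .list()
                // Wrap a uni-directional LSTM to make it bidirectional;
                // the mode controls how forward and backward activations are combined
                .layer(0, new Bidirectional(Bidirectional.Mode.CONCAT,
                        new LSTM.Builder()
                                .nIn(nIn)
                                .nOut(nHidden)
                                .activation(Activation.TANH)
                                .build()))
                // CONCAT doubles the effective output size of the wrapped layer
                .layer(1, new RnnOutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .nIn(2 * nHidden)
                        .nOut(nOut)
                        .activation(Activation.SOFTMAX)
                        .build())
                .build();

        MultiLayerNetwork net = new MultiLayerNetwork(conf);
        net.init();
    }
}
```

With CONCAT, the forward and backward activations are concatenated, so downstream layers see twice the wrapped layer's output size; the other modes (ADD, MUL, AVERAGE) combine them element-wise and keep the original size.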
Copyright © 2018. All rights reserved.