Interface | Description |
---|---|
LSTMHelper | Helper for the recurrent LSTM layer (no peephole connections). |
Class | Description |
---|---|
BaseRecurrentLayer<LayerConfT extends BaseLayer> | |
BidirectionalLayer | Bidirectional is a "wrapper" layer: it wraps any uni-directional RNN layer to make it bidirectional. Multiple modes are supported; these specify how the activations from the forward and backward RNNs are combined (see the first sketch below the table). |
FwdPassReturn | Holds the results of an RNN forward pass. |
GravesBidirectionalLSTM | Bidirectional LSTM layer implementation. RNN tutorial (read this first): http://deeplearning4j.org/usingrnns.html |
GravesLSTM | LSTM layer implementation, with peephole connections. |
LastTimeStepLayer | LastTimeStep is a "wrapper" layer: it wraps any RNN layer, extracts the last time step during the forward pass, and returns it as a row vector (per example). See the second sketch below the table. |
LSTM | LSTM layer implementation, without peephole connections. |
LSTMHelpers | RNN tutorial (read this first if you want to understand what is happening here): http://deeplearning4j.org/usingrnns.html |
MaskZeroLayer | Masks time steps with 0 activation. |
RnnLossLayer | Recurrent Neural Network Loss Layer. Handles calculation of gradients etc. for various objective functions. NOTE: Unlike RnnOutputLayer, this RnnLossLayer does not have any parameters - i.e., there is no time-distributed dense component here. |
RnnOutputLayer | Recurrent Neural Network Output Layer. Handles calculation of gradients etc. for various objective functions. Functionally the same as OutputLayer, but handles output and label reshaping automatically. Input and output activations are the same as for other RNN layers: 3 dimensions, with shapes [miniBatchSize, nIn, timeSeriesLength] and [miniBatchSize, nOut, timeSeriesLength] respectively. See the third sketch below the table. |
SimpleRnn | Simple RNN, aka "vanilla" RNN: the simplest type of recurrent neural network layer. |
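To make the wrapper pattern concrete, here is a minimal configuration sketch for a bidirectional LSTM, assuming a 2018-era (1.0.0-beta) DL4J API. BidirectionalLayer is created internally from the conf-side Bidirectional class; Mode.CONCAT concatenates the forward and backward activations, doubling the feature count seen by the next layer. The layer sizes (10, 20, 5) are arbitrary illustration values.

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.LSTM;
import org.deeplearning4j.nn.conf.layers.RnnOutputLayer;
import org.deeplearning4j.nn.conf.layers.recurrent.Bidirectional;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class BidirectionalExample {
    public static void main(String[] args) {
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
            .list()
            // Wrap a uni-directional LSTM; CONCAT joins the forward and backward
            // activations, so the next layer sees 2 * nOut = 40 features.
            .layer(new Bidirectional(Bidirectional.Mode.CONCAT,
                    new LSTM.Builder().nIn(10).nOut(20).activation(Activation.TANH).build()))
            .layer(new RnnOutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                    .activation(Activation.SOFTMAX).nIn(40).nOut(5).build())
            .build();

        MultiLayerNetwork net = new MultiLayerNetwork(conf);
        net.init();
    }
}
```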
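Similarly, a minimal sequence-classification sketch shows where LastTimeStepLayer fits: the conf-side LastTimeStep wrapper reduces the 3D RNN activations [miniBatchSize, nOut, timeSeriesLength] to the 2D [miniBatchSize, nOut] of the final step, so an ordinary (non-RNN) OutputLayer can follow. Again assuming a 1.0.0-beta-era API; the sizes are arbitrary.

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.LSTM;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.conf.layers.recurrent.LastTimeStep;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class LastTimeStepExample {
    public static void main(String[] args) {
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
            .list()
            // Wrap the LSTM: the forward pass emits only the final time step,
            // [miniBatchSize, 20] instead of [miniBatchSize, 20, timeSeriesLength].
            .layer(new LastTimeStep(
                    new LSTM.Builder().nIn(10).nOut(20).activation(Activation.TANH).build()))
            // A plain feed-forward output layer consumes the 2D activations.
            .layer(new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                    .activation(Activation.SOFTMAX).nIn(20).nOut(3).build())
            .build();

        MultiLayerNetwork net = new MultiLayerNetwork(conf);
        net.init();
    }
}
```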
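Finally, a sketch of the shape convention shared by RnnOutputLayer and the other RNN layers: feeding a random [miniBatchSize, nIn, timeSeriesLength] input through an LSTM + RnnOutputLayer stack yields output of shape [miniBatchSize, nOut, timeSeriesLength]. The dimensions (3, 10, 16, 5, 7) are arbitrary illustration values.

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.LSTM;
import org.deeplearning4j.nn.conf.layers.RnnOutputLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class RnnShapeExample {
    public static void main(String[] args) {
        int miniBatchSize = 3, nIn = 10, nOut = 5, timeSeriesLength = 7;

        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
            .list()
            .layer(new LSTM.Builder().nIn(nIn).nOut(16).activation(Activation.TANH).build())
            .layer(new RnnOutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                    .activation(Activation.SOFTMAX).nIn(16).nOut(nOut).build())
            .build();

        MultiLayerNetwork net = new MultiLayerNetwork(conf);
        net.init();

        // Input: [miniBatchSize, nIn, timeSeriesLength], as for all RNN layers.
        INDArray input = Nd4j.rand(new int[]{miniBatchSize, nIn, timeSeriesLength});
        INDArray out = net.output(input);

        // Output: [miniBatchSize, nOut, timeSeriesLength]; the softmax is applied
        // independently at each time step.
        System.out.println(java.util.Arrays.toString(out.shape()));   // [3, 5, 7]
    }
}
```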