public class SimpleRnn extends BaseRecurrentLayer
out_t = activationFn(in_t * inWeight + out_(t-1) * recurrentWeights + bias)
Note that other architectures (LSTM, etc.) are usually much more effective, especially for longer time series; however, SimpleRnn is very fast to compute, and hence may be considered where the temporal dependencies in the dataset are only a few steps long.

Nested Classes

Modifier and Type | Class and Description
---|---
static class | SimpleRnn.Builder
Fields

weightInitFnRecurrent

Fields inherited from superclasses:
nIn, nOut
activationFn, biasInit, biasUpdater, gainInit, gradientNormalization, gradientNormalizationThreshold, iUpdater, regularization, regularizationBias, weightInitFn, weightNoise
constraints, iDropout, layerName
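The recurrence above can be made concrete with a small, self-contained sketch. This is illustrative plain Java (not DL4J code): it computes one forward step of out_t = activationFn(in_t * inWeight + out_(t-1) * recurrentWeights + bias) using double arrays, with tanh standing in for activationFn. The class and method names are hypothetical.

```java
// Illustrative sketch of the SimpleRnn forward step, not DL4J's implementation.
public class SimpleRnnStep {

    // One time step: x has length nIn, prevOut has length nOut.
    // inWeight is nIn x nOut, recurrentWeights is nOut x nOut, bias has length nOut.
    static double[] step(double[] x, double[] prevOut,
                         double[][] inWeight, double[][] recurrentWeights,
                         double[] bias) {
        int nOut = bias.length;
        double[] out = new double[nOut];
        for (int j = 0; j < nOut; j++) {
            double pre = bias[j];                      // bias term
            for (int i = 0; i < x.length; i++)
                pre += x[i] * inWeight[i][j];          // in_t * inWeight
            for (int k = 0; k < prevOut.length; k++)
                pre += prevOut[k] * recurrentWeights[k][j]; // out_(t-1) * recurrentWeights
            out[j] = Math.tanh(pre);                   // activationFn
        }
        return out;
    }

    public static void main(String[] args) {
        // Tiny example: nIn = 2, nOut = 1, all-ones input weights,
        // 0.5 recurrent weight, zero bias, zero initial state.
        double[][] w = {{1.0}, {1.0}};
        double[][] rw = {{0.5}};
        double[] b = {0.0};
        double[] h = {0.0};
        h = step(new double[]{0.5, 0.5}, h, w, rw, b);
        System.out.println(h[0]); // tanh(0.5 + 0.5 + 0) = tanh(1.0)
    }
}
```

Because the previous output feeds back into the pre-activation, dependencies fade quickly with this formulation, which is why the note above recommends LSTM and similar architectures for longer time series.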
Constructors

Modifier | Constructor and Description
---|---
protected | SimpleRnn(SimpleRnn.Builder builder)
Methods

Modifier and Type | Method and Description
---|---
LayerMemoryReport | getMemoryReport(InputType inputType): a report of the estimated memory consumption for the given layer
boolean | hasLayerNorm()
ParamInitializer | initializer()
Layer | instantiate(NeuralNetConfiguration conf, Collection<TrainingListener> trainingListeners, int layerIndex, INDArray layerParamsView, boolean initializeParams, org.nd4j.linalg.api.buffer.DataType networkDataType)
Methods inherited from superclasses and interfaces:
getOutputType, getPreProcessorForInputType, setNIn
isPretrainParam
clone, getGradientNormalization, getRegularizationByParam, getUpdaterByParam, resetLayerDefaultConfig
initializeConstraints, setDataType
getGradientNormalizationThreshold, getLayerName

Methods inherited from class java.lang.Object:
equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Constructor Detail

protected SimpleRnn(SimpleRnn.Builder builder)

Method Detail

public Layer instantiate(NeuralNetConfiguration conf, Collection<TrainingListener> trainingListeners, int layerIndex, INDArray layerParamsView, boolean initializeParams, org.nd4j.linalg.api.buffer.DataType networkDataType)

Specified by: instantiate in class Layer

public ParamInitializer initializer()

Specified by: initializer in class Layer

public LayerMemoryReport getMemoryReport(InputType inputType)

This is a report of the estimated memory consumption for the given layer.

Specified by: getMemoryReport in class Layer
Parameters: inputType - Input type to the layer. Memory consumption is often a function of the input type.

public boolean hasLayerNorm()
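For context, a minimal sketch of how a SimpleRnn layer is typically configured in a Deeplearning4j network. The layer sizes (10 inputs per time step, 20 recurrent units) and the choice of tanh activation are illustrative assumptions, not values taken from this page; nIn, nOut, and activation are inherited builder methods.

```java
// Hedged configuration sketch: a network containing one SimpleRnn layer.
// Sizes and activation are assumed for illustration.
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.recurrent.SimpleRnn;
import org.nd4j.linalg.activations.Activation;

public class SimpleRnnConfigExample {
    public static MultiLayerConfiguration build() {
        return new NeuralNetConfiguration.Builder()
                .list()
                .layer(new SimpleRnn.Builder()
                        .nIn(10)                      // input size per time step (assumed)
                        .nOut(20)                     // number of recurrent units (assumed)
                        .activation(Activation.TANH)  // activationFn in the formula above
                        .build())
                .build();
    }
}
```

As the class description notes, this configuration is cheap to compute but is usually only appropriate when temporal dependencies span a few steps; otherwise an LSTM layer is the more common choice.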
Copyright © 2019. All rights reserved.