Deprecated. Use `Bidirectional` instead. With the `Bidirectional` layer wrapper you can make any recurrent layer bidirectional, in particular `GravesLSTM`. Note that this layer adds the output of both directions, which translates into "ADD" mode in `Bidirectional`.

Usage: `.layer(new Bidirectional(Bidirectional.Mode.ADD, new GravesLSTM.Builder()....build()))`
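The recommended replacement can be sketched as a full network configuration. This is a minimal, illustrative sketch: the layer sizes, activations, and the output layer are assumptions, not part of this API's documentation; only the `Bidirectional(Bidirectional.Mode.ADD, ...)` wrapping pattern comes from the deprecation note above.

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.GravesLSTM;
import org.deeplearning4j.nn.conf.layers.RnnOutputLayer;
import org.deeplearning4j.nn.conf.layers.recurrent.Bidirectional;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class BidirectionalExample {
    public static void main(String[] args) {
        // Wrap a GravesLSTM in Bidirectional with ADD mode; ADD sums the
        // forward and backward outputs, matching the behaviour of the
        // deprecated GravesBidirectionalLSTM layer.
        // All sizes here (10, 20, 5) are illustrative assumptions.
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .list()
                .layer(new Bidirectional(Bidirectional.Mode.ADD,
                        new GravesLSTM.Builder()
                                .nIn(10)
                                .nOut(20)
                                .activation(Activation.TANH)
                                .build()))
                .layer(new RnnOutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .nIn(20)   // ADD mode keeps nOut of the wrapped layer
                        .nOut(5)
                        .activation(Activation.SOFTMAX)
                        .build())
                .build();
    }
}
```

Note that with `Mode.ADD` the wrapper's output size equals the wrapped layer's `nOut`, whereas `Mode.CONCAT` would double it.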
@Deprecated public class GravesBidirectionalLSTM extends BaseRecurrentLayer
| Modifier and Type | Class and Description |
|---|---|
| `static class` | `GravesBidirectionalLSTM.Builder` Deprecated. |
| Modifier and Type | Field and Description |
|---|---|
| `protected boolean` | `helperAllowFallback` Deprecated. |
Fields inherited from class `BaseRecurrentLayer`: `weightInitFnRecurrent`

Fields inherited from class `FeedForwardLayer`: `nIn`, `nOut`

Fields inherited from class `BaseLayer`: `activationFn`, `biasInit`, `biasUpdater`, `gainInit`, `gradientNormalization`, `gradientNormalizationThreshold`, `iUpdater`, `regularization`, `regularizationBias`, `weightInitFn`, `weightNoise`

Fields inherited from class `Layer`: `constraints`, `iDropout`, `layerName`
| Modifier and Type | Method and Description |
|---|---|
| `LayerMemoryReport` | `getMemoryReport(InputType inputType)` Deprecated. This is a report of the estimated memory consumption for the given layer. |
| `protected void` | `initializeConstraints(Layer.Builder<?> builder)` Deprecated. Initialize the weight constraints. |
| `ParamInitializer` | `initializer()` Deprecated. |
| `Layer` | `instantiate(NeuralNetConfiguration conf, Collection<TrainingListener> trainingListeners, int layerIndex, INDArray layerParamsView, boolean initializeParams, org.nd4j.linalg.api.buffer.DataType networkDataType)` Deprecated. |
Methods inherited from class `BaseRecurrentLayer`: `getOutputType`, `getPreProcessorForInputType`, `setNIn`

Methods inherited from class `FeedForwardLayer`: `isPretrainParam`

Methods inherited from class `BaseLayer`: `clone`, `getGradientNormalization`, `getRegularizationByParam`, `getUpdaterByParam`, `resetLayerDefaultConfig`

Methods inherited from class `Layer`: `setDataType`

Methods inherited from class `java.lang.Object`: `equals`, `finalize`, `getClass`, `hashCode`, `notify`, `notifyAll`, `toString`, `wait`, `wait`, `wait`

Methods inherited from interface `TrainingConfig`: `getGradientNormalizationThreshold`, `getLayerName`
`protected void initializeConstraints(Layer.Builder<?> builder)`

Initialize the weight constraints. Overrides: `initializeConstraints` in class `Layer`

`public Layer instantiate(NeuralNetConfiguration conf, Collection<TrainingListener> trainingListeners, int layerIndex, INDArray layerParamsView, boolean initializeParams, org.nd4j.linalg.api.buffer.DataType networkDataType)`

Specified by: `instantiate` in class `Layer`

`public ParamInitializer initializer()`

Specified by: `initializer` in class `Layer`

`public LayerMemoryReport getMemoryReport(InputType inputType)`

This is a report of the estimated memory consumption for the given layer. Specified by: `getMemoryReport` in class `Layer`
Parameters: `inputType` - Input type to the layer. Memory consumption is often a function of the input type.

Copyright © 2019. All rights reserved.