Deprecated. Use Bidirectional instead. With the Bidirectional layer wrapper you can make any recurrent layer bidirectional, in particular GravesLSTM. Note that this layer sums the outputs of the forward and backward passes, which corresponds to ADD mode in Bidirectional.

Usage: .layer(new Bidirectional(Bidirectional.Mode.ADD, new GravesLSTM.Builder()....build()))
@Deprecated public class GravesBidirectionalLSTM extends BaseRecurrentLayer
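
A minimal migration sketch based on the usage note above; the surrounding network, layer sizes, activations, and output layer are illustrative assumptions, not part of this class's API:

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.GravesLSTM;
import org.deeplearning4j.nn.conf.layers.RnnOutputLayer;
import org.deeplearning4j.nn.conf.layers.recurrent.Bidirectional;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class BidirectionalMigrationExample {
    public static void main(String[] args) {
        // Wrapping GravesLSTM in Bidirectional with ADD mode reproduces the
        // behaviour of the deprecated GravesBidirectionalLSTM: the forward
        // and backward activations are summed.
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .list()
                .layer(new Bidirectional(Bidirectional.Mode.ADD,
                        new GravesLSTM.Builder()
                                .nIn(100)   // assumed input feature count
                                .nOut(200)  // assumed hidden layer size
                                .activation(Activation.TANH)
                                .build()))
                .layer(new RnnOutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .nIn(200).nOut(10)  // assumed number of classes
                        .activation(Activation.SOFTMAX)
                        .build())
                .build();
        System.out.println(conf.toJson());
    }
}
```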
Nested Class Summary

| Modifier and Type | Class and Description |
| --- | --- |
| static class | GravesBidirectionalLSTM.Builder (Deprecated) |
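
For code that still constructs the deprecated layer directly, the nested Builder follows the usual DL4J configuration pattern; a minimal sketch, with illustrative nIn/nOut values:

```java
import org.deeplearning4j.nn.conf.layers.GravesBidirectionalLSTM;
import org.nd4j.linalg.activations.Activation;

public class DeprecatedBuilderExample {
    public static void main(String[] args) {
        // nIn/nOut and the activation are illustrative assumptions.
        GravesBidirectionalLSTM layer = new GravesBidirectionalLSTM.Builder()
                .nIn(100)
                .nOut(200)
                .activation(Activation.TANH)
                .build();
        System.out.println(layer);
    }
}
```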
Field Summary

Fields inherited from class org.deeplearning4j.nn.conf.layers.BaseRecurrentLayer:
distRecurrent, weightInitRecurrent

Fields inherited from class org.deeplearning4j.nn.conf.layers.FeedForwardLayer:
nIn, nOut

Fields inherited from class org.deeplearning4j.nn.conf.layers.BaseLayer:
activationFn, biasInit, biasUpdater, dist, gradientNormalization, gradientNormalizationThreshold, iUpdater, l1, l1Bias, l2, l2Bias, weightInit, weightNoise

Fields inherited from class org.deeplearning4j.nn.conf.layers.Layer:
constraints, iDropout, layerName
Method Summary

| Modifier and Type | Method and Description |
| --- | --- |
| double | getL1ByParam(String paramName) Deprecated. Get the L1 coefficient for the given parameter. |
| double | getL2ByParam(String paramName) Deprecated. Get the L2 coefficient for the given parameter. |
| LayerMemoryReport | getMemoryReport(InputType inputType) Deprecated. This is a report of the estimated memory consumption for the given layer. |
| protected void | initializeConstraints(Layer.Builder<?> builder) Deprecated. Initialize the weight constraints. |
| ParamInitializer | initializer() Deprecated. |
| Layer | instantiate(NeuralNetConfiguration conf, Collection<TrainingListener> trainingListeners, int layerIndex, INDArray layerParamsView, boolean initializeParams) Deprecated. |
Methods inherited from class org.deeplearning4j.nn.conf.layers.BaseRecurrentLayer:
getOutputType, getPreProcessorForInputType, isPretrain, setNIn

Methods inherited from class org.deeplearning4j.nn.conf.layers.FeedForwardLayer:
isPretrainParam

Methods inherited from class org.deeplearning4j.nn.conf.layers.BaseLayer:
clone, getGradientNormalization, getUpdaterByParam, resetLayerDefaultConfig

Methods inherited from class org.deeplearning4j.nn.conf.layers.Layer:
setPretrain

Methods inherited from class java.lang.Object:
equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface org.deeplearning4j.nn.api.TrainingConfig:
getGradientNormalizationThreshold, getLayerName
Method Detail

protected void initializeConstraints(Layer.Builder<?> builder)
Deprecated. Initialize the weight constraints.
Overrides:
initializeConstraints in class Layer

public Layer instantiate(NeuralNetConfiguration conf, Collection<TrainingListener> trainingListeners, int layerIndex, INDArray layerParamsView, boolean initializeParams)
Deprecated.
Specified by:
instantiate in class Layer

public ParamInitializer initializer()
Deprecated.
Specified by:
initializer in class Layer

public double getL1ByParam(String paramName)
Deprecated.
Description copied from class: Layer
Get the L1 coefficient for the given parameter.
Specified by:
getL1ByParam in interface TrainingConfig
Overrides:
getL1ByParam in class FeedForwardLayer
Parameters:
paramName - Parameter name

public double getL2ByParam(String paramName)
Deprecated.
Description copied from class: Layer
Get the L2 coefficient for the given parameter.
Specified by:
getL2ByParam in interface TrainingConfig
Overrides:
getL2ByParam in class FeedForwardLayer
Parameters:
paramName - Parameter name
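
A sketch of looking up the per-parameter L1/L2 coefficients, assuming a DL4J version whose ParamInitializer exposes paramKeys; the layer sizes and regularization values are illustrative:

```java
import org.deeplearning4j.nn.conf.layers.GravesBidirectionalLSTM;

public class RegularizationLookupExample {
    public static void main(String[] args) {
        GravesBidirectionalLSTM layer = new GravesBidirectionalLSTM.Builder()
                .nIn(100).nOut(200)            // illustrative sizes
                .l1(1e-4).l2(1e-3)             // applies to weights; biases use l1Bias/l2Bias
                .build();
        // Different parameters (e.g. weights vs. biases) may have different
        // coefficients, hence the per-parameter lookup:
        for (String key : layer.initializer().paramKeys(layer)) {
            System.out.println(key
                    + " L1=" + layer.getL1ByParam(key)
                    + " L2=" + layer.getL2ByParam(key));
        }
    }
}
```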

public LayerMemoryReport getMemoryReport(InputType inputType)
Deprecated.
Description copied from class: Layer
This is a report of the estimated memory consumption for the given layer.
Specified by:
getMemoryReport in class Layer
Parameters:
inputType - Input type to the layer. Memory consumption is often a function of the input type.
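
A sketch of requesting the memory estimate; the input type must be supplied because the estimate scales with it, and the feature count here is an illustrative assumption:

```java
import org.deeplearning4j.nn.conf.inputs.InputType;
import org.deeplearning4j.nn.conf.layers.GravesBidirectionalLSTM;
import org.deeplearning4j.nn.conf.memory.LayerMemoryReport;

public class MemoryReportExample {
    public static void main(String[] args) {
        GravesBidirectionalLSTM layer = new GravesBidirectionalLSTM.Builder()
                .nIn(100).nOut(200)  // illustrative sizes
                .build();
        // The report's estimates depend on the recurrent input type supplied here.
        LayerMemoryReport report = layer.getMemoryReport(InputType.recurrent(100));
        System.out.println(report);
    }
}
```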