Class GravesBidirectionalLSTM.Builder
java.lang.Object
  org.deeplearning4j.nn.conf.layers.Layer.Builder<T>
    org.deeplearning4j.nn.conf.layers.BaseLayer.Builder<T>
      org.deeplearning4j.nn.conf.layers.FeedForwardLayer.Builder<T>
        org.deeplearning4j.nn.conf.layers.BaseRecurrentLayer.Builder<GravesBidirectionalLSTM.Builder>
          org.deeplearning4j.nn.conf.layers.GravesBidirectionalLSTM.Builder

Enclosing class:
    GravesBidirectionalLSTM

public static class GravesBidirectionalLSTM.Builder
extends BaseRecurrentLayer.Builder<GravesBidirectionalLSTM.Builder>
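For reference, a minimal usage sketch of this builder. It uses only methods documented in the summaries below (nIn, nOut, and activation are inherited from the parent builders); the layer sizes are illustrative, not recommended values.

    import org.deeplearning4j.nn.conf.layers.GravesBidirectionalLSTM;
    import org.nd4j.linalg.activations.Activation;

    public class GravesBidirectionalLstmBuilderSketch {
        public static void main(String[] args) {
            GravesBidirectionalLSTM layer = new GravesBidirectionalLSTM.Builder()
                    .nIn(50)                                      // input size (FeedForwardLayer.Builder)
                    .nOut(100)                                    // layer size (FeedForwardLayer.Builder)
                    .activation(Activation.TANH)                  // cell output activation (BaseLayer.Builder)
                    .gateActivationFunction(Activation.SIGMOID)   // gate activation, bounded to 0-1
                    .forgetGateBiasInit(1.0)                      // forget gate bias initialization
                    .helperAllowFallback(true)                    // allow fallback if the CuDNN/MKLDNN helper fails
                    .build();
            System.out.println(layer);                            // the layer can now be added to a network configuration
        }
    }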
Field Summary

Fields

protected boolean helperAllowFallback
    When using CuDNN and an error is encountered, should fallback to the non-CuDNN implementation be allowed? If set to false, an exception in CuDNN will be propagated back to the user.

Fields inherited from class org.deeplearning4j.nn.conf.layers.BaseRecurrentLayer.Builder
    inputWeightConstraints, recurrentConstraints, rnnDataFormat, weightInitFnRecurrent

Fields inherited from class org.deeplearning4j.nn.conf.layers.FeedForwardLayer.Builder
    nIn, nOut

Fields inherited from class org.deeplearning4j.nn.conf.layers.BaseLayer.Builder
    activationFn, biasInit, biasUpdater, gainInit, gradientNormalization, gradientNormalizationThreshold, iupdater, regularization, regularizationBias, weightInitFn, weightNoise

Fields inherited from class org.deeplearning4j.nn.conf.layers.Layer.Builder
    allParamConstraints, biasConstraints, iDropout, layerName, weightConstraints
Constructor Summary

Constructors
    Builder()
Method Summary
All Methods Instance Methods Concrete Methods Modifier and Type Method Description GravesBidirectionalLSTM
build()
GravesBidirectionalLSTM.Builder
forgetGateBiasInit(double biasInit)
Set forget gate bias initalizations.GravesBidirectionalLSTM.Builder
gateActivationFunction(String gateActivationFn)
Activation function for the LSTM gates.GravesBidirectionalLSTM.Builder
gateActivationFunction(Activation gateActivationFn)
Activation function for the LSTM gates.GravesBidirectionalLSTM.Builder
gateActivationFunction(IActivation gateActivationFn)
Activation function for the LSTM gates.GravesBidirectionalLSTM.Builder
helperAllowFallback(boolean allowFallback)
When using a helper (CuDNN or MKLDNN in some cases) and an error is encountered, should fallback to the non-helper implementation be allowed? If set to false, an exception in the helper will be propagated back to the user.-
Methods inherited from class org.deeplearning4j.nn.conf.layers.BaseRecurrentLayer.Builder
    constrainInputWeights, constrainRecurrent, dataFormat, weightInitRecurrent, weightInitRecurrent, weightInitRecurrent

Methods inherited from class org.deeplearning4j.nn.conf.layers.FeedForwardLayer.Builder
    nIn, nIn, nOut, nOut, units

Methods inherited from class org.deeplearning4j.nn.conf.layers.BaseLayer.Builder
    activation, activation, biasInit, biasUpdater, dist, gainInit, gradientNormalization, gradientNormalizationThreshold, l1, l1Bias, l2, l2Bias, regularization, regularizationBias, updater, updater, weightDecay, weightDecay, weightDecayBias, weightDecayBias, weightInit, weightInit, weightInit, weightNoise

Methods inherited from class org.deeplearning4j.nn.conf.layers.Layer.Builder
    constrainAllParameters, constrainBias, constrainWeights, dropOut, dropOut, name
Field Detail

helperAllowFallback

public protected boolean helperAllowFallback

When using CuDNN and an error is encountered, should fallback to the non-CuDNN implementation be allowed? If set to false, an exception in CuDNN will be propagated back to the user. If false, the built-in (non-CuDNN) implementation for GravesBidirectionalLSTM will be used.
Method Detail

forgetGateBiasInit

public GravesBidirectionalLSTM.Builder forgetGateBiasInit(double biasInit)

Set forget gate bias initializations. Values in the range 1-5 can potentially help with learning of longer-term dependencies.
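A short sketch of this setter in isolation, reusing the imports from the sketch at the top of this page; the value 1.0 is illustrative:

    // A bias of 1.0 keeps the forget gate mostly open early in training;
    // per the note above, values up to 5 may help with longer-term dependencies.
    GravesBidirectionalLSTM.Builder builder = new GravesBidirectionalLSTM.Builder()
            .forgetGateBiasInit(1.0);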
gateActivationFunction

public GravesBidirectionalLSTM.Builder gateActivationFunction(String gateActivationFn)

Activation function for the LSTM gates. Note: This should be bounded to the range 0-1: sigmoid or hard sigmoid, for example.

Parameters:
    gateActivationFn - Activation function for the LSTM gates
gateActivationFunction

public GravesBidirectionalLSTM.Builder gateActivationFunction(Activation gateActivationFn)

Activation function for the LSTM gates. Note: This should be bounded to the range 0-1: sigmoid or hard sigmoid, for example.

Parameters:
    gateActivationFn - Activation function for the LSTM gates
gateActivationFunction

public GravesBidirectionalLSTM.Builder gateActivationFunction(IActivation gateActivationFn)

Activation function for the LSTM gates. Note: This should be bounded to the range 0-1: sigmoid or hard sigmoid, for example.

Parameters:
    gateActivationFn - Activation function for the LSTM gates
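A sketch of the three overloads with equivalent sigmoid gate activations. This assumes the String form resolves the activation by name ("sigmoid" here) and that ActivationSigmoid in org.nd4j.linalg.activations.impl is the IActivation implementation of sigmoid:

    import org.nd4j.linalg.activations.Activation;
    import org.nd4j.linalg.activations.impl.ActivationSigmoid;

    // Three ways to request a sigmoid gate activation:
    new GravesBidirectionalLSTM.Builder().gateActivationFunction("sigmoid");               // by name (String)
    new GravesBidirectionalLSTM.Builder().gateActivationFunction(Activation.SIGMOID);      // Activation enum
    new GravesBidirectionalLSTM.Builder().gateActivationFunction(new ActivationSigmoid()); // IActivation instance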
helperAllowFallback

public GravesBidirectionalLSTM.Builder helperAllowFallback(boolean allowFallback)

When using a helper (CuDNN, or MKLDNN in some cases) and an error is encountered, should fallback to the non-helper implementation be allowed? If set to false, an exception in the helper will be propagated back to the user. If false, the built-in (non-helper) implementation for GravesBidirectionalLSTM will be used.

Parameters:
    allowFallback - Whether fallback to the non-helper implementation should be used
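A short sketch, reusing the imports from the sketch at the top of this page, for the case where helper errors should fail fast rather than fall back silently:

    // Propagate CuDNN/MKLDNN helper errors to the caller instead of
    // silently using the built-in implementation.
    new GravesBidirectionalLSTM.Builder().helperAllowFallback(false);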
build

public GravesBidirectionalLSTM build()

Specified by:
    build in class Layer.Builder<GravesBidirectionalLSTM.Builder>