public static class GravesBidirectionalLSTM.Builder extends BaseRecurrentLayer.Builder<GravesBidirectionalLSTM.Builder>
Modifier and Type | Field and Description |
---|---|
`protected boolean` | `helperAllowFallback`: When using CuDNN and an error is encountered, should fallback to the non-CuDNN implementation be allowed? If set to false, an exception in CuDNN will be propagated back to the user. |

Fields inherited from class `BaseRecurrentLayer.Builder`: `inputWeightConstraints`, `recurrentConstraints`, `rnnDataFormat`, `weightInitFnRecurrent`

Fields inherited from class `FeedForwardLayer.Builder`: `nIn`, `nOut`

Fields inherited from class `BaseLayer.Builder`: `activationFn`, `biasInit`, `biasUpdater`, `gainInit`, `gradientNormalization`, `gradientNormalizationThreshold`, `iupdater`, `regularization`, `regularizationBias`, `weightInitFn`, `weightNoise`

Fields inherited from class `Layer.Builder`: `allParamConstraints`, `biasConstraints`, `iDropout`, `layerName`, `weightConstraints`
Constructor and Description |
---|
`Builder()` |
Modifier and Type | Method and Description |
---|---|
`GravesBidirectionalLSTM` | `build()` |
`GravesBidirectionalLSTM.Builder` | `forgetGateBiasInit(double biasInit)`: Set forget gate bias initialization. |
`GravesBidirectionalLSTM.Builder` | `gateActivationFunction(Activation gateActivationFn)`: Activation function for the LSTM gates. |
`GravesBidirectionalLSTM.Builder` | `gateActivationFunction(IActivation gateActivationFn)`: Activation function for the LSTM gates. |
`GravesBidirectionalLSTM.Builder` | `gateActivationFunction(String gateActivationFn)`: Activation function for the LSTM gates. |
`GravesBidirectionalLSTM.Builder` | `helperAllowFallback(boolean allowFallback)`: When using a helper (CuDNN or MKLDNN in some cases) and an error is encountered, should fallback to the non-helper implementation be allowed? If set to false, an exception in the helper will be propagated back to the user. |

Methods inherited from class `BaseRecurrentLayer.Builder`: `constrainInputWeights`, `constrainRecurrent`, `dataFormat`, `weightInitRecurrent`

Methods inherited from class `FeedForwardLayer.Builder`: `nIn`, `nOut`, `units`

Methods inherited from class `BaseLayer.Builder`: `activation`, `biasInit`, `biasUpdater`, `dist`, `gainInit`, `gradientNormalization`, `gradientNormalizationThreshold`, `l1`, `l1Bias`, `l2`, `l2Bias`, `regularization`, `regularizationBias`, `updater`, `weightDecay`, `weightDecayBias`, `weightInit`, `weightNoise`

Methods inherited from class `Layer.Builder`: `constrainAllParameters`, `constrainBias`, `constrainWeights`, `dropOut`, `name`
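As an illustration of how these builder methods compose, here is a minimal configuration sketch. It assumes Deeplearning4j and ND4J are on the classpath; the layer sizes and activation choices are arbitrary examples, not recommendations from this documentation.

```java
import org.deeplearning4j.nn.conf.layers.GravesBidirectionalLSTM;
import org.nd4j.linalg.activations.Activation;

// Sketch: configure a bidirectional Graves LSTM layer via its Builder.
GravesBidirectionalLSTM layer = new GravesBidirectionalLSTM.Builder()
        .nIn(128)                                    // input size (inherited from FeedForwardLayer.Builder)
        .nOut(256)                                   // layer size (inherited)
        .activation(Activation.TANH)                 // cell output activation (inherited)
        .gateActivationFunction(Activation.SIGMOID)  // gate activation; one of the overloads above
        .forgetGateBiasInit(1.0)                     // initialize forget gate bias to 1.0
        .helperAllowFallback(true)                   // allow fallback if the CuDNN/MKLDNN helper fails
        .build();
```

Initializing the forget gate bias to a positive value such as 1.0 is a common LSTM default: it keeps the forget gate mostly open early in training, which helps gradients flow across time steps.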
`protected boolean helperAllowFallback`

`public GravesBidirectionalLSTM.Builder forgetGateBiasInit(double biasInit)`

Set forget gate bias initialization.

`public GravesBidirectionalLSTM.Builder gateActivationFunction(String gateActivationFn)`

Parameters: `gateActivationFn` - Activation function for the LSTM gates

`public GravesBidirectionalLSTM.Builder gateActivationFunction(Activation gateActivationFn)`

Parameters: `gateActivationFn` - Activation function for the LSTM gates

`public GravesBidirectionalLSTM.Builder gateActivationFunction(IActivation gateActivationFn)`

Parameters: `gateActivationFn` - Activation function for the LSTM gates

`public GravesBidirectionalLSTM.Builder helperAllowFallback(boolean allowFallback)`

Parameters: `allowFallback` - Whether fallback to non-helper implementation should be used

`public GravesBidirectionalLSTM build()`

Specified by: `build` in class `Layer.Builder<GravesBidirectionalLSTM.Builder>`
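To show where `build()` fits in a full model definition, here is a hedged sketch that embeds the layer in a `MultiLayerConfiguration`. The surrounding `NeuralNetConfiguration` and `RnnOutputLayer` classes are standard Deeplearning4j API not documented on this page, and the sizes and loss function are illustrative only.

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.GravesBidirectionalLSTM;
import org.deeplearning4j.nn.conf.layers.RnnOutputLayer;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

// Sketch: the built layer is typically passed into a network configuration.
MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
        .list()
        .layer(new GravesBidirectionalLSTM.Builder()
                .nIn(10).nOut(20)
                .gateActivationFunction("sigmoid")   // String overload of gateActivationFunction
                .helperAllowFallback(false)          // propagate CuDNN/MKLDNN helper errors instead of falling back
                .build())
        .layer(new RnnOutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                .nIn(20).nOut(5)
                .activation(Activation.SOFTMAX)
                .build())
        .build();
```

Setting `helperAllowFallback(false)` here means a CuDNN failure surfaces as an exception rather than silently running the slower built-in implementation, which can be useful when debugging GPU setups.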
Copyright © 2022. All rights reserved.