public abstract static class AbstractLSTM.Builder<T extends AbstractLSTM.Builder<T>> extends BaseRecurrentLayer.Builder<T>
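The declaration's recursive bound `T extends AbstractLSTM.Builder<T>` is the self-typed ("curiously recurring") generic builder pattern: setters defined on the abstract base return the concrete subclass type, so fluent chaining never loses access to subclass methods. A minimal self-contained sketch of the pattern (illustrative class names, not the actual DL4J classes):

```java
// Sketch of the self-typed builder pattern used by
// AbstractLSTM.Builder<T extends AbstractLSTM.Builder<T>>.
// Names are hypothetical; only the generics technique mirrors the real API.
abstract class BaseBuilderSketch<T extends BaseBuilderSketch<T>> {
    double forgetGateBiasInit = 1.0;

    @SuppressWarnings("unchecked")
    T forgetGateBiasInit(double biasInit) {
        this.forgetGateBiasInit = biasInit;
        return (T) this; // cast to the concrete builder type for fluent chaining
    }
}

class LstmBuilderSketch extends BaseBuilderSketch<LstmBuilderSketch> {
    String gateActivationFn = "sigmoid";

    LstmBuilderSketch gateActivationFunction(String fn) {
        this.gateActivationFn = fn;
        return this;
    }
}
```

A chain such as `new LstmBuilderSketch().forgetGateBiasInit(5.0).gateActivationFunction("hardsigmoid")` compiles only because `forgetGateBiasInit` returns the subclass type `T`, not the base type.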
Modifier and Type | Field and Description |
---|---|
protected double | forgetGateBiasInit |
protected org.nd4j.linalg.activations.IActivation | gateActivationFn |

Fields inherited from class BaseRecurrentLayer.Builder: nIn, nOut

Fields inherited from class FeedForwardLayer.Builder: activationFn, adamMeanDecay, adamVarDecay, biasInit, biasLearningRate, dist, epsilon, gradientNormalization, gradientNormalizationThreshold, iupdater, l1, l1Bias, l2, l2Bias, learningRate, learningRatePolicy, learningRateSchedule, momentum, momentumAfter, rho, rmsDecay, updater, weightInit

Fields inherited from class Layer.Builder: dropOut, layerName
Constructor and Description |
---|
Builder() |
Modifier and Type | Method and Description |
---|---|
T | forgetGateBiasInit(double biasInit): Set forget gate bias initialization. |
T | gateActivationFunction(org.nd4j.linalg.activations.Activation gateActivationFn): Activation function for the LSTM gates. |
T | gateActivationFunction(org.nd4j.linalg.activations.IActivation gateActivationFn): Activation function for the LSTM gates. |
T | gateActivationFunction(String gateActivationFn): Activation function for the LSTM gates. |
Methods inherited from class BaseRecurrentLayer.Builder: nIn, nOut

Methods inherited from class FeedForwardLayer.Builder: activation, activation, activation, adamMeanDecay, adamVarDecay, biasInit, biasLearningRate, dist, epsilon, gradientNormalization, gradientNormalizationThreshold, l1, l1Bias, l2, l2Bias, learningRate, learningRateDecayPolicy, learningRateSchedule, momentum, momentumAfter, rho, rmsDecay, updater, updater, weightInit

Methods inherited from class Layer.Builder: build, dropOut, name
protected double forgetGateBiasInit
protected org.nd4j.linalg.activations.IActivation gateActivationFn
public T forgetGateBiasInit(double biasInit)

Set forget gate bias initialization.

public T gateActivationFunction(String gateActivationFn)

Activation function for the LSTM gates.

Parameters: gateActivationFn - Activation function for the LSTM gates

public T gateActivationFunction(org.nd4j.linalg.activations.Activation gateActivationFn)

Activation function for the LSTM gates.

Parameters: gateActivationFn - Activation function for the LSTM gates

public T gateActivationFunction(org.nd4j.linalg.activations.IActivation gateActivationFn)

Activation function for the LSTM gates.

Parameters: gateActivationFn - Activation function for the LSTM gates
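The three `gateActivationFunction` overloads accept a name, an `Activation` enum constant, or an `IActivation` instance. A plausible self-contained sketch of how such overloads can funnel into one canonical setter (the `*Sketch` types below are hypothetical stand-ins, not the real nd4j classes, and the delegation shown is an assumption about the implementation):

```java
// Hypothetical stand-in for org.nd4j.linalg.activations.IActivation.
interface IActivationSketch {
}

// Hypothetical stand-in for the Activation enum: each constant can
// produce an IActivationSketch instance.
enum ActivationSketch {
    SIGMOID, TANH, HARDSIGMOID;

    IActivationSketch getActivationFunction() {
        return new IActivationSketch() {};
    }
}

class GateBuilderSketch {
    IActivationSketch gateActivationFn;

    // String overload: resolve the name to the enum constant, then delegate.
    GateBuilderSketch gateActivationFunction(String name) {
        return gateActivationFunction(ActivationSketch.valueOf(name.toUpperCase()));
    }

    // Enum overload: unwrap to the interface type, then delegate.
    GateBuilderSketch gateActivationFunction(ActivationSketch a) {
        return gateActivationFunction(a.getActivationFunction());
    }

    // Canonical setter: every overload ends up here.
    GateBuilderSketch gateActivationFunction(IActivationSketch fn) {
        this.gateActivationFn = fn;
        return this;
    }
}
```

Centralizing state in the `IActivation`-typed setter keeps the overloads purely convenience wrappers, so all three calls leave the builder in the same state.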