public abstract static class BaseLayer.Builder<T extends BaseLayer.Builder<T>> extends Layer.Builder<T>
Field Summary

Modifier and Type | Field and Description
---|---
protected IActivation | activationFn - Activation function for the layer.
protected double | biasInit - Bias initialization value, for layers with biases.
protected IUpdater | biasUpdater - Gradient updater configuration, for the biases only.
protected double | gainInit - Gain initialization value, for layers with Layer Normalization.
protected GradientNormalization | gradientNormalization - Gradient normalization strategy.
protected double | gradientNormalizationThreshold - Threshold for gradient normalization. Used only for GradientNormalization.ClipL2PerLayer, GradientNormalization.ClipL2PerParamType, and GradientNormalization.ClipElementWiseAbsoluteValue; not used otherwise. L2 threshold for the first two types of clipping, absolute value threshold for the last.
protected IUpdater | iupdater - Gradient updater.
protected List<Regularization> | regularization - Regularization for the parameters (excluding biases).
protected List<Regularization> | regularizationBias - Regularization for the bias parameters only.
protected IWeightInit | weightInitFn - Weight initialization scheme to use for initial weight values.
protected IWeightNoise | weightNoise - Weight noise (such as DropConnect and WeightNoise) for this layer.

Fields inherited from class Layer.Builder:
allParamConstraints, biasConstraints, iDropout, layerName, weightConstraints
Constructor Summary

Constructor and Description
---
Builder()
Method Summary

Modifier and Type | Method and Description
---|---
T | activation(Activation activation) - Set the activation function for the layer, from an Activation enumeration value.
T | activation(IActivation activationFunction) - Set the activation function for the layer.
T | biasInit(double biasInit) - Bias initialization value, for layers with biases.
T | biasUpdater(IUpdater biasUpdater) - Gradient updater configuration, for the biases only.
T | dist(Distribution dist) - Deprecated.
T | gainInit(double gainInit) - Gain initialization value, for layers with Layer Normalization.
T | gradientNormalization(GradientNormalization gradientNormalization) - Gradient normalization strategy.
T | gradientNormalizationThreshold(double threshold) - Threshold for gradient normalization. Used only for GradientNormalization.ClipL2PerLayer, GradientNormalization.ClipL2PerParamType, and GradientNormalization.ClipElementWiseAbsoluteValue; not used otherwise. L2 threshold for the first two types of clipping, absolute value threshold for the last.
T | l1(double l1) - L1 regularization coefficient (weights only).
T | l1Bias(double l1Bias) - L1 regularization coefficient for the bias.
T | l2(double l2) - L2 regularization coefficient (weights only).
T | l2Bias(double l2Bias) - L2 regularization coefficient for the bias.
BaseLayer.Builder | regularization(List<Regularization> regularization) - Set the regularization for the parameters (excluding biases), for example WeightDecay.
BaseLayer.Builder | regularizationBias(List<Regularization> regularizationBias) - Set the regularization for the biases only, for example WeightDecay.
T | updater(IUpdater updater) - Gradient updater.
T | updater(Updater updater) - Deprecated.
BaseLayer.Builder | weightDecay(double coefficient) - Add weight decay regularization for the network parameters (excluding biases); the learning rate is multiplied in. See WeightDecay for more details.
BaseLayer.Builder | weightDecay(double coefficient, boolean applyLR) - Add weight decay regularization for the network parameters (excluding biases).
BaseLayer.Builder | weightDecayBias(double coefficient) - Weight decay for the biases only; see weightDecay(double) for more details.
BaseLayer.Builder | weightDecayBias(double coefficient, boolean applyLR) - Weight decay for the biases only; see weightDecay(double) for more details.
T | weightInit(Distribution distribution) - Set weight initialization scheme to random sampling via the specified distribution.
T | weightInit(IWeightInit weightInit) - Weight initialization scheme to use for initial weight values.
T | weightInit(WeightInit weightInit) - Weight initialization scheme to use for initial weight values.
T | weightNoise(IWeightNoise weightNoise) - Set the weight noise (such as DropConnect and WeightNoise) for this layer.

Methods inherited from class Layer.Builder:
build, constrainAllParameters, constrainBias, constrainWeights, dropOut, dropOut, name
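
For orientation, here is a minimal sketch of typical builder usage. DenseLayer is used as an assumed concrete subclass of BaseLayer; nIn/nOut come from its own builder rather than from this class, and all values are illustrative.

```java
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.weights.WeightInit;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.learning.config.Adam;

// Most setters below are inherited from BaseLayer.Builder.
DenseLayer layer = new DenseLayer.Builder()
        .nIn(784).nOut(256)             // layer size (DenseLayer-specific)
        .activation(Activation.RELU)    // activation(Activation)
        .weightInit(WeightInit.XAVIER)  // weightInit(WeightInit)
        .l2(1e-4)                       // l2(double): weights only
        .updater(new Adam(1e-3))        // updater(IUpdater)
        .build();
```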
Field Detail

protected IActivation activationFn
Activation function for the layer. Can be any IActivation instance, including custom instances.

protected IWeightInit weightInitFn
Weight initialization scheme to use for initial weight values. See also: IWeightInit.

protected double biasInit
Bias initialization value, for layers with biases.

protected double gainInit
Gain initialization value, for layers with Layer Normalization.

protected List<Regularization> regularization
Regularization for the parameters (excluding biases).

protected List<Regularization> regularizationBias
Regularization for the bias parameters only.

protected IUpdater iupdater
Gradient updater.

protected IUpdater biasUpdater
Gradient updater configuration, for the biases only. See also: updater(IUpdater).

protected GradientNormalization gradientNormalization
Gradient normalization strategy. See also: GradientNormalization.

protected double gradientNormalizationThreshold
Threshold for gradient normalization. Used only for GradientNormalization.ClipL2PerLayer, GradientNormalization.ClipL2PerParamType, and GradientNormalization.ClipElementWiseAbsoluteValue; not used otherwise. L2 threshold for the first two types of clipping, absolute value threshold for the last.

protected IWeightNoise weightNoise
Weight noise (such as DropConnect and WeightNoise) for this layer.

Method Detail

public T activation(IActivation activationFunction)
Set the activation function for the layer. This overload can be used for custom IActivation instances.
Parameters:
activationFunction - Activation function to use for the layer

public T activation(Activation activation)
Set the activation function for the layer, from an Activation enumeration value.
Parameters:
activation - Activation function to use for the layer
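
As a hedged illustration of the two overloads: ActivationLReLU and its alpha constructor are assumptions from the ND4J activations package, not documented on this page.

```java
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.activations.impl.ActivationLReLU;

// Enumeration overload: a standard activation function.
DenseLayer.Builder b1 = new DenseLayer.Builder().activation(Activation.TANH);

// IActivation overload: a custom or parameterized instance,
// here leaky ReLU with alpha = 0.1 (assumed constructor).
DenseLayer.Builder b2 = new DenseLayer.Builder().activation(new ActivationLReLU(0.1));
```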

public T weightInit(IWeightInit weightInit)
Weight initialization scheme to use for initial weight values. See also: IWeightInit.

public T weightInit(WeightInit weightInit)
Weight initialization scheme to use for initial weight values. See also: WeightInit.

public T weightInit(Distribution distribution)
Set weight initialization scheme to random sampling via the specified distribution. Equivalent to: .weightInit(new WeightInitDistribution(distribution))
Parameters:
distribution - Distribution to use for weight initialization

public T biasInit(double biasInit)
Bias initialization value, for layers with biases.
Parameters:
biasInit - Value to use for initializing biases

public T gainInit(double gainInit)
Gain initialization value, for layers with Layer Normalization.
Parameters:
gainInit - Value to use for initializing gain

@Deprecated
public T dist(Distribution dist)
Deprecated. Use weightInit(Distribution) instead, i.e. .weightInit(new WeightInitDistribution(distribution))
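
A small sketch of distribution-based initialization; NormalDistribution is assumed from org.deeplearning4j.nn.conf.distribution, and the mean and standard deviation are illustrative.

```java
import org.deeplearning4j.nn.conf.distribution.NormalDistribution;
import org.deeplearning4j.nn.conf.layers.DenseLayer;

// Sample initial weights from N(0, 0.01); per the docs this is equivalent to
// .weightInit(new WeightInitDistribution(new NormalDistribution(0.0, 0.01)))
DenseLayer.Builder b = new DenseLayer.Builder()
        .weightInit(new NormalDistribution(0.0, 0.01));
```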

public T l1(double l1)
L1 regularization coefficient (weights only). Use l1Bias(double) to configure the l1 regularization coefficient for the bias.

public T l2(double l2)
L2 regularization coefficient (weights only). Use l2Bias(double) to configure the l2 regularization coefficient for the bias.
Note: WeightDecay (set via weightDecay(double,boolean)) should be preferred to L2 regularization. See WeightDecay javadoc for further details.

public T l1Bias(double l1Bias)
L1 regularization coefficient for the bias. See also: l1(double).

public T l2Bias(double l2Bias)
L2 regularization coefficient for the bias. See also: l2(double).
Note: WeightDecay (set via weightDecayBias(double,boolean)) should be preferred to L2 regularization. See WeightDecay javadoc for further details.
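
For instance, a sketch combining the per-parameter-type coefficients; all values are illustrative, and note the docs above prefer weightDecay over l2.

```java
import org.deeplearning4j.nn.conf.layers.DenseLayer;

// Separate L1/L2 coefficients for weights and biases.
DenseLayer.Builder b = new DenseLayer.Builder()
        .l1(1e-5)       // weights only
        .l2(1e-4)       // weights only; weightDecay(...) is preferred
        .l1Bias(0.0)    // biases unregularized
        .l2Bias(0.0);
```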

public BaseLayer.Builder weightDecay(double coefficient)
Add weight decay regularization for the network parameters (excluding biases). This applies weight decay with the learning rate multiplied in; see WeightDecay for more details.
Parameters:
coefficient - Weight decay regularization coefficient
See also: weightDecay(double, boolean)

public BaseLayer.Builder weightDecay(double coefficient, boolean applyLR)
Add weight decay regularization for the network parameters (excluding biases); see WeightDecay for more details.
Parameters:
coefficient - Weight decay regularization coefficient
applyLR - Whether the learning rate should be multiplied in when performing weight decay updates. See WeightDecay for more details.

public BaseLayer.Builder weightDecayBias(double coefficient)
Weight decay for the biases only; see weightDecay(double) for more details. This applies weight decay with the learning rate multiplied in.
Parameters:
coefficient - Weight decay regularization coefficient
See also: weightDecayBias(double, boolean)

public BaseLayer.Builder weightDecayBias(double coefficient, boolean applyLR)
Weight decay for the biases only; see weightDecay(double) for more details.
Parameters:
coefficient - Weight decay regularization coefficient
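
A brief sketch of the weight-decay setters; coefficients are illustrative. Since these methods return BaseLayer.Builder rather than T, the calls are shown as separate statements rather than one typed chain.

```java
import org.deeplearning4j.nn.conf.layers.DenseLayer;

DenseLayer.Builder b = new DenseLayer.Builder();
b.weightDecay(1e-4, true);   // weights (excluding biases), learning rate multiplied in
b.weightDecayBias(1e-5);     // biases only
```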

public BaseLayer.Builder regularization(List<Regularization> regularization)
Set the regularization for the parameters (excluding biases), for example WeightDecay.
Parameters:
regularization - Regularization to apply for the network parameters/weights (excluding biases)

public BaseLayer.Builder regularizationBias(List<Regularization> regularizationBias)
Set the regularization for the biases only, for example WeightDecay.
Parameters:
regularizationBias - Regularization to apply for the network biases only

public T updater(IUpdater updater)
Gradient updater.

@Deprecated
public T updater(Updater updater)
Deprecated. See also: Updater.
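
A hedged sketch of supplying an explicit regularization list; WeightDecay here is assumed to be org.nd4j.linalg.learning.regularization.WeightDecay with a (coefficient, applyLR) constructor.

```java
import java.util.Arrays;
import java.util.List;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.nd4j.linalg.learning.regularization.Regularization;
import org.nd4j.linalg.learning.regularization.WeightDecay;

// Weight decay on the weights; biases left unregularized.
List<Regularization> reg = Arrays.asList(new WeightDecay(1e-4, true));
DenseLayer.Builder b = new DenseLayer.Builder();
b.regularization(reg);
```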
public T biasUpdater(IUpdater biasUpdater)
Gradient updater configuration, for the biases only. If not set, the updater configured via updater(IUpdater) is used.
Parameters:
biasUpdater - Updater to use for bias parameters

public T gradientNormalization(GradientNormalization gradientNormalization)
Gradient normalization strategy.
Parameters:
gradientNormalization - Type of normalization to use. Defaults to None.
See also: GradientNormalization

public T gradientNormalizationThreshold(double threshold)
Threshold for gradient normalization. Used only for GradientNormalization.ClipL2PerLayer, GradientNormalization.ClipL2PerParamType, and GradientNormalization.ClipElementWiseAbsoluteValue; not used otherwise. L2 threshold for the first two types of clipping, absolute value threshold for the last.
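
For example, a sketch of element-wise gradient clipping; the threshold value is illustrative.

```java
import org.deeplearning4j.nn.conf.GradientNormalization;
import org.deeplearning4j.nn.conf.layers.DenseLayer;

// Clip each gradient element to [-1.0, 1.0] during backprop.
DenseLayer.Builder b = new DenseLayer.Builder()
        .gradientNormalization(GradientNormalization.ClipElementWiseAbsoluteValue)
        .gradientNormalizationThreshold(1.0);
```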

public T weightNoise(IWeightNoise weightNoise)
Set the weight noise (such as DropConnect and WeightNoise) for this layer.
Parameters:
weightNoise - Weight noise instance to use
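
Finally, a hedged sketch using DropConnect, assumed from org.deeplearning4j.nn.conf.weightnoise; 0.5 is an illustrative weight retain probability.

```java
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.weightnoise.DropConnect;

// Apply DropConnect to this layer's weights during training.
DenseLayer.Builder b = new DenseLayer.Builder()
        .weightNoise(new DropConnect(0.5));
```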