public abstract static class AbstractSameDiffLayer.Builder<T extends AbstractSameDiffLayer.Builder<T>> extends Layer.Builder<T>
| Modifier and Type | Field and Description |
|---|---|
| protected IUpdater | biasUpdater: Gradient updater configuration, for the biases only. |
| protected List<Regularization> | regularization |
| protected List<Regularization> | regularizationBias |
| protected IUpdater | updater: Gradient updater. |

Fields inherited from class Layer.Builder: allParamConstraints, biasConstraints, iDropout, layerName, weightConstraints
| Constructor and Description |
|---|
| Builder() |
| Modifier and Type | Method and Description |
|---|---|
| T | biasUpdater(IUpdater biasUpdater): Gradient updater configuration, for the biases only. |
| T | l1(double l1): L1 regularization coefficient (weights only). |
| T | l1Bias(double l1Bias): L1 regularization coefficient for the bias. |
| T | l2(double l2): L2 regularization coefficient (weights only). |
| T | l2Bias(double l2Bias): L2 regularization coefficient for the bias. |
| AbstractSameDiffLayer.Builder | regularization(List<Regularization> regularization): Set the regularization for the parameters (excluding biases) - for example, WeightDecay. |
| AbstractSameDiffLayer.Builder | regularizationBias(List<Regularization> regularizationBias): Set the regularization for the biases only - for example, WeightDecay. |
| T | updater(IUpdater updater): Gradient updater. |
| AbstractSameDiffLayer.Builder | weightDecay(double coefficient): Add weight decay regularization for the network parameters (excluding biases). This applies weight decay with the learning rate multiplied in - see WeightDecay for more details. |
| AbstractSameDiffLayer.Builder | weightDecay(double coefficient, boolean applyLR): Add weight decay regularization for the network parameters (excluding biases). |
| AbstractSameDiffLayer.Builder | weightDecayBias(double coefficient): Weight decay for the biases only - see weightDecay(double) for more details. |
| AbstractSameDiffLayer.Builder | weightDecayBias(double coefficient, boolean applyLR): Weight decay for the biases only - see weightDecay(double) for more details. |

Methods inherited from class Layer.Builder: build, constrainAllParameters, constrainBias, constrainWeights, dropOut, dropOut, name
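Since AbstractSameDiffLayer.Builder is abstract, it is only used through the builder of a concrete SameDiff layer. The snippet below is a minimal, hypothetical sketch of how the methods summarized above are typically configured; MyCustomSameDiffLayer and its nOut(int) setter are assumptions made for illustration, while updater, biasUpdater, weightDecay, and build come from this class and its parents.

```java
import org.nd4j.linalg.learning.config.Adam;
import org.nd4j.linalg.learning.config.Sgd;

// MyCustomSameDiffLayer is a hypothetical concrete layer whose Builder
// extends AbstractSameDiffLayer.Builder; nOut(int) is assumed, not part of this class.
MyCustomSameDiffLayer.Builder builder = new MyCustomSameDiffLayer.Builder()
        .nOut(128)                   // layer-specific setting (assumed)
        .updater(new Adam(1e-3))     // gradient updater for the weights
        .biasUpdater(new Sgd(1e-2)); // separate updater for the biases

builder.weightDecay(1e-4);           // weight decay on the weights (excluding biases);
                                     // note this method returns AbstractSameDiffLayer.Builder

MyCustomSameDiffLayer layer = builder.build();
```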
protected List<Regularization> regularization

protected List<Regularization> regularizationBias

protected IUpdater updater
Gradient updater.

protected IUpdater biasUpdater
Gradient updater configuration, for the biases only. If not set, biases will use the updater as set by updater(IUpdater).
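These protected fields are populated by the fluent setters below and read by the concrete layer when it is built. The following is a minimal, hypothetical sketch of a custom builder following the usual DL4J custom-layer pattern; the enclosing MyDenseSameDiffLayer class (a SameDiffLayer subclass) and its nIn/nOut settings are assumptions made for illustration, not part of this API.

```java
// Hypothetical builder for a custom SameDiff layer; it inherits the
// regularization, regularizationBias, updater and biasUpdater fields above.
public static class Builder extends AbstractSameDiffLayer.Builder<Builder> {
    private int nIn;
    private int nOut;

    public Builder nIn(int nIn) { this.nIn = nIn; return this; }
    public Builder nOut(int nOut) { this.nOut = nOut; return this; }

    @Override
    @SuppressWarnings("unchecked")
    public MyDenseSameDiffLayer build() {
        // The layer constructor copies the inherited regularization/updater
        // settings from this builder (MyDenseSameDiffLayer is hypothetical,
        // its definition is omitted here).
        return new MyDenseSameDiffLayer(this);
    }
}
```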
public T l1(double l1)
L1 regularization coefficient (weights only). Use l1Bias(double) to configure the l1 regularization coefficient for the bias.

public T l2(double l2)
L2 regularization coefficient (weights only). Use l2Bias(double) to configure the l2 regularization coefficient for the bias.
Note: WeightDecay (set via weightDecay(double, boolean)) should be preferred to L2 regularization. See the WeightDecay javadoc for further details.

public T l1Bias(double l1Bias)
L1 regularization coefficient for the bias.
See also: l1(double)

public T l2Bias(double l2Bias)
L2 regularization coefficient for the bias.
Note: WeightDecay (set via weightDecayBias(double, boolean)) should be preferred to L2 regularization. See the WeightDecay javadoc for further details.
See also: l2(double)

public AbstractSameDiffLayer.Builder weightDecay(double coefficient)
Add weight decay regularization for the network parameters (excluding biases). This applies weight decay with the learning rate multiplied in - see WeightDecay for more details.
Parameters: coefficient - Weight decay regularization coefficient
See also: weightDecay(double, boolean)
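As the notes on l2(double) and l2Bias(double) above state, weight decay is generally preferred over a plain L2 penalty. A hedged sketch of the two options side by side; "builder" stands for an instance of any concrete subclass of this builder (a hypothetical variable), and only one of the two approaches would normally be used.

```java
// Option 1: classic L2 penalty on the weights
builder.l2(1e-4);

// Option 2 (preferred per the notes above): weight decay on the weights,
// with the learning rate multiplied in, i.e. equivalent to weightDecay(1e-4, true)
builder.weightDecay(1e-4);
```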
public AbstractSameDiffLayer.Builder weightDecay(double coefficient, boolean applyLR)
Add weight decay regularization for the network parameters (excluding biases) - see WeightDecay for more details.
Parameters:
coefficient - Weight decay regularization coefficient
applyLR - Whether the learning rate should be multiplied in when performing weight decay updates. See WeightDecay for more details.
See also: weightDecay(double, boolean)
public AbstractSameDiffLayer.Builder weightDecayBias(double coefficient)
Weight decay for the biases only - see weightDecay(double) for more details. This applies weight decay with the learning rate multiplied in.
Parameters: coefficient - Weight decay regularization coefficient
See also: weightDecayBias(double, boolean)
public AbstractSameDiffLayer.Builder weightDecayBias(double coefficient, boolean applyLR)
Weight decay for the biases only - see weightDecay(double) for more details.
Parameters:
coefficient - Weight decay regularization coefficient
applyLR - Whether the learning rate should be multiplied in when performing weight decay updates. See WeightDecay for more details.

public AbstractSameDiffLayer.Builder regularization(List<Regularization> regularization)
Set the regularization for the parameters (excluding biases) - for example, WeightDecay.
Parameters: regularization - Regularization to apply for the network parameters/weights (excluding biases)

public AbstractSameDiffLayer.Builder regularizationBias(List<Regularization> regularizationBias)
Set the regularization for the biases only - for example, WeightDecay.
Parameters: regularizationBias - Regularization to apply for the network biases only

public T biasUpdater(IUpdater biasUpdater)
Gradient updater configuration, for the biases only. If not set, biases will use the updater as set by updater(IUpdater).
Parameters: biasUpdater - Updater to use for bias parameters
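The list-based setters accept Regularization instances directly, which is useful when combining several regularizers or configuring weights and biases differently. A minimal sketch, assuming a concrete builder instance named "builder" (hypothetical) and the ND4J WeightDecay and L1Regularization classes:

```java
import java.util.Arrays;
import org.nd4j.linalg.learning.config.Adam;
import org.nd4j.linalg.learning.regularization.L1Regularization;
import org.nd4j.linalg.learning.regularization.Regularization;
import org.nd4j.linalg.learning.regularization.WeightDecay;

builder.updater(new Adam(1e-3));      // updater for the weights
builder.biasUpdater(new Adam(2e-3));  // separate updater for the biases
builder.regularization(Arrays.<Regularization>asList(
        new WeightDecay(1e-4, true),  // weight decay, learning rate multiplied in
        new L1Regularization(1e-5))); // plus an L1 penalty, both on the weights only
builder.regularizationBias(Arrays.<Regularization>asList(
        new WeightDecay(1e-5, true))); // lighter weight decay for the biases only
```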