public static class BatchNormalization.Builder extends FeedForwardLayer.Builder<BatchNormalization.Builder>
Modifier and Type | Field and Description |
---|---|
protected double | beta: Used only when 'true' is passed to lockGammaBeta(boolean). |
protected List<LayerConstraint> | betaConstraints: Constraints to be applied to the beta parameter of this batch normalization layer. |
protected boolean | cudnnAllowFallback: When using CuDNN and an error is encountered, should fallback to the non-CuDNN implementation be allowed? If set to false, an exception in CuDNN will be propagated back to the user. |
protected double | decay: At test time, a global estimate of the mean and variance can be used, calculated using a moving average of the batch means/variances. |
protected double | eps: Epsilon value for batch normalization; a small floating-point value added to the variance (algorithm 1 in http://arxiv.org/pdf/1502.03167v3.pdf) to reduce/avoid underflow issues. Default: 1e-5 |
protected double | gamma: Used only when 'true' is passed to lockGammaBeta(boolean). |
protected List<LayerConstraint> | gammaConstraints: Constraints to be applied to the gamma parameter of this batch normalization layer. |
protected boolean | isMinibatch: Whether minibatch training is being performed. |
protected boolean | lockGammaBeta: If set to true, lock the gamma and beta parameters to the values for each activation, specified by gamma(double) and beta(double). |
protected boolean | useLogStd: How should the moving average of variance be stored? Two different parameterizations are supported. |
Fields inherited from class FeedForwardLayer.Builder: nIn, nOut

Fields inherited from class BaseLayer.Builder: activationFn, biasInit, biasUpdater, gainInit, gradientNormalization, gradientNormalizationThreshold, iupdater, regularization, regularizationBias, weightInitFn, weightNoise

Fields inherited from class Layer.Builder: allParamConstraints, biasConstraints, iDropout, layerName, weightConstraints
Constructor and Description |
---|
Builder() |
Builder(boolean lockGammaBeta) |
Builder(double decay, boolean isMinibatch) |
Builder(double gamma, double beta) |
Builder(double gamma, double beta, boolean lockGammaBeta) |
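For orientation, here is a minimal sketch of how this builder is typically used inside a DL4J network configuration. The surrounding configuration (layer sizes, activations, loss) is illustrative only; the BatchNormalization.Builder calls are the ones documented on this page.

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.BatchNormalization;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class BatchNormConfigSketch {
    public static void main(String[] args) {
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .list()
                // Dense layer whose activations will be normalized
                .layer(new DenseLayer.Builder().nIn(784).nOut(256)
                        .activation(Activation.RELU).build())
                // Batch normalization layer, configured via this builder
                .layer(new BatchNormalization.Builder()
                        .eps(1e-5)        // documented default
                        .minibatch(true)  // use minibatch statistics during training
                        .build())
                .layer(new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .nIn(256).nOut(10).activation(Activation.SOFTMAX).build())
                .build();
        System.out.println(conf.toJson());
    }
}
```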
Modifier and Type | Method and Description |
---|---|
BatchNormalization.Builder | beta(double beta): Used only when 'true' is passed to lockGammaBeta(boolean). |
BatchNormalization | build() |
BatchNormalization.Builder | constrainBeta(LayerConstraint... constraints): Set constraints to be applied to the beta parameter of this batch normalization layer. |
BatchNormalization.Builder | constrainGamma(LayerConstraint... constraints): Set constraints to be applied to the gamma parameter of this batch normalization layer. |
BatchNormalization.Builder | cudnnAllowFallback(boolean allowFallback): When using CuDNN and an error is encountered, should fallback to the non-CuDNN implementation be allowed? If set to false, an exception in CuDNN will be propagated back to the user. |
BatchNormalization.Builder | decay(double decay): At test time, a global estimate of the mean and variance can be used, calculated using a moving average of the batch means/variances. |
BatchNormalization.Builder | eps(double eps): Epsilon value for batch normalization; a small floating-point value added to the variance (algorithm 1 in http://arxiv.org/pdf/1502.03167v3.pdf) to reduce/avoid underflow issues. Default: 1e-5 |
BatchNormalization.Builder | gamma(double gamma): Used only when 'true' is passed to lockGammaBeta(boolean). |
BatchNormalization.Builder | lockGammaBeta(boolean lockGammaBeta): If set to true, lock the gamma and beta parameters to the values for each activation, specified by gamma(double) and beta(double) (see the example following this summary). |
BatchNormalization.Builder | minibatch(boolean minibatch): Whether minibatch training is being performed. |
BatchNormalization.Builder | useLogStd(boolean useLogStd): How should the moving average of variance be stored? Two different parameterizations are supported. |
Methods inherited from class FeedForwardLayer.Builder: nIn, nOut, units

Methods inherited from class BaseLayer.Builder: activation, biasInit, biasUpdater, dist, gainInit, gradientNormalization, gradientNormalizationThreshold, l1, l1Bias, l2, l2Bias, regularization, regularizationBias, updater, weightDecay, weightDecayBias, weightInit, weightNoise

Methods inherited from class Layer.Builder: constrainAllParameters, constrainBias, constrainWeights, dropOut, name
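The interaction between lockGammaBeta(boolean), gamma(double), and beta(double) noted above is easy to miss: the fixed values only take effect when locking is enabled. A minimal sketch, using only the builder methods documented on this page:

```java
import org.deeplearning4j.nn.conf.layers.BatchNormalization;

public class LockedGammaBetaSketch {
    public static void main(String[] args) {
        // gamma/beta are used only because lockGammaBeta(true) is set;
        // with the default lockGammaBeta(false) these values would be ignored
        // and gamma/beta would instead be learned during training.
        BatchNormalization locked = new BatchNormalization.Builder()
                .lockGammaBeta(true)
                .gamma(1.0)   // fixed scale for all activations
                .beta(0.0)    // fixed shift for all activations
                .build();
        System.out.println(locked);
    }
}
```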
protected double decay

At test time, a global estimate of the mean and variance can be used, calculated using a moving average of the batch means/variances. See the arithmetic sketch after this section.

protected double eps

Epsilon value for batch normalization; a small floating-point value added to the variance (algorithm 1 in http://arxiv.org/pdf/1502.03167v3.pdf) to reduce/avoid underflow issues. Default: 1e-5. See the arithmetic sketch after this section.

protected boolean isMinibatch

Whether minibatch training is being performed.

protected boolean lockGammaBeta

If set to true: lock the gamma and beta parameters to the values for each activation, specified by gamma(double) and beta(double). Default: false -> learn gamma and beta parameter values during network training.

protected double gamma

Used only when 'true' is passed to lockGammaBeta(boolean). Value is not used otherwise.

protected double beta

Used only when 'true' is passed to lockGammaBeta(boolean). Value is not used otherwise.

protected List<LayerConstraint> betaConstraints

Constraints to be applied to the beta parameter of this batch normalization layer.

protected List<LayerConstraint> gammaConstraints

Constraints to be applied to the gamma parameter of this batch normalization layer.

protected boolean cudnnAllowFallback

When using CuDNN and an error is encountered, should fallback to the non-CuDNN implementation be allowed? If set to false, an exception in CuDNN will be propagated back to the user.

protected boolean useLogStd

How should the moving average of variance be stored? Two different parameterizations are supported.
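To make the roles of decay and eps concrete, the following is a plain arithmetic sketch of an exponential moving-average update and of the normalization step from the referenced paper. The moving-average form is an assumption consistent with the description above, and all variable names are illustrative; none of them are DL4J internals.

```java
public class BatchNormArithmetic {
    public static void main(String[] args) {
        double decay = 0.9;   // illustrative decay value
        double eps = 1e-5;    // documented default epsilon

        // Moving average of the global statistics, updated once per batch;
        // these global estimates are what the layer uses at test time.
        double globalMean = 0.0, globalVar = 1.0;
        double batchMean = 0.12, batchVar = 0.98;  // stats of the current minibatch
        globalMean = decay * globalMean + (1.0 - decay) * batchMean;
        globalVar = decay * globalVar + (1.0 - decay) * batchVar;

        // Normalization of a single activation x (algorithm 1 of the paper);
        // eps keeps the denominator away from zero / underflow.
        double x = 0.5, gamma = 1.0, beta = 0.0;
        double xHat = (x - batchMean) / Math.sqrt(batchVar + eps);
        double y = gamma * xHat + beta;
        System.out.println(y);
    }
}
```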
public Builder(double decay, boolean isMinibatch)
public Builder(double gamma, double beta)
public Builder(double gamma, double beta, boolean lockGammaBeta)
public Builder(boolean lockGammaBeta)
public Builder()
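The overloaded constructors appear to be shortcuts for the corresponding fluent setters; assuming that, the two builders below produce equivalently configured layers.

```java
import org.deeplearning4j.nn.conf.layers.BatchNormalization;

public class ConstructorShortcutSketch {
    public static void main(String[] args) {
        // Via the (decay, isMinibatch) constructor...
        BatchNormalization viaCtor = new BatchNormalization.Builder(0.95, true).build();
        // ...and via the fluent setters documented below.
        BatchNormalization viaSetters = new BatchNormalization.Builder()
                .decay(0.95)
                .minibatch(true)
                .build();
        System.out.println(viaCtor);
        System.out.println(viaSetters);
    }
}
```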
public BatchNormalization.Builder minibatch(boolean minibatch)

Parameters:
minibatch - Minibatch parameter

public BatchNormalization.Builder gamma(double gamma)

Used only when 'true' is passed to lockGammaBeta(boolean). Value is not used otherwise.

Parameters:
gamma - Gamma parameter for all activations, used only with locked gamma/beta configuration mode

public BatchNormalization.Builder beta(double beta)

Used only when 'true' is passed to lockGammaBeta(boolean). Value is not used otherwise.

Parameters:
beta - Beta parameter for all activations, used only with locked gamma/beta configuration mode

public BatchNormalization.Builder eps(double eps)

Parameters:
eps - Epsilon value to use

public BatchNormalization.Builder decay(double decay)

Parameters:
decay - Decay value to use for global stats calculation

public BatchNormalization.Builder lockGammaBeta(boolean lockGammaBeta)

If set to true: lock the gamma and beta parameters to the values for each activation, specified by gamma(double) and beta(double). Default: false -> learn gamma and beta parameter values during network training.

Parameters:
lockGammaBeta - If true: use fixed beta/gamma values; if false: learn them during training

public BatchNormalization.Builder constrainBeta(LayerConstraint... constraints)

Set constraints to be applied to the beta parameter of this batch normalization layer.

Parameters:
constraints - Constraints to apply to the beta parameter of this layer

public BatchNormalization.Builder constrainGamma(LayerConstraint... constraints)

Set constraints to be applied to the gamma parameter of this batch normalization layer.

Parameters:
constraints - Constraints to apply to the gamma parameter of this layer

public BatchNormalization.Builder cudnnAllowFallback(boolean allowFallback)

When using CuDNN and an error is encountered, should fallback to the non-CuDNN implementation be allowed? If set to false, an exception in CuDNN will be propagated back to the user.

Parameters:
allowFallback - Whether fallback to the non-CuDNN implementation should be allowed

public BatchNormalization.Builder useLogStd(boolean useLogStd)

How should the moving average of variance be stored? Two different parameterizations are supported. See the sketch after this section.

public BatchNormalization build()

Specified by:
build in class Layer.Builder<BatchNormalization.Builder>
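On useLogStd: the documentation above only states that two parameterizations of the variance moving average exist. My understanding, an assumption worth verifying against the DL4J source, is that one parameterization stores the variance directly while the other stores log10 of the standard deviation, which is numerically safer for very small variances. A sketch of the conversion under that assumption:

```java
public class LogStdSketch {
    public static void main(String[] args) {
        // Assumption: useLogStd(true) stores log10(stdev) instead of the raw variance.
        double variance = 1e-12;  // tiny variance, where the log form is more robust

        double stored = Math.log10(Math.sqrt(variance));          // forward: variance -> log10(stdev)
        double recovered = Math.pow(Math.pow(10.0, stored), 2.0); // inverse: log10(stdev) -> variance

        System.out.println(stored);     // -6.0
        System.out.println(recovered);  // ~1e-12
    }
}
```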