Package | Description
---|---
org.deeplearning4j.nn.conf.layers |
Modifier and Type | Method and Description
---|---
BatchNormalization.Builder | BatchNormalization.Builder.beta(double beta): Used only when 'true' is passed to lockGammaBeta(boolean).
BatchNormalization.Builder | BatchNormalization.Builder.constrainBeta(LayerConstraint... constraints): Set constraints to be applied to the beta parameter of this batch normalisation layer.
BatchNormalization.Builder | BatchNormalization.Builder.constrainGamma(LayerConstraint... constraints): Set constraints to be applied to the gamma parameter of this batch normalisation layer.
BatchNormalization.Builder | BatchNormalization.Builder.decay(double decay): At test time, a global estimate of the mean and variance can be used, calculated via a moving average of the batch means/variances.
BatchNormalization.Builder | BatchNormalization.Builder.eps(double eps): Epsilon value for batch normalization; a small floating-point value added to the variance (algorithm 1 in http://arxiv.org/pdf/1502.03167v3.pdf) to reduce/avoid underflow issues. Default: 1e-5.
BatchNormalization.Builder | BatchNormalization.Builder.gamma(double gamma): Used only when 'true' is passed to lockGammaBeta(boolean).
BatchNormalization.Builder | BatchNormalization.Builder.lockGammaBeta(boolean lockGammaBeta): If set to true, lock the gamma and beta parameters for each activation to the values specified by gamma(double) and beta(double).
BatchNormalization.Builder | BatchNormalization.Builder.minibatch(boolean minibatch): Whether or not minibatch training is being performed.
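To illustrate what these builder parameters control, here is a minimal sketch of batch normalization (algorithm 1 in the paper linked above) in plain Python. This is not the DL4J implementation; the function and variable names are hypothetical, and it only shows the roles of eps (numerical stability), gamma/beta (scale and shift), and decay (moving average of batch statistics for test time).

```python
import math

def batch_norm(xs, gamma=1.0, beta=0.0, eps=1e-5):
    """Normalize a batch of scalars, then scale by gamma and shift by beta."""
    mean = sum(xs) / len(xs)
    var = sum((x - mean) ** 2 for x in xs) / len(xs)
    # eps is added to the variance to avoid dividing by a near-zero value.
    return [gamma * (x - mean) / math.sqrt(var + eps) + beta for x in xs]

def update_moving(running, batch_stat, decay=0.9):
    """Moving average of batch means/variances; the result is the
    global estimate used at test time (see decay(double) above)."""
    return decay * running + (1 - decay) * batch_stat
```

With lockGammaBeta(true), gamma and beta stay fixed at the values passed to gamma(double) and beta(double) instead of being learned; with the defaults gamma=1, beta=0, the output is simply the normalized batch.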
Copyright © 2018. All rights reserved.