public class BatchNormalization extends BaseLayer<BatchNormalization>

Nested classes/interfaces inherited from interface Layer: Layer.TrainingMode, Layer.Type

| Modifier and Type | Field and Description |
|---|---|
| protected int | helperCountFail |
| protected int | index |
| protected List<TrainingListener> | listeners |
| protected INDArray | std |
| protected INDArray | xHat |
| protected INDArray | xMu |

Fields inherited from class BaseLayer: gradient, gradientsFlattened, gradientViews, optimizer, params, paramsFlattened, score, solver, weightNoiseParams

Fields inherited from class AbstractLayer: cacheMode, conf, dropoutApplied, epochCount, input, inputModificationAllowed, iterationCount, maskArray, maskState, preOutput, trainingListeners

| Constructor and Description |
|---|
| BatchNormalization(NeuralNetConfiguration conf) |
| Modifier and Type | Method and Description |
|---|---|
| INDArray | activate(boolean training, LayerWorkspaceMgr workspaceMgr): Perform forward pass and return the activations array with the last set input. |
| Pair<Gradient,INDArray> | backpropGradient(INDArray epsilon, LayerWorkspaceMgr workspaceMgr): Calculate the gradient relative to the error in the next layer. |
| double | calcL1(boolean backpropParamsOnly): Calculate the L1 regularization term; 0.0 if regularization is not used. |
| double | calcL2(boolean backpropParamsOnly): Calculate the L2 regularization term; 0.0 if regularization is not used. |
| void | fit(INDArray input, LayerWorkspaceMgr workspaceMgr): Fit the model to the given data. |
| LayerHelper | getHelper() |
| int | getIndex(): Get the layer index. |
| Collection<TrainingListener> | getListeners(): Get the iteration listeners for this layer. |
| long[] | getShape(INDArray x) |
| Gradient | gradient(): Get the gradient. |
| boolean | isPretrainLayer(): Returns true if the layer can be trained in an unsupervised/pretrain manner (AE, VAE, etc.). |
| INDArray | preOutput(INDArray x, Layer.TrainingMode training, LayerWorkspaceMgr workspaceMgr) |
| void | setIndex(int index): Set the layer index. |
| void | setListeners(TrainingListener... listeners): Set the iteration listeners for this layer. |
| Layer.Type | type(): Returns the layer type. |
| boolean | updaterDivideByMinibatch(String paramName): DL4J layers typically produce the sum of the gradients during the backward pass for each layer, and if required (if minibatch=true) then divide by the minibatch size. However, there are some exceptions, such as the batch norm mean/variance estimate parameters: these "gradients" are actually not gradients, but are updates to be applied directly to the parameter vector. |
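As a usage sketch (not part of this class's Javadoc): this implementation layer is normally created indirectly, by adding the configuration class org.deeplearning4j.nn.conf.layers.BatchNormalization to a network configuration. The layer sizes, seed, and layer ordering below are illustrative assumptions.

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.inputs.InputType;
import org.deeplearning4j.nn.conf.layers.BatchNormalization;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class BatchNormUsageSketch {
    public static void main(String[] args) {
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .seed(42)
                .list()
                .layer(0, new DenseLayer.Builder().nOut(256)
                        .activation(Activation.RELU).build())
                // Normalizes the dense layer's activations; gamma/beta are learned,
                // while mean/variance are running estimates updated during training.
                .layer(1, new BatchNormalization.Builder().build())
                .layer(2, new OutputLayer.Builder(LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD)
                        .nOut(10).activation(Activation.SOFTMAX).build())
                .setInputType(InputType.feedForward(784)) // infers nIn for each layer
                .build();

        MultiLayerNetwork net = new MultiLayerNetwork(conf);
        net.init(); // instantiates the implementation layers, including this class
    }
}
```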
Methods inherited from class BaseLayer: clear, clearNoiseWeightParams, clone, computeGradientAndScore, fit, getGradientsViewArray, getOptimizer, getParam, getParamWithNoise, hasBias, layerConf, numParams, params, paramTable, paramTable, preOutput, score, setBackpropGradientsViewArray, setParam, setParams, setParams, setParamsViewArray, setParamTable, setScoreWithZ, toString, update, update

Methods inherited from class AbstractLayer: activate, addListeners, allowInputModification, applyConstraints, applyDropOutIfNecessary, applyMask, assertInputSet, backpropDropOutIfPresent, batchSize, conf, feedForwardMaskArray, getConfig, getEpochCount, getInput, getInputMiniBatchSize, getMaskArray, gradientAndScore, init, input, layerId, numParams, setCacheMode, setConf, setEpochCount, setInput, setInputMiniBatchSize, setListeners, setMaskArray

Methods inherited from class java.lang.Object: equals, finalize, getClass, hashCode, notify, notifyAll, wait, wait, wait

Methods inherited from interface Layer: getIterationCount, setIterationCount

protected int helperCountFail

protected int index

protected List<TrainingListener> listeners

protected INDArray std

protected INDArray xMu

protected INDArray xHat

public BatchNormalization(NeuralNetConfiguration conf)
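Users rarely call this constructor directly; the network builder instantiates implementation layers during init. Purely as a sketch of the signature, with the name collision between the configuration and implementation classes spelled out (sizes are illustrative assumptions):

```java
// The fully-qualified name below is the *configuration* class; the variable
// `bn` is the implementation class documented on this page.
org.deeplearning4j.nn.conf.layers.BatchNormalization layerConf =
        new org.deeplearning4j.nn.conf.layers.BatchNormalization.Builder()
                .nIn(256).nOut(256).build();

NeuralNetConfiguration conf = new NeuralNetConfiguration.Builder()
        .layer(layerConf)
        .build();

BatchNormalization bn = new BatchNormalization(conf); // params not yet initialized
```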
public double calcL2(boolean backpropParamsOnly)

Calculate the L2 regularization term. Returns 0.0 if regularization is not used.
- Specified by: calcL2 in interface Layer
- Overrides: calcL2 in class BaseLayer<BatchNormalization>
- Parameters: backpropParamsOnly - If true: calculate L2 based on backprop params only. If false: calculate based on all params (including pretrain params, if any)

public double calcL1(boolean backpropParamsOnly)

Calculate the L1 regularization term. Returns 0.0 if regularization is not used.
- Specified by: calcL1 in interface Layer
- Overrides: calcL1 in class BaseLayer<BatchNormalization>
- Parameters: backpropParamsOnly - If true: calculate L1 based on backprop params only. If false: calculate based on all params (including pretrain params, if any)
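Assuming the net from the first sketch, with l1/l2 regularization added to its configuration, the per-layer penalty terms can be read back directly (illustrative fragment; the layer index is an assumption):

```java
// Penalty contributed by this layer's backprop parameters (gamma/beta);
// both return 0.0 when no l1/l2 regularization is configured.
double l2 = net.getLayer(1).calcL2(true);
double l1 = net.getLayer(1).calcL1(true);
```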
public Layer.Type type()

Returns the layer type.
- Specified by: type in interface Layer
- Overrides: type in class AbstractLayer<BatchNormalization>

public Pair<Gradient,INDArray> backpropGradient(INDArray epsilon, LayerWorkspaceMgr workspaceMgr)

Calculate the gradient relative to the error in the next layer.
- Specified by: backpropGradient in interface Layer
- Overrides: backpropGradient in class BaseLayer<BatchNormalization>
- Parameters:
  - epsilon - w^(L+1)*delta^(L+1); or equivalently dC/da, i.e., (dC/dz)*(dz/da) = dC/da, where C is the cost function and a = sigma(z) is the activation
  - workspaceMgr - Workspace manager
- Returns: Pair of (gradient for this layer, epsilon to pass to the layer below); the epsilon array should be placed in the ArrayType.ACTIVATION_GRAD workspace via the workspace manager
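A hedged sketch of driving this method by hand (normally MultiLayerNetwork.fit handles this internally). The layer bn is assumed initialized as in the earlier sketches; the input shape and the all-ones epsilon are dummy assumptions.

```java
import org.deeplearning4j.nn.gradient.Gradient;
import org.deeplearning4j.nn.workspace.LayerWorkspaceMgr;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;
import org.nd4j.linalg.primitives.Pair;

// Assumes `bn` is an initialized BatchNormalization layer (see earlier sketch).
LayerWorkspaceMgr mgr = LayerWorkspaceMgr.noWorkspaces();
INDArray features = Nd4j.rand(32, 256);             // [minibatch, layer size]
INDArray act = bn.activate(features, true, mgr);    // forward pass sets the input
INDArray epsilon = Nd4j.onesLike(act);              // dC/da from the layer above (dummy)
Pair<Gradient, INDArray> result = bn.backpropGradient(epsilon, mgr);
Gradient grads = result.getFirst();      // gamma/beta gradients + mean/var updates
INDArray epsBelow = result.getSecond();  // epsilon to pass to the layer below
```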
public void fit(INDArray input, LayerWorkspaceMgr workspaceMgr)

Fit the model to the given data.
- Specified by: fit in interface Model
- Overrides: fit in class BaseLayer<BatchNormalization>
- Parameters: input - the data to fit the model to

public INDArray activate(boolean training, LayerWorkspaceMgr workspaceMgr)

Perform forward pass and return the activations array with the last set input.
- Specified by: activate in interface Layer
- Overrides: activate in class BaseLayer<BatchNormalization>
- Parameters:
  - training - training or test mode
  - workspaceMgr - Workspace manager
- Returns: Activations array, placed in the ArrayType.ACTIVATIONS workspace via the workspace manager

public Gradient gradient()

Get the gradient. This returns the gradient computed previously; see Model.computeGradientAndScore(LayerWorkspaceMgr).
- Specified by: gradient in interface Model
- Overrides: gradient in class BaseLayer<BatchNormalization>
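Returning to activate: the training flag matters particularly for batch normalization, since training mode normalizes with minibatch statistics (and updates the running mean/variance), while test mode normalizes with the stored running estimates. An illustrative fragment, continuing the previous sketch:

```java
// Same input, different normalization statistics:
INDArray trainOut = bn.activate(features, true, mgr);   // minibatch mean/variance
INDArray testOut  = bn.activate(features, false, mgr);  // running mean/variance estimates
```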
public INDArray preOutput(INDArray x, Layer.TrainingMode training, LayerWorkspaceMgr workspaceMgr)

public Collection<TrainingListener> getListeners()

Get the iteration listeners for this layer.
- Specified by: getListeners in interface Layer
- Overrides: getListeners in class AbstractLayer<BatchNormalization>

public void setListeners(TrainingListener... listeners)

Set the iteration listeners for this layer.
- Specified by: setListeners in interface Layer; setListeners in interface Model
- Overrides: setListeners in class AbstractLayer<BatchNormalization>
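For illustration, a listener fragment using ScoreIterationListener, a standard DL4J TrainingListener (attaching listeners at the network level via MultiLayerNetwork.setListeners is more common than per layer):

```java
import org.deeplearning4j.optimize.listeners.ScoreIterationListener;

// Log the score every 10 iterations (hypothetical per-layer attachment).
bn.setListeners(new ScoreIterationListener(10));
```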
public void setIndex(int index)

Set the layer index.
- Specified by: setIndex in interface Layer
- Overrides: setIndex in class AbstractLayer<BatchNormalization>

public int getIndex()

Get the layer index.
- Specified by: getIndex in interface Layer
- Overrides: getIndex in class AbstractLayer<BatchNormalization>

public boolean isPretrainLayer()

Returns true if the layer can be trained in an unsupervised/pretrain manner (AE, VAE, etc.).
- Specified by: isPretrainLayer in interface Layer

public LayerHelper getHelper()
- Specified by: getHelper in interface Layer
- Overrides: getHelper in class AbstractLayer<BatchNormalization>

public long[] getShape(INDArray x)
public boolean updaterDivideByMinibatch(String paramName)
- Specified by: updaterDivideByMinibatch in interface Trainable
- Overrides: updaterDivideByMinibatch in class AbstractLayer<BatchNormalization>
- Parameters: paramName - Name of the parameter
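To make the minibatch-division exception concrete, here is a minimal sketch (not DL4J's actual updater code) of how an updater could consult this method; applyMinibatchScaling is a hypothetical helper.

```java
import org.deeplearning4j.nn.api.Trainable;
import org.nd4j.linalg.api.ndarray.INDArray;

public class MinibatchScalingSketch {
    /**
     * Hypothetical helper: average summed gradients over the minibatch, except
     * for parameters (such as batch norm running mean/variance) whose
     * "gradients" are direct updates and must not be rescaled.
     */
    static INDArray applyMinibatchScaling(Trainable layer, String paramName,
                                          INDArray gradient, int minibatchSize) {
        if (layer.updaterDivideByMinibatch(paramName)) {
            return gradient.divi(minibatchSize); // summed -> mean gradient
        }
        return gradient; // direct update: leave untouched
    }
}
```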