public class BatchNormalization extends BaseLayer<BatchNormalization>

Nested classes/interfaces inherited from interface Layer: Layer.TrainingMode, Layer.Type
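For orientation, here is a minimal sketch of where this layer appears in practice, assuming the standard DL4J builder API. Networks are configured with the configuration class org.deeplearning4j.nn.conf.layers.BatchNormalization; the class documented on this page is the runtime layer instantiated from that configuration. Layer sizes and the loss function below are illustrative only.

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class BatchNormExample {
    public static void main(String[] args) {
        // Batch normalization is added via its *configuration* class;
        // the runtime layer on this page is created from it at init() time.
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .list()
                .layer(0, new DenseLayer.Builder().nIn(784).nOut(256)
                        .activation(Activation.RELU).build())
                .layer(1, new org.deeplearning4j.nn.conf.layers.BatchNormalization.Builder().build())
                .layer(2, new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .activation(Activation.SOFTMAX).nIn(256).nOut(10).build())
                .build();

        MultiLayerNetwork net = new MultiLayerNetwork(conf);
        net.init();
    }
}
```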
Modifier and Type | Field and Description |
---|---|
protected int | index |
protected List<TrainingListener> | listeners |
protected org.nd4j.linalg.api.ndarray.INDArray | std |
protected org.nd4j.linalg.api.ndarray.INDArray | xHat |
protected org.nd4j.linalg.api.ndarray.INDArray | xMu |
Fields inherited from class BaseLayer: gradient, gradientsFlattened, gradientViews, optimizer, params, paramsFlattened, score, solver, weightNoiseParams

Fields inherited from class AbstractLayer: cacheMode, conf, dropoutApplied, dropoutMask, epochCount, input, iterationCount, maskArray, maskState, preOutput, trainingListeners
Constructor and Description |
---|
BatchNormalization(NeuralNetConfiguration conf) |
Modifier and Type | Method and Description |
---|---|
org.nd4j.linalg.api.ndarray.INDArray | activate(boolean training, LayerWorkspaceMgr workspaceMgr) Perform forward pass and return the activations array with the last set input |
org.nd4j.linalg.primitives.Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> | backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon, LayerWorkspaceMgr workspaceMgr) Calculate the gradient relative to the error in the next layer |
double | calcL1(boolean backpropParamsOnly) Calculate the l1 regularization term; 0.0 if regularization is not used. |
double | calcL2(boolean backpropParamsOnly) Calculate the l2 regularization term; 0.0 if regularization is not used. |
Layer | clone() Clone the layer |
void | fit(org.nd4j.linalg.api.ndarray.INDArray input, LayerWorkspaceMgr workspaceMgr) Fit the model to the given data |
int | getIndex() Get the layer index. |
Collection<TrainingListener> | getListeners() Get the iteration listeners for this layer. |
int[] | getShape(org.nd4j.linalg.api.ndarray.INDArray x) |
Gradient | gradient() Get the gradient. |
boolean | isPretrainLayer() Returns true if the layer can be trained in an unsupervised/pretrain manner (AE, VAE, etc) |
org.nd4j.linalg.api.ndarray.INDArray | preOutput(org.nd4j.linalg.api.ndarray.INDArray x, Layer.TrainingMode training, LayerWorkspaceMgr workspaceMgr) |
void | setIndex(int index) Set the layer index. |
void | setListeners(TrainingListener... listeners) Set the iteration listeners for this layer. |
Layer | transpose() Return a transposed copy of the weights/bias (this means reverse the number of inputs and outputs on the weights) |
Layer.Type | type() Returns the layer type |
Methods inherited from class BaseLayer: accumulateScore, clear, clearNoiseWeightParams, computeGradientAndScore, fit, getGradientsViewArray, getOptimizer, getParam, getParamWithNoise, hasBias, initParams, layerConf, numParams, params, paramTable, paramTable, preOutput, score, setBackpropGradientsViewArray, setParam, setParams, setParams, setParamsViewArray, setParamTable, setScoreWithZ, toString, update, update

Methods inherited from class AbstractLayer: activate, addListeners, applyConstraints, applyDropOutIfNecessary, applyMask, assertInputSet, batchSize, conf, feedForwardMaskArray, getInput, getInputMiniBatchSize, getMaskArray, gradientAndScore, init, input, layerId, numParams, setCacheMode, setConf, setInput, setInputMiniBatchSize, setListeners, setMaskArray, validateInput

Methods inherited from class java.lang.Object: equals, finalize, getClass, hashCode, notify, notifyAll, wait, wait, wait

Methods inherited from interface Layer: getEpochCount, getIterationCount, setEpochCount, setIterationCount
protected int index
protected List<TrainingListener> listeners
protected org.nd4j.linalg.api.ndarray.INDArray std
protected org.nd4j.linalg.api.ndarray.INDArray xMu
protected org.nd4j.linalg.api.ndarray.INDArray xHat
public BatchNormalization(NeuralNetConfiguration conf)
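The runtime layer is normally created for you when a network is initialized from its configuration, rather than via this constructor directly. A minimal sketch, assuming the network built in the example above (the layer index 1 is hypothetical and depends on your configuration):

```java
import org.deeplearning4j.nn.layers.normalization.BatchNormalization;

// After net.init(), the configuration at index 1 has been instantiated
// as the runtime layer documented on this page.
BatchNormalization bn = (BatchNormalization) net.getLayer(1);
```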
public double calcL2(boolean backpropParamsOnly)

Description copied from interface: Layer
Calculate the l2 regularization term; 0.0 if regularization is not used.
Specified by: calcL2 in interface Layer
Overrides: calcL2 in class BaseLayer<BatchNormalization>
Parameters: backpropParamsOnly - If true: calculate L2 based on backprop params only. If false: calculate based on all params (including pretrain params, if any)
public double calcL1(boolean backpropParamsOnly)

Description copied from interface: Layer
Calculate the l1 regularization term; 0.0 if regularization is not used.
Specified by: calcL1 in interface Layer
Overrides: calcL1 in class BaseLayer<BatchNormalization>
Parameters: backpropParamsOnly - If true: calculate L1 based on backprop params only. If false: calculate based on all params (including pretrain params, if any)
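As a usage sketch, continuing with the hypothetical `bn` instance from above: both methods only inspect the layer's parameters, so they can be called at any time after initialization, and return 0.0 unless l1/l2 coefficients were set in the configuration.

```java
// Regularization penalty contributed by this layer's parameters.
double l1 = bn.calcL1(true);   // true: backprop params only
double l2 = bn.calcL2(true);
System.out.println("L1 = " + l1 + ", L2 = " + l2);
```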
public Layer.Type type()

Description copied from interface: Layer
Returns the layer type
Specified by: type in interface Layer
Overrides: type in class AbstractLayer<BatchNormalization>
public org.nd4j.linalg.primitives.Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon, LayerWorkspaceMgr workspaceMgr)

Description copied from interface: Layer
Calculate the gradient relative to the error in the next layer
Specified by: backpropGradient in interface Layer
Overrides: backpropGradient in class BaseLayer<BatchNormalization>
Parameters:
epsilon - w^(L+1)*delta^(L+1). Or equivalently dC/da, i.e., (dC/dz)*(dz/da) = dC/da, where C is the cost function and a = sigma(z) is the activation.
workspaceMgr - Workspace manager
Returns: Pair of (Gradient, INDArray), where the INDArray is the epsilon (activation gradient) for the layer below. Note that the returned array should be placed in the ArrayType.ACTIVATION_GRAD workspace via the workspace manager.
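A minimal sketch of driving the forward and backward pass by hand, assuming the hypothetical `bn` instance from above with nIn = nOut = 256; `LayerWorkspaceMgr.noWorkspaces()` disables workspaces so the returned arrays can be used freely:

```java
import org.deeplearning4j.nn.gradient.Gradient;
import org.deeplearning4j.nn.workspace.LayerWorkspaceMgr;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;
import org.nd4j.linalg.primitives.Pair;

LayerWorkspaceMgr mgr = LayerWorkspaceMgr.noWorkspaces();

// Forward pass: minibatch of 32 examples with 256 features each.
INDArray features = Nd4j.rand(32, 256);
INDArray out = bn.activate(features, true, mgr);

// Backward pass: epsilon (dC/da) must have the same shape as the output.
INDArray epsilon = Nd4j.rand(32, 256);
Pair<Gradient, INDArray> pair = bn.backpropGradient(epsilon, mgr);
Gradient grads = pair.getFirst();       // parameter gradients (gamma, beta, ...)
INDArray epsilonOut = pair.getSecond(); // epsilon to pass to the layer below
```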
public void fit(org.nd4j.linalg.api.ndarray.INDArray input, LayerWorkspaceMgr workspaceMgr)

Description copied from interface: Model
Fit the model to the given data
Specified by: fit in interface Model
Overrides: fit in class BaseLayer<BatchNormalization>
Parameters: input - the data to fit the model to
public org.nd4j.linalg.api.ndarray.INDArray activate(boolean training, LayerWorkspaceMgr workspaceMgr)

Description copied from interface: Layer
Perform forward pass and return the activations array with the last set input
Specified by: activate in interface Layer
Overrides: activate in class BaseLayer<BatchNormalization>
Parameters:
training - training or test mode
workspaceMgr - Workspace manager
Returns: the activations (layer output). Note that the returned array should be placed in the ArrayType.ACTIVATIONS workspace via the workspace manager.
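The `training` flag matters for batch normalization: in training mode the layer normalizes with the current minibatch statistics and updates its running estimates, while in test mode it uses the accumulated global mean/variance. A sketch, reusing the hypothetical `bn`, `features`, and `mgr` from the example above:

```java
// Training mode: normalize with minibatch statistics (and update the
// running mean/variance estimates used at test time).
INDArray trainOut = bn.activate(features, true, mgr);

// Test mode: normalize with the accumulated global statistics, so one
// example's output does not depend on the rest of the minibatch.
INDArray testOut = bn.activate(features, false, mgr);
```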
public Gradient gradient()

Description copied from interface: Model
Get the gradient. Note that this method does not calculate the gradient; see Model#computeGradientAndScore().
Specified by: gradient in interface Model
Overrides: gradient in class BaseLayer<BatchNormalization>
public org.nd4j.linalg.api.ndarray.INDArray preOutput(org.nd4j.linalg.api.ndarray.INDArray x, Layer.TrainingMode training, LayerWorkspaceMgr workspaceMgr)
public Layer transpose()

Description copied from interface: Layer
Return a transposed copy of the weights/bias (this means reverse the number of inputs and outputs on the weights)
Specified by: transpose in interface Layer
Overrides: transpose in class BaseLayer<BatchNormalization>
public Layer clone()

Description copied from interface: Layer
Clone the layer
Specified by: clone in interface Layer
Overrides: clone in class BaseLayer<BatchNormalization>
public Collection<TrainingListener> getListeners()

Description copied from interface: Layer
Get the iteration listeners for this layer.
Specified by: getListeners in interface Layer
Overrides: getListeners in class AbstractLayer<BatchNormalization>
public void setListeners(TrainingListener... listeners)

Description copied from interface: Layer
Set the iteration listeners for this layer.
Specified by: setListeners in interface Layer
Specified by: setListeners in interface Model
Overrides: setListeners in class AbstractLayer<BatchNormalization>
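A short sketch of attaching a listener to the hypothetical `bn` instance from above; ScoreIterationListener (from org.deeplearning4j.optimize.listeners) is assumed here to be a stock TrainingListener in this release:

```java
import java.util.Collection;
import org.deeplearning4j.optimize.api.TrainingListener;
import org.deeplearning4j.optimize.listeners.ScoreIterationListener;

// Replace any existing listeners with one that logs the score every 10 iterations.
bn.setListeners(new ScoreIterationListener(10));
Collection<TrainingListener> listeners = bn.getListeners();
```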
public void setIndex(int index)

Description copied from interface: Layer
Set the layer index.
Specified by: setIndex in interface Layer
Overrides: setIndex in class AbstractLayer<BatchNormalization>
public int getIndex()

Description copied from interface: Layer
Get the layer index.
Specified by: getIndex in interface Layer
Overrides: getIndex in class AbstractLayer<BatchNormalization>
public boolean isPretrainLayer()

Description copied from interface: Layer
Returns true if the layer can be trained in an unsupervised/pretrain manner (AE, VAE, etc)
Specified by: isPretrainLayer in interface Layer
public int[] getShape(org.nd4j.linalg.api.ndarray.INDArray x)