public class LocalResponseNormalization extends AbstractLayer<LocalResponseNormalization>
For a^i_{x,y}, the activity of a neuron computed by applying kernel i at position (x,y) and then applying the ReLU nonlinearity, the response-normalized activation b^i_{x,y} is given by:

unitScale = k + alpha * sum_{j=max(0, i-n/2)}^{min(N-1, i+n/2)} (a^j_{x,y})^2
y = b^i_{x,y} = a^i_{x,y} * unitScale^(-beta)

where N is the total number of kernels, n is the size of the normalization window, and k, alpha and beta are hyperparameters.

Backward pass, with gy = epsilon (the deltas from the layer above):

sumPart = sum_j (a^j_{x,y} * gb^j_{x,y})
gx = gy * unitScale^(-beta) - 2 * alpha * beta * (sumPart / unitScale) * a^i_{x,y}
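To make the forward formula concrete, the following is a minimal plain-Java sketch computing b^i_{x,y} for a single spatial position (x,y) across all N kernels. This is an illustration only, not the layer's actual INDArray-based implementation; the lrnForward helper and its signature are hypothetical.

```java
// Sketch of the LRN forward formula for one spatial position (x,y).
// 'a' holds the ReLU activations a^i_{x,y} for all N kernels/channels;
// k, alpha, beta, n are the layer's hyperparameters.
public static double[] lrnForward(double[] a, double k, double alpha,
                                  double beta, int n) {
    int N = a.length;
    double[] b = new double[N];
    for (int i = 0; i < N; i++) {
        double sum = 0.0;
        // Sum squared activations over the adjacent kernel maps
        // j = max(0, i - n/2) .. min(N - 1, i + n/2)
        for (int j = Math.max(0, i - n / 2); j <= Math.min(N - 1, i + n / 2); j++) {
            sum += a[j] * a[j];
        }
        double unitScale = k + alpha * sum;
        b[i] = a[i] * Math.pow(unitScale, -beta);
    }
    return b;
}
```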
References:
- http://www.cs.toronto.edu/~fritz/absps/imagenet.pdf (Krizhevsky et al., "ImageNet Classification with Deep Convolutional Neural Networks")
- https://github.com/vlfeat/matconvnet/issues/10
Created by nyghtowl on 10/29/15.
Nested classes/interfaces inherited from interface Layer: Layer.TrainingMode, Layer.Type
Modifier and Type | Field and Description |
---|---|
`protected LocalResponseNormalizationHelper` | `helper` |
`protected int` | `helperCountFail` |
`protected static org.slf4j.Logger` | `log` |
Fields inherited from class AbstractLayer: cacheMode, conf, dataType, dropoutApplied, epochCount, index, input, inputModificationAllowed, iterationCount, maskArray, maskState, preOutput, trainingListeners
Constructor and Description |
---|
`LocalResponseNormalization(NeuralNetConfiguration conf, org.nd4j.linalg.api.buffer.DataType dataType)` |
Modifier and Type | Method and Description |
---|---|
`INDArray` | `activate(boolean training, LayerWorkspaceMgr workspaceMgr)` Perform the forward pass and return the activations array, using the last set input |
`Pair<Gradient,INDArray>` | `backpropGradient(INDArray epsilon, LayerWorkspaceMgr workspaceMgr)` Calculate the gradient relative to the error in the next layer |
`double` | `calcRegularizationScore(boolean backpropParamsOnly)` Calculate the regularization component of the score, for the parameters in this layer; for example, the L1, L2 and/or weight decay components of the loss function |
`void` | `clearNoiseWeightParams()` |
`Layer` | `clone()` |
`void` | `fit(INDArray input, LayerWorkspaceMgr workspaceMgr)` Fit the model to the given data |
`LayerHelper` | `getHelper()` |
`INDArray` | `getParam(String param)` Get the parameter |
`boolean` | `isPretrainLayer()` Returns true if the layer can be trained in an unsupervised/pretrain manner (AE, VAE, etc.) |
`INDArray` | `params()` Returns the parameters of the neural network as a flattened row vector |
`void` | `setParams(INDArray params)` Set the parameters for this model |
`Layer.Type` | `type()` Returns the layer type |
Methods inherited from class AbstractLayer: activate, addListeners, allowInputModification, applyConstraints, applyDropOutIfNecessary, applyMask, assertInputSet, backpropDropOutIfPresent, batchSize, clear, computeGradientAndScore, conf, feedForwardMaskArray, fit, getConfig, getEpochCount, getGradientsViewArray, getIndex, getInput, getInputMiniBatchSize, getListeners, getMaskArray, getOptimizer, gradient, gradientAndScore, init, input, layerConf, layerId, numParams, numParams, paramTable, paramTable, score, setBackpropGradientsViewArray, setCacheMode, setConf, setEpochCount, setIndex, setInput, setInputMiniBatchSize, setListeners, setListeners, setMaskArray, setParam, setParams, setParamsViewArray, setParamTable, update, update, updaterDivideByMinibatch

Methods inherited from class java.lang.Object: equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface Layer: getIterationCount, setIterationCount
protected static final org.slf4j.Logger log
protected LocalResponseNormalizationHelper helper
protected int helperCountFail
public LocalResponseNormalization(NeuralNetConfiguration conf, org.nd4j.linalg.api.buffer.DataType dataType)
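In practice this layer is usually created from a network configuration rather than by calling this constructor directly. A minimal sketch, assuming the standard DL4J configuration API (the conf-layer class org.deeplearning4j.nn.conf.layers.LocalResponseNormalization and its Builder with k, n, alpha and beta setters); the network shape and hyperparameter values are illustrative:

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.inputs.InputType;
import org.deeplearning4j.nn.conf.layers.ConvolutionLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class LrnConfigExample {
    public static MultiLayerConfiguration buildConf() {
        return new NeuralNetConfiguration.Builder()
            .list()
            .layer(new ConvolutionLayer.Builder(5, 5)
                .nIn(3).nOut(32).activation(Activation.RELU).build())
            // LRN over n = 5 adjacent kernel maps; k, alpha, beta follow
            // the values used in the referenced AlexNet paper
            .layer(new org.deeplearning4j.nn.conf.layers.LocalResponseNormalization.Builder()
                .k(2).n(5).alpha(1e-4).beta(0.75).build())
            .layer(new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                .nOut(10).activation(Activation.SOFTMAX).build())
            .setInputType(InputType.convolutional(32, 32, 3))
            .build();
    }
}
```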
public double calcRegularizationScore(boolean backpropParamsOnly)
Description copied from interface: Layer
Specified by: calcRegularizationScore in interface Layer
Overrides: calcRegularizationScore in class AbstractLayer<LocalResponseNormalization>
Parameters: backpropParamsOnly - If true: calculate the regularization score based on backprop params only. If false: calculate based on all params (including pretrain params, if any)

public Layer.Type type()
Description copied from interface: Layer
Specified by: type in interface Layer
Overrides: type in class AbstractLayer<LocalResponseNormalization>
Returns: the layer type
public void fit(INDArray input, LayerWorkspaceMgr workspaceMgr)
Description copied from interface: Model
Specified by: fit in interface Model
Overrides: fit in class AbstractLayer<LocalResponseNormalization>
Parameters: input - the data to fit the model to

public Pair<Gradient,INDArray> backpropGradient(INDArray epsilon, LayerWorkspaceMgr workspaceMgr)
Description copied from interface: Layer
Parameters:
epsilon - w^(L+1)*delta^(L+1). Or, equivalently: dC/da, i.e., (dC/dz)*(dz/da) = dC/da, where C is the cost function and a = sigma(z) is the activation.
workspaceMgr - Workspace manager
Returns: the gradient and the activation gradient (epsilon), with the epsilon array placed in the ArrayType.ACTIVATION_GRAD workspace via the workspace manager

public INDArray activate(boolean training, LayerWorkspaceMgr workspaceMgr)
Description copied from interface: Layer
Parameters:
training - training or test mode
workspaceMgr - Workspace manager
Returns: the activations array, placed in the ArrayType.ACTIVATIONS workspace via the workspace manager

public boolean isPretrainLayer()
Description copied from interface: Layer
Returns: true if the layer can be trained in an unsupervised/pretrain manner (AE, VAE, etc.)
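For direct, low-level use of the forward and backward passes above, a hedged sketch; it assumes LayerWorkspaceMgr.noWorkspaces(), a 4D NCHW input, the org.nd4j.linalg.primitives.Pair class of this release line, and an already-constructed layer instance (the epsilon placeholder is purely illustrative):

```java
import org.deeplearning4j.nn.gradient.Gradient;
import org.deeplearning4j.nn.workspace.LayerWorkspaceMgr;
import org.nd4j.linalg.api.buffer.DataType;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;
import org.nd4j.linalg.primitives.Pair;

// 'layer' is an already-constructed LocalResponseNormalization instance.
// Input is 4D NCHW: [minibatch, channels, height, width].
INDArray input = Nd4j.rand(DataType.FLOAT, 2, 8, 16, 16);
LayerWorkspaceMgr mgr = LayerWorkspaceMgr.noWorkspaces();

layer.setInput(input, mgr);
INDArray activations = layer.activate(true, mgr);  // forward pass, training mode

// Backward pass: epsilon would normally come from the layer above;
// a ones array stands in as a placeholder here.
INDArray epsilon = Nd4j.onesLike(activations);
Pair<Gradient, INDArray> grads = layer.backpropGradient(epsilon, mgr);
INDArray epsOut = grads.getSecond();               // gradient w.r.t. the input
```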
public void clearNoiseWeightParams()
public LayerHelper getHelper()
Specified by: getHelper in interface Layer
Overrides: getHelper in class AbstractLayer<LocalResponseNormalization>
public INDArray params()
Description copied from class: AbstractLayer
Specified by: params in interface Model
Specified by: params in interface Trainable
Overrides: params in class AbstractLayer<LocalResponseNormalization>
Returns: the parameters of the neural network as a flattened row vector
public INDArray getParam(String param)
Description copied from interface: Model
Specified by: getParam in interface Model
Overrides: getParam in class AbstractLayer<LocalResponseNormalization>
Parameters: param - the key of the parameter

public void setParams(INDArray params)
Description copied from interface: Model
Specified by: setParams in interface Model
Overrides: setParams in class AbstractLayer<LocalResponseNormalization>
Parameters: params - the parameters for the model