public class LocalResponseNormalization extends AbstractLayer<LocalResponseNormalization>
Given a^i_{x,y}, the activity of a neuron computed by applying kernel i at position (x, y) and then the ReLU nonlinearity, the response-normalized activation b^i_{x,y} is given by:

Forward pass:

    unitScale = k + alpha * sum_{j=max(0, i - n/2)}^{min(N-1, i + n/2)} (a^j_{x,y})^2
    b^i_{x,y} = a^i_{x,y} * unitScale^(-beta)

Backward pass, where gy = epsilon (the deltas from the layer above):

    sumPart = sum_j (a^j_{x,y} * gb^j_{x,y})
    gx = gy * unitScale^(-beta) - 2 * alpha * beta * (sumPart / unitScale) * a^i_{x,y}
References:
- Krizhevsky et al., ImageNet Classification with Deep Convolutional Neural Networks: http://www.cs.toronto.edu/~fritz/absps/imagenet.pdf
- https://github.com/vlfeat/matconvnet/issues/10
- Chainer's local response normalization implementation
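As a concrete illustration of the forward and backward formulas above, here is a minimal, framework-free Java sketch of both passes over the channel dimension at a single spatial position. The class name, the plain `double[]` representation of one (x, y) column of activations, and the constants in `main` are illustrative assumptions; this is not the ND4J-backed implementation in this class.

```java
/**
 * Sketch of LRN at one spatial position (x, y): a[i] is the activation of
 * kernel i, N = a.length is the channel count, and the window for channel i
 * is [max(0, i - n/2), min(N - 1, i + n/2)].
 */
public class LrnSketch {

    // Forward pass: b_i = a_i * (k + alpha * sum_{j in window(i)} a_j^2)^(-beta)
    static double[] forward(double[] a, double k, double alpha, double beta, int n) {
        int N = a.length;
        double[] b = new double[N];
        for (int i = 0; i < N; i++) {
            double sum = 0.0;
            for (int j = Math.max(0, i - n / 2); j <= Math.min(N - 1, i + n / 2); j++) {
                sum += a[j] * a[j];
            }
            double unitScale = k + alpha * sum;      // the "unitScale" term above
            b[i] = a[i] * Math.pow(unitScale, -beta);
        }
        return b;
    }

    // Backward pass:
    //   gx_i = gy_i * unitScale_i^(-beta) - 2 * alpha * beta * a_i * sumPart_i
    // where sumPart_i = sum_{j in window(i)} gy_j * a_j * unitScale_j^(-beta - 1),
    // i.e. the sumPart / unitScale product from the formula above, accumulated
    // over the (symmetric) window.
    static double[] backward(double[] a, double[] gy,
                             double k, double alpha, double beta, int n) {
        int N = a.length;
        double[] unitScale = new double[N];
        for (int i = 0; i < N; i++) {
            double sum = 0.0;
            for (int j = Math.max(0, i - n / 2); j <= Math.min(N - 1, i + n / 2); j++) {
                sum += a[j] * a[j];
            }
            unitScale[i] = k + alpha * sum;
        }
        double[] gx = new double[N];
        for (int i = 0; i < N; i++) {
            double sumPart = 0.0;
            for (int j = Math.max(0, i - n / 2); j <= Math.min(N - 1, i + n / 2); j++) {
                sumPart += gy[j] * a[j] * Math.pow(unitScale[j], -beta - 1);
            }
            gx[i] = gy[i] * Math.pow(unitScale[i], -beta)
                    - 2 * alpha * beta * a[i] * sumPart;
        }
        return gx;
    }

    public static void main(String[] args) {
        // Example constants: k = 2, alpha = 1e-4, beta = 0.75, n = 5 follow the
        // values reported in the referenced ImageNet paper.
        double[] a = {1.0, 2.0, 3.0, 4.0};
        double[] b = forward(a, 2.0, 1e-4, 0.75, 5);
        for (double v : b) {
            System.out.println(v);
        }
    }
}
```

Because `unitScale >= k`, with k > 1 every activation is shrunk toward zero; channels sitting in a high-energy window are shrunk more, which is the "competition between kernel maps" effect the paper describes.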
Created by nyghtowl on 10/29/15.
Nested classes inherited from interface Layer: Layer.TrainingMode, Layer.Type
Modifier and Type | Field and Description
---|---
protected static org.slf4j.Logger | log
Fields inherited from class AbstractLayer: cacheMode, conf, dropoutApplied, dropoutMask, epochCount, index, input, iterationCount, iterationListeners, maskArray, maskState, preOutput
Constructor and Description
---
LocalResponseNormalization(NeuralNetConfiguration conf)
LocalResponseNormalization(NeuralNetConfiguration conf, org.nd4j.linalg.api.ndarray.INDArray input)
Modifier and Type | Method and Description
---|---
org.nd4j.linalg.api.ndarray.INDArray | activate(boolean training) Trigger an activation with the last specified input
org.nd4j.linalg.primitives.Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> | backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon) Calculate the gradient relative to the error in the next layer
double | calcL1(boolean backpropParamsOnly) Calculate the l1 regularization term. 0.0 if regularization is not used.
double | calcL2(boolean backpropParamsOnly) Calculate the l2 regularization term. 0.0 if regularization is not used.
void | clearNoiseWeightParams()
Layer | clone() Clone the layer
void | fit(org.nd4j.linalg.api.ndarray.INDArray input) Fit the model to the given data
org.nd4j.linalg.api.ndarray.INDArray | getParam(String param) Get the parameter
boolean | isPretrainLayer() Returns true if the layer can be trained in an unsupervised/pretrain manner (AE, VAE, etc)
org.nd4j.linalg.api.ndarray.INDArray | params() Returns the parameters of the neural network as a flattened row vector
org.nd4j.linalg.api.ndarray.INDArray | preOutput(boolean training)
void | setParams(org.nd4j.linalg.api.ndarray.INDArray params) Set the parameters for this model.
Layer | transpose() Return a transposed copy of the weights/bias (this means reverse the number of inputs and outputs on the weights)
Layer.Type | type() Returns the layer type
Methods inherited from class AbstractLayer: accumulateScore, activate, activate, activate, activate, activate, addListeners, applyConstraints, applyDropOutIfNecessary, applyMask, batchSize, clear, computeGradientAndScore, conf, feedForwardMaskArray, fit, getGradientsViewArray, getIndex, getInput, getInputMiniBatchSize, getListeners, getMaskArray, getOptimizer, gradient, gradientAndScore, init, initParams, input, iterate, layerConf, layerId, migrateInput, numParams, numParams, paramTable, paramTable, preOutput, preOutput, preOutput, score, setBackpropGradientsViewArray, setCacheMode, setConf, setIndex, setInput, setInputMiniBatchSize, setListeners, setListeners, setMaskArray, setParam, setParams, setParamsViewArray, setParamTable, update, update, validateInput

Methods inherited from class java.lang.Object: equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface Layer: getEpochCount, getIterationCount, setEpochCount, setIterationCount
public LocalResponseNormalization(NeuralNetConfiguration conf, org.nd4j.linalg.api.ndarray.INDArray input)
public LocalResponseNormalization(NeuralNetConfiguration conf)
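In practice this layer is usually created from a layer configuration rather than by calling these constructors directly. The following configuration sketch is hypothetical: the builder class `org.deeplearning4j.nn.conf.layers.LocalResponseNormalization` and its `k`/`n`/`alpha`/`beta` setter names are assumptions about the DL4J configuration API of this era and should be checked against the version in use.

```java
// Hypothetical sketch: inserting an LRN layer between a convolution layer and
// an output layer. Builder method names (k, n, alpha, beta) and the surrounding
// configuration API are assumptions, not taken from this page.
MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
    .list()
    .layer(0, new ConvolutionLayer.Builder(5, 5).nIn(3).nOut(16).build())
    .layer(1, new org.deeplearning4j.nn.conf.layers.LocalResponseNormalization.Builder()
        .k(2).n(5).alpha(1e-4).beta(0.75).build()) // constants from the referenced paper
    .layer(2, new OutputLayer.Builder().nIn(16).nOut(10).build())
    .build();
```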
public Layer clone()

Clone the layer.

Specified by: clone in interface Layer
Overrides: clone in class AbstractLayer<LocalResponseNormalization>
public double calcL2(boolean backpropParamsOnly)

Calculate the l2 regularization term. 0.0 if regularization is not used.

Specified by: calcL2 in interface Layer
Overrides: calcL2 in class AbstractLayer<LocalResponseNormalization>
Parameters: backpropParamsOnly - If true: calculate L2 based on backprop params only. If false: calculate based on all params (including pretrain params, if any)

public double calcL1(boolean backpropParamsOnly)

Calculate the l1 regularization term. 0.0 if regularization is not used.

Specified by: calcL1 in interface Layer
Overrides: calcL1 in class AbstractLayer<LocalResponseNormalization>
Parameters: backpropParamsOnly - If true: calculate L1 based on backprop params only. If false: calculate based on all params (including pretrain params, if any)

public Layer.Type type()

Returns the layer type.

Specified by: type in interface Layer
Overrides: type in class AbstractLayer<LocalResponseNormalization>
public void fit(org.nd4j.linalg.api.ndarray.INDArray input)

Fit the model to the given data.

Specified by: fit in interface Model
Overrides: fit in class AbstractLayer<LocalResponseNormalization>
Parameters: input - the data to fit the model to

public org.nd4j.linalg.primitives.Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon)

Calculate the gradient relative to the error in the next layer.

Specified by: backpropGradient in interface Layer
Parameters: epsilon - w^(L+1)*delta^(L+1). Or equivalently dC/da, i.e., (dC/dz)*(dz/da) = dC/da, where C is the cost function and a = sigma(z) is the activation.

public org.nd4j.linalg.api.ndarray.INDArray activate(boolean training)

Trigger an activation with the last specified input.

Specified by: activate in interface Layer
Parameters: training - training or test mode

public Layer transpose()

Return a transposed copy of the weights/bias (this means reverse the number of inputs and outputs on the weights).

Specified by: transpose in interface Layer
Overrides: transpose in class AbstractLayer<LocalResponseNormalization>
public boolean isPretrainLayer()

Returns true if the layer can be trained in an unsupervised/pretrain manner (AE, VAE, etc).

Specified by: isPretrainLayer in interface Layer

public void clearNoiseWeightParams()
public org.nd4j.linalg.api.ndarray.INDArray params()

Returns the parameters of the neural network as a flattened row vector.

Specified by: params in interface Model
Overrides: params in class AbstractLayer<LocalResponseNormalization>

public org.nd4j.linalg.api.ndarray.INDArray getParam(String param)

Get the parameter.

Specified by: getParam in interface Model
Overrides: getParam in class AbstractLayer<LocalResponseNormalization>
Parameters: param - the key of the parameter

public void setParams(org.nd4j.linalg.api.ndarray.INDArray params)

Set the parameters for this model.

Specified by: setParams in interface Model
Overrides: setParams in class AbstractLayer<LocalResponseNormalization>
Parameters: params - the parameters for the model

public org.nd4j.linalg.api.ndarray.INDArray preOutput(boolean training)

Overrides: preOutput in class AbstractLayer<LocalResponseNormalization>
Copyright © 2018. All rights reserved.