public class SubsamplingLayer extends AbstractLayer&lt;SubsamplingLayer&gt;

Nested classes/interfaces inherited from interface Layer: `Layer.TrainingMode`, `Layer.Type`

| Modifier and Type | Field and Description |
|---|---|
| `protected ConvolutionMode` | `convolutionMode` |
| `protected SubsamplingHelper` | `helper` |

Fields inherited from class AbstractLayer: cacheMode, conf, dropoutApplied, dropoutMask, index, input, iterationListeners, maskArray, maskState, preOutput

| Constructor and Description |
|---|
| `SubsamplingLayer(NeuralNetConfiguration conf)` |
| `SubsamplingLayer(NeuralNetConfiguration conf, org.nd4j.linalg.api.ndarray.INDArray input)` |
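A subsampling (pooling) layer has no trainable weights: its forward pass reduces each spatial window of the input to a single value, e.g. by taking the maximum. As an illustration only — not DL4J's INDArray-based implementation, and the class name `MaxPool2x2` is invented for this sketch — here is 2x2 max pooling with stride 2 in plain Java:

```java
public class MaxPool2x2 {
    // Max-pool a 2D feature map with a 2x2 window and stride 2.
    // Assumes even height/width for brevity.
    static double[][] maxPool(double[][] in) {
        int h = in.length / 2, w = in[0].length / 2;
        double[][] out = new double[h][w];
        for (int i = 0; i < h; i++) {
            for (int j = 0; j < w; j++) {
                out[i][j] = Math.max(
                        Math.max(in[2 * i][2 * j],     in[2 * i][2 * j + 1]),
                        Math.max(in[2 * i + 1][2 * j], in[2 * i + 1][2 * j + 1]));
            }
        }
        return out;
    }

    public static void main(String[] args) {
        double[][] in = {
            {1, 2, 5, 6},
            {3, 4, 7, 8},
            {9, 1, 2, 3},
            {4, 5, 6, 7}
        };
        // Each 2x2 block collapses to its maximum: [[4.0, 8.0], [9.0, 7.0]]
        System.out.println(java.util.Arrays.deepToString(maxPool(in)));
    }
}
```

DL4J's actual layer works on 4D (batch, channel, height, width) activations and supports other pooling types such as average pooling; the sketch keeps a single 2D feature map for clarity.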
| Modifier and Type | Method and Description |
|---|---|
| `void` | `accumulateScore(double accum)`: Sets a rolling tally for the score. |
| `org.nd4j.linalg.api.ndarray.INDArray` | `activate(boolean training)`: Trigger an activation with the last specified input. |
| `org.nd4j.linalg.api.ndarray.INDArray` | `activationMean()`: Calculate the mean representation for the activation for this layer. |
| `org.nd4j.linalg.primitives.Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray>` | `backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon)`: Calculate the gradient relative to the error in the next layer. |
| `Gradient` | `calcGradient(Gradient layerError, org.nd4j.linalg.api.ndarray.INDArray indArray)`: Calculate the gradient. |
| `double` | `calcL1(boolean backpropParamsOnly)`: Calculate the L1 regularization term; 0.0 if regularization is not used. |
| `double` | `calcL2(boolean backpropParamsOnly)`: Calculate the L2 regularization term; 0.0 if regularization is not used. |
| `Layer` | `clone()`: Clone the layer. |
| `void` | `computeGradientAndScore()`: Update the score. |
| `Gradient` | `error(org.nd4j.linalg.api.ndarray.INDArray input)`: Calculate error with respect to the current layer. |
| `void` | `fit()`: All models have a fit method. |
| `void` | `fit(org.nd4j.linalg.api.ndarray.INDArray input)`: Fit the model to the given data. |
| `org.nd4j.linalg.api.ndarray.INDArray` | `getParam(String param)`: Get the parameter. |
| `Gradient` | `gradient()`: Calculate a gradient. |
| `boolean` | `isPretrainLayer()`: Returns true if the layer can be trained in an unsupervised/pretrain manner (VAE, RBMs etc). |
| `void` | `iterate(org.nd4j.linalg.api.ndarray.INDArray input)`: Iterate one iteration of the network. |
| `void` | `merge(Layer layer, int batchSize)`: Averages the given logistic regression from a mini batch into this layer. |
| `int` | `numParams()`: The number of parameters for the model. |
| `org.nd4j.linalg.api.ndarray.INDArray` | `params()`: Returns the parameters of the neural network as a flattened row vector. |
| `org.nd4j.linalg.api.ndarray.INDArray` | `preOutput(boolean training)` |
| `double` | `score()`: The score for the model. |
| `void` | `setParams(org.nd4j.linalg.api.ndarray.INDArray params)`: Set the parameters for this model. |
| `Layer` | `transpose()`: Return a transposed copy of the weights/bias (this means reverse the number of inputs and outputs on the weights). |
| `Layer.Type` | `type()`: Returns the layer type. |
| `void` | `update(org.nd4j.linalg.api.ndarray.INDArray gradient, String paramType)`: Perform one update applying the gradient. |
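For a max-subsampling layer, `backpropGradient(epsilon)` computes no weight gradients: each incoming epsilon entry is routed back only to the input position that produced its window's maximum, and every other position receives zero. A plain-Java sketch under that assumption (the class name `MaxPoolBackprop` is invented; DL4J's version operates on INDArrays and returns a `Pair<Gradient,INDArray>` whose parameter gradient is empty):

```java
public class MaxPoolBackprop {
    // Route the upstream gradient (epsilon) back through a 2x2/stride-2 max pool:
    // the gradient of each pooled output flows only to the argmax input position.
    static double[][] backprop(double[][] in, double[][] epsilon) {
        double[][] grad = new double[in.length][in[0].length];
        for (int i = 0; i < epsilon.length; i++) {
            for (int j = 0; j < epsilon[0].length; j++) {
                int bi = 2 * i, bj = 2 * j; // top-left corner of the 2x2 window
                int mi = bi, mj = bj;       // argmax within the window
                for (int di = 0; di < 2; di++)
                    for (int dj = 0; dj < 2; dj++)
                        if (in[bi + di][bj + dj] > in[mi][mj]) {
                            mi = bi + di;
                            mj = bj + dj;
                        }
                grad[mi][mj] += epsilon[i][j];
            }
        }
        return grad;
    }

    public static void main(String[] args) {
        double[][] in  = {{1, 2}, {3, 4}};
        double[][] eps = {{5}};
        // The window's max is at (1,1), so all of epsilon flows there:
        // prints [[0.0, 0.0], [0.0, 5.0]]
        System.out.println(java.util.Arrays.deepToString(backprop(in, eps)));
    }
}
```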
Methods inherited from class AbstractLayer: activate, activate, activate, activate, activate, addListeners, applyDropOutIfNecessary, applyLearningRateScoreDecay, applyMask, batchSize, clear, conf, derivativeActivation, feedForwardMaskArray, getGradientsViewArray, getIndex, getInput, getInputMiniBatchSize, getListeners, getMaskArray, getOptimizer, gradientAndScore, init, initParams, input, layerConf, layerId, numParams, paramTable, paramTable, preOutput, preOutput, preOutput, setBackpropGradientsViewArray, setCacheMode, setConf, setIndex, setInput, setInputMiniBatchSize, setListeners, setListeners, setMaskArray, setParam, setParams, setParamsViewArray, setParamTable, update, validateInput

**Field Detail**

`protected SubsamplingHelper helper`

`protected ConvolutionMode convolutionMode`

**Constructor Detail**

`public SubsamplingLayer(NeuralNetConfiguration conf)`

`public SubsamplingLayer(NeuralNetConfiguration conf, org.nd4j.linalg.api.ndarray.INDArray input)`
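A note on `calcL1` and `calcL2`: they return the layer's regularization penalty over its parameters, and since a subsampling layer has no trainable parameters, both come out as 0.0 for this class. To make the terms concrete, here is a standalone sketch; the class name `RegularizationTerms` is invented, and the `(l2/2) * sum(w^2)` scaling is a common convention assumed here, not read from DL4J's source:

```java
public class RegularizationTerms {
    // L1 penalty: coefficient times the sum of absolute parameter values.
    static double calcL1(double l1, double[] params) {
        double sum = 0.0;
        for (double w : params) sum += Math.abs(w);
        return l1 * sum;
    }

    // L2 penalty, using the common (l2 / 2) * sum(w^2) convention
    // (an assumption for illustration, not DL4J's exact scaling).
    static double calcL2(double l2, double[] params) {
        double sum = 0.0;
        for (double w : params) sum += w * w;
        return 0.5 * l2 * sum;
    }

    public static void main(String[] args) {
        double[] none = {};                      // a subsampling layer: no params
        System.out.println(calcL1(1e-4, none));  // prints 0.0
        double[] w = {0.5, -1.0, 2.0};
        System.out.println(calcL1(0.5, w));      // 0.5 * 3.5  -> prints 1.75
        System.out.println(calcL2(0.5, w));      // 0.25 * 5.25 -> prints 1.3125
    }
}
```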
**Method Detail**

`public double calcL2(boolean backpropParamsOnly)`

Calculate the L2 regularization term; 0.0 if regularization is not used.

- Specified by: `calcL2` in interface `Layer`
- Overrides: `calcL2` in class `AbstractLayer<SubsamplingLayer>`
- Parameters: `backpropParamsOnly` - If true: calculate L2 based on backprop params only. If false: calculate based on all params (including pretrain params, if any)

`public double calcL1(boolean backpropParamsOnly)`

Calculate the L1 regularization term; 0.0 if regularization is not used.

- Specified by: `calcL1` in interface `Layer`
- Overrides: `calcL1` in class `AbstractLayer<SubsamplingLayer>`
- Parameters: `backpropParamsOnly` - If true: calculate L1 based on backprop params only. If false: calculate based on all params (including pretrain params, if any)

`public Layer.Type type()`

Returns the layer type.

- Specified by: `type` in interface `Layer`
- Overrides: `type` in class `AbstractLayer<SubsamplingLayer>`

`public org.nd4j.linalg.primitives.Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon)`

Calculate the gradient relative to the error in the next layer.

- Specified by: `backpropGradient` in interface `Layer`
- Parameters: `epsilon` - w^(L+1)*delta^(L+1). Or, equivalently: dC/da, i.e., (dC/dz)*(dz/da) = dC/da, where C is the cost function and a = sigma(z) is the activation.

`public org.nd4j.linalg.api.ndarray.INDArray activate(boolean training)`
Trigger an activation with the last specified input.

- Specified by: `activate` in interface `Layer`
- Parameters: `training` - training or test mode

`public Gradient error(org.nd4j.linalg.api.ndarray.INDArray input)`

Calculate error with respect to the current layer.

- Specified by: `error` in interface `Layer`
- Overrides: `error` in class `AbstractLayer<SubsamplingLayer>`
- Parameters: `input` - the gradient for the forward layer. If this is the final layer, it will start with the error from the output. This is on the user to initialize.

`public Gradient calcGradient(Gradient layerError, org.nd4j.linalg.api.ndarray.INDArray indArray)`

Calculate the gradient.

- Specified by: `calcGradient` in interface `Layer`
- Overrides: `calcGradient` in class `AbstractLayer<SubsamplingLayer>`
- Parameters: `layerError` - the layer error

`public void merge(Layer layer, int batchSize)`

Averages the given logistic regression from a mini batch into this layer.

- Specified by: `merge` in interface `Layer`
- Overrides: `merge` in class `AbstractLayer<SubsamplingLayer>`
- Parameters: `layer` - the logistic regression layer to average into this layer; `batchSize` - the batch size

`public org.nd4j.linalg.api.ndarray.INDArray activationMean()`
Calculate the mean representation for the activation for this layer.

`public Layer transpose()`

Return a transposed copy of the weights/bias (this means reverse the number of inputs and outputs on the weights).

- Specified by: `transpose` in interface `Layer`
- Overrides: `transpose` in class `AbstractLayer<SubsamplingLayer>`

`public Layer clone()`

Clone the layer.

- Specified by: `clone` in interface `Layer`
- Overrides: `clone` in class `AbstractLayer<SubsamplingLayer>`

`public boolean isPretrainLayer()`

Returns true if the layer can be trained in an unsupervised/pretrain manner (VAE, RBMs etc).

`public void iterate(org.nd4j.linalg.api.ndarray.INDArray input)`

Iterate one iteration of the network.

- Specified by: `iterate` in interface `Model`
- Overrides: `iterate` in class `AbstractLayer<SubsamplingLayer>`
- Parameters: `input` - the input to iterate on

`public Gradient gradient()`

Calculate a gradient.

- Specified by: `gradient` in interface `Model`
- Overrides: `gradient` in class `AbstractLayer<SubsamplingLayer>`

`public void fit()`

All models have a fit method.

- Specified by: `fit` in interface `Model`
- Overrides: `fit` in class `AbstractLayer<SubsamplingLayer>`

`public int numParams()`
The number of parameters for the model.

- Specified by: `numParams` in interface `Model`
- Overrides: `numParams` in class `AbstractLayer<SubsamplingLayer>`

`public void fit(org.nd4j.linalg.api.ndarray.INDArray input)`

Fit the model to the given data.

- Specified by: `fit` in interface `Model`
- Overrides: `fit` in class `AbstractLayer<SubsamplingLayer>`
- Parameters: `input` - the data to fit the model to

`public void computeGradientAndScore()`

Update the score.

- Specified by: `computeGradientAndScore` in interface `Model`
- Overrides: `computeGradientAndScore` in class `AbstractLayer<SubsamplingLayer>`

`public double score()`

The score for the model.

- Specified by: `score` in interface `Model`
- Overrides: `score` in class `AbstractLayer<SubsamplingLayer>`

`public void accumulateScore(double accum)`

Sets a rolling tally for the score.

- Specified by: `accumulateScore` in interface `Model`
- Overrides: `accumulateScore` in class `AbstractLayer<SubsamplingLayer>`
- Parameters: `accum` - the amount to accumulate

`public void update(org.nd4j.linalg.api.ndarray.INDArray gradient, String paramType)`

Perform one update applying the gradient.

- Specified by: `update` in interface `Model`
- Overrides: `update` in class `AbstractLayer<SubsamplingLayer>`
- Parameters: `gradient` - the gradient to apply

`public org.nd4j.linalg.api.ndarray.INDArray params()`

Returns the parameters of the neural network as a flattened row vector.

- Specified by: `params` in interface `Model`
- Overrides: `params` in class `AbstractLayer<SubsamplingLayer>`

`public org.nd4j.linalg.api.ndarray.INDArray getParam(String param)`

Get the parameter.

- Specified by: `getParam` in interface `Model`
- Overrides: `getParam` in class `AbstractLayer<SubsamplingLayer>`
- Parameters: `param` - the key of the parameter

`public void setParams(org.nd4j.linalg.api.ndarray.INDArray params)`

Set the parameters for this model.

- Specified by: `setParams` in interface `Model`
- Overrides: `setParams` in class `AbstractLayer<SubsamplingLayer>`
- Parameters: `params` - the parameters for the model

`public org.nd4j.linalg.api.ndarray.INDArray preOutput(boolean training)`

- Overrides: `preOutput` in class `AbstractLayer<SubsamplingLayer>`

Copyright © 2017. All rights reserved.