public class SubsamplingLayer extends AbstractLayer<SubsamplingLayer>

Nested classes/interfaces inherited from interface Layer: Layer.TrainingMode, Layer.Type
Modifier and Type | Field and Description |
---|---|
protected ConvolutionMode | convolutionMode |
static String | CUDNN_SUBSAMPLING_HELPER_CLASS_NAME |
protected SubsamplingHelper | helper |
protected int | helperCountFail |
Fields inherited from class AbstractLayer: cacheMode, conf, dataType, dropoutApplied, epochCount, index, input, inputModificationAllowed, iterationCount, maskArray, maskState, preOutput, trainingListeners
Constructor and Description |
---|
SubsamplingLayer(NeuralNetConfiguration conf, DataType dataType) |
Modifier and Type | Method and Description |
---|---|
INDArray | activate(boolean training, LayerWorkspaceMgr workspaceMgr) - Perform forward pass and return the activations array with the last set input |
Pair<Gradient,INDArray> | backpropGradient(INDArray epsilon, LayerWorkspaceMgr workspaceMgr) - Calculate the gradient relative to the error in the next layer |
double | calcRegularizationScore(boolean backpropOnlyParams) - Calculate the regularization component of the score for the parameters in this layer; for example, the L1, L2 and/or weight decay components of the loss function |
void | clearNoiseWeightParams() |
Pair<INDArray,MaskState> | feedForwardMaskArray(INDArray maskArray, MaskState currentMaskState, int minibatchSize) - Feed forward the input mask array, setting it in the layer as appropriate |
void | fit() - All models have a fit method |
void | fit(INDArray input, LayerWorkspaceMgr workspaceMgr) - Fit the model to the given data |
LayerHelper | getHelper() |
INDArray | getParam(String param) - Get the parameter |
Gradient | gradient() - Get the gradient |
boolean | isPretrainLayer() - Returns true if the layer can be trained in an unsupervised/pretrain manner (AE, VAE, etc.) |
long | numParams() - The number of parameters for the model |
INDArray | params() - Returns the parameters of the neural network as a flattened row vector |
double | score() - The score for the model |
void | setParams(INDArray params) - Set the parameters for this model |
Layer.Type | type() - Returns the layer type |
void | update(INDArray gradient, String paramType) - Perform one update, applying the gradient |
Methods inherited from class AbstractLayer: activate, addListeners, allowInputModification, applyConstraints, applyDropOutIfNecessary, applyMask, assertInputSet, backpropDropOutIfPresent, batchSize, clear, close, computeGradientAndScore, conf, getConfig, getEpochCount, getGradientsViewArray, getIndex, getInput, getInputMiniBatchSize, getListeners, getMaskArray, getOptimizer, gradientAndScore, init, input, layerConf, layerId, numParams, paramTable, paramTable, setBackpropGradientsViewArray, setCacheMode, setConf, setEpochCount, setIndex, setInput, setInputMiniBatchSize, setListeners, setListeners, setMaskArray, setParam, setParams, setParamsViewArray, setParamTable, update, updaterDivideByMinibatch

Methods inherited from class java.lang.Object: clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface Layer: getIterationCount, setIterationCount
protected SubsamplingHelper helper
protected int helperCountFail
protected ConvolutionMode convolutionMode
public static final String CUDNN_SUBSAMPLING_HELPER_CLASS_NAME
public SubsamplingLayer(NeuralNetConfiguration conf, DataType dataType)
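In practice this implementation class is rarely constructed directly: it is built internally from the configuration class org.deeplearning4j.nn.conf.layers.SubsamplingLayer (a different class with the same simple name) when a network is initialized. A minimal configuration sketch, assuming a recent DL4J release; the layer sizes and pooling settings are illustrative, not taken from this page:

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.inputs.InputType;
import org.deeplearning4j.nn.conf.layers.ConvolutionLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.conf.layers.SubsamplingLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class SubsamplingConfigSketch {
    public static void main(String[] args) {
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .list()
                // Convolution: 1 input channel -> 8 feature maps, 5x5 kernel
                .layer(new ConvolutionLayer.Builder(5, 5)
                        .nIn(1).nOut(8)
                        .activation(Activation.RELU)
                        .build())
                // Subsampling (pooling): 2x2 max pooling, stride 2 -- no trainable parameters
                .layer(new SubsamplingLayer.Builder(SubsamplingLayer.PoolingType.MAX)
                        .kernelSize(2, 2)
                        .stride(2, 2)
                        .build())
                .layer(new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .nOut(10)
                        .activation(Activation.SOFTMAX)
                        .build())
                .setInputType(InputType.convolutionalFlat(28, 28, 1))
                .build();

        MultiLayerNetwork net = new MultiLayerNetwork(conf);
        net.init(); // instantiates the implementation layers, including this SubsamplingLayer
    }
}
```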
public double calcRegularizationScore(boolean backpropOnlyParams)
Description copied from interface: Layer
Specified by: calcRegularizationScore in interface Layer
Overrides: calcRegularizationScore in class AbstractLayer<SubsamplingLayer>
Parameters: backpropOnlyParams - If true: calculate the regularization score based on backprop params only. If false: calculate based on all params (including pretrain params, if any)
public Layer.Type type()
Description copied from interface: Layer
Specified by: type in interface Layer
Overrides: type in class AbstractLayer<SubsamplingLayer>
public Pair<Gradient,INDArray> backpropGradient(INDArray epsilon, LayerWorkspaceMgr workspaceMgr)
Description copied from interface: Layer
Parameters:
epsilon - w^(L+1)*delta^(L+1). Or, equivalently, dC/da, i.e., (dC/dz)*(dz/da) = dC/da, where C is the cost function and a = sigma(z) is the activation.
workspaceMgr - Workspace manager
Returns: the gradient for this layer and the epsilon (activation gradient) for the layer below; the returned array should be placed in the ArrayType.ACTIVATION_GRAD workspace via the workspace manager
public INDArray activate(boolean training, LayerWorkspaceMgr workspaceMgr)
Description copied from interface: Layer
Parameters:
training - training or test mode
workspaceMgr - Workspace manager
Returns: the activations; the returned array should be placed in the ArrayType.ACTIVATIONS workspace via the workspace manager
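The two methods above can also be driven directly, outside of MultiLayerNetwork.fit(), which is occasionally useful for debugging. A hedged sketch, assuming a SubsamplingLayer instance taken from an initialized network; the input shape and the org.nd4j.common.primitives.Pair import are assumptions (older releases used org.nd4j.linalg.primitives.Pair):

```java
import org.deeplearning4j.nn.api.Layer;
import org.deeplearning4j.nn.gradient.Gradient;
import org.deeplearning4j.nn.workspace.LayerWorkspaceMgr;
import org.nd4j.common.primitives.Pair;
import org.nd4j.linalg.api.buffer.DataType;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

public class ForwardBackwardSketch {
    // 'layer' is assumed to be the subsampling layer of an initialized network,
    // e.g. net.getLayer(1) for the configuration sketched earlier.
    static void forwardBackward(Layer layer) {
        LayerWorkspaceMgr mgr = LayerWorkspaceMgr.noWorkspaces(); // skip workspace management, for clarity

        // Forward pass: NCHW input, [minibatch, channels, height, width]
        INDArray input = Nd4j.rand(DataType.FLOAT, 2, 8, 24, 24);
        layer.setInput(input, mgr);
        INDArray activations = layer.activate(false, mgr); // test mode; 2x2/stride-2 pooling halves H and W

        // Backward pass: epsilon (dC/da) must have the same shape as the activations
        INDArray epsilon = Nd4j.onesLike(activations);
        Pair<Gradient, INDArray> out = layer.backpropGradient(epsilon, mgr);
        INDArray epsBelow = out.getSecond(); // activation gradient passed to the layer below
    }
}
```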
public boolean isPretrainLayer()
Specified by: isPretrainLayer in interface Layer
public void clearNoiseWeightParams()
public LayerHelper getHelper()
Specified by: getHelper in interface Layer
Overrides: getHelper in class AbstractLayer<SubsamplingLayer>
public Gradient gradient()
Description copied from interface: Model
Get the gradient. Note that this method does not calculate the gradient; it returns the gradient computed previously by Model.computeGradientAndScore(LayerWorkspaceMgr).
Specified by: gradient in interface Model
Overrides: gradient in class AbstractLayer<SubsamplingLayer>
public void fit()
Description copied from interface: Model
All models have a fit method.
Specified by: fit in interface Model
Overrides: fit in class AbstractLayer<SubsamplingLayer>
public long numParams()
Description copied from class: AbstractLayer
The number of parameters for the model.
Specified by: numParams in interface Model
Specified by: numParams in interface Trainable
Overrides: numParams in class AbstractLayer<SubsamplingLayer>
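Since pooling is a fixed downsampling operation, a subsampling layer contributes no trainable parameters. A quick hedged check; the expected values follow from the layer being parameter-free and are not stated on this page:

```java
long n = layer.numParams();       // expected: 0 for a parameter-free pooling layer
INDArray params = layer.params(); // expected: null (no parameter vector to return)
```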
public void fit(INDArray input, LayerWorkspaceMgr workspaceMgr)
Description copied from interface: Model
Fit the model to the given data.
Specified by: fit in interface Model
Overrides: fit in class AbstractLayer<SubsamplingLayer>
Parameters: input - the data to fit the model to
public double score()
Description copied from interface: Model
The score for the model.
Specified by: score in interface Model
Overrides: score in class AbstractLayer<SubsamplingLayer>
public void update(INDArray gradient, String paramType)
Description copied from interface: Model
Perform one update, applying the gradient.
Specified by: update in interface Model
Overrides: update in class AbstractLayer<SubsamplingLayer>
Parameters: gradient - the gradient to apply
public INDArray params()
Description copied from class: AbstractLayer
Returns the parameters of the neural network as a flattened row vector.
Specified by: params in interface Model
Specified by: params in interface Trainable
Overrides: params in class AbstractLayer<SubsamplingLayer>
public INDArray getParam(String param)
Description copied from interface: Model
Get the parameter.
Specified by: getParam in interface Model
Overrides: getParam in class AbstractLayer<SubsamplingLayer>
Parameters: param - the key of the parameter
public void setParams(INDArray params)
Description copied from interface: Model
Set the parameters for this model.
Specified by: setParams in interface Model
Overrides: setParams in class AbstractLayer<SubsamplingLayer>
Parameters: params - the parameters for the model
public Pair<INDArray,MaskState> feedForwardMaskArray(INDArray maskArray, MaskState currentMaskState, int minibatchSize)
Description copied from interface: Layer
Specified by: feedForwardMaskArray in interface Layer
Overrides: feedForwardMaskArray in class AbstractLayer<SubsamplingLayer>
Parameters:
maskArray - Mask array to set
currentMaskState - Current state of the mask; see MaskState
minibatchSize - Current minibatch size. Needs to be known as it cannot always be inferred from the activations array due to reshaping (such as a DenseLayer within a recurrent neural network)
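A hedged sketch of mask propagation, reusing the 'layer' instance from the earlier sketch; the mask shape and the 1.0 = valid / 0.0 = masked convention are assumptions based on DL4J's general masking behavior, not something this page specifies:

```java
import org.deeplearning4j.nn.api.MaskState;
import org.nd4j.common.primitives.Pair;
import org.nd4j.linalg.api.buffer.DataType;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

// Per-example mask for a minibatch of 2 (1.0 = valid, 0.0 = masked)
INDArray mask = Nd4j.ones(DataType.FLOAT, 2, 1);
Pair<INDArray, MaskState> fwd = layer.feedForwardMaskArray(mask, MaskState.Active, 2);
INDArray nextMask = fwd.getFirst();     // mask to feed to the next layer
MaskState nextState = fwd.getSecond();  // whether masking remains active downstream
```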