Class SubsamplingLayer
- java.lang.Object
  - org.deeplearning4j.nn.conf.layers.Layer
    - org.deeplearning4j.nn.conf.layers.NoParamLayer
      - org.deeplearning4j.nn.conf.layers.SubsamplingLayer

- All Implemented Interfaces:
  Serializable, Cloneable, TrainingConfig
- Direct Known Subclasses:
  Pooling2D, Subsampling1DLayer

public class SubsamplingLayer extends NoParamLayer

- See Also:
  Serialized Form
-
-
Nested Class Summary
Nested Classes:
- protected static class  SubsamplingLayer.BaseSubsamplingBuilder<T extends SubsamplingLayer.BaseSubsamplingBuilder<T>>
- static class  SubsamplingLayer.Builder
- static class  SubsamplingLayer.PoolingType
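In practice this layer is configured through the nested Builder rather than the protected constructor. A minimal sketch, assuming the standard DL4J builder API (2x2 max pooling with stride 2, which halves the spatial dimensions of the input):

```java
import org.deeplearning4j.nn.conf.layers.SubsamplingLayer;

// 2x2 max pooling with stride 2 and no padding (a configuration
// sketch; kernelSize/stride defaults may differ across DL4J versions)
SubsamplingLayer pool = new SubsamplingLayer.Builder(SubsamplingLayer.PoolingType.MAX)
        .kernelSize(2, 2)
        .stride(2, 2)
        .build();
```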
-
Field Summary
Fields:
- protected boolean  avgPoolIncludePadInDivisor
- protected CNN2DFormat  cnn2dDataFormat
- protected ConvolutionMode  convolutionMode
- protected boolean  cudnnAllowFallback
- static CNN2DFormat  DEFAULT_FORMAT
- protected int[]  dilation
- protected double  eps
- protected int[]  kernelSize
- protected int[]  padding
- protected int  pnorm
- protected PoolingType  poolingType
- protected int[]  stride
-
Fields inherited from class org.deeplearning4j.nn.conf.layers.Layer
constraints, iDropout, layerName
-
-
Constructor Summary
Constructors:
- protected  SubsamplingLayer(SubsamplingLayer.BaseSubsamplingBuilder builder)
-
Method Summary
All Methods (Instance Methods, Concrete Methods):
- SubsamplingLayer  clone()
- double  getEps()
- LayerMemoryReport  getMemoryReport(InputType inputType)
  This is a report of the estimated memory consumption for the given layer.
- InputType  getOutputType(int layerIndex, InputType inputType)
  For a given type of input to this layer, what is the type of the output?
- int  getPnorm()
- InputPreProcessor  getPreProcessorForInputType(InputType inputType)
  For the given type of input to this layer, what preprocessor (if any) is required? Returns null if no preprocessor is required, otherwise returns an appropriate InputPreProcessor for this layer, such as a CnnToFeedForwardPreProcessor.
- ParamInitializer  initializer()
- Layer  instantiate(NeuralNetConfiguration conf, Collection<TrainingListener> trainingListeners, int layerIndex, INDArray layerParamsView, boolean initializeParams, DataType networkDataType)
- boolean  isPretrainParam(String paramName)
  Is the specified parameter a layerwise pretraining only parameter? For example, visible bias params in an autoencoder (or decoder params in a variational autoencoder) aren't used during supervised backprop. Layers (like DenseLayer, etc.) with no pretrainable parameters will return false for all (valid) inputs.
- void  setNIn(InputType inputType, boolean override)
  Set the nIn value (number of inputs, or input channels for CNNs) based on the given input type.
-
Methods inherited from class org.deeplearning4j.nn.conf.layers.NoParamLayer
getGradientNormalization, getGradientNormalizationThreshold, getRegularizationByParam
-
Methods inherited from class org.deeplearning4j.nn.conf.layers.Layer
getUpdaterByParam, initializeConstraints, resetLayerDefaultConfig, setDataType
-
Methods inherited from class java.lang.Object
equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
-
Methods inherited from interface org.deeplearning4j.nn.api.TrainingConfig
getLayerName
-
-
-
-
Field Detail
-
convolutionMode
protected ConvolutionMode convolutionMode
-
poolingType
protected PoolingType poolingType
-
kernelSize
protected int[] kernelSize
-
stride
protected int[] stride
-
padding
protected int[] padding
-
dilation
protected int[] dilation
-
pnorm
protected int pnorm
-
eps
protected double eps
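The pnorm and eps fields only matter when the pooling type is PNORM: each window is reduced to its p-norm, (sum |x|^p)^(1/p). A minimal sketch of that reduction, assuming (as is typical for such implementations, though not confirmed here) that eps is a small stabilizing constant:

```java
// Illustrative p-norm pooling over a single window: (sum |x|^p)^(1/p).
// The use of eps as a numerical-stability guard is an assumption based
// on common practice, not DL4J's exact formulation.
public class PnormPool {
    static double pool(double[] window, int p, double eps) {
        double sum = 0.0;
        for (double x : window) {
            sum += Math.pow(Math.abs(x), p);  // accumulate |x|^p
        }
        return Math.pow(sum + eps, 1.0 / p);  // p-th root of the sum
    }

    public static void main(String[] args) {
        // p = 2 reduces to the Euclidean norm of the window values
        System.out.println(pool(new double[]{3.0, 4.0}, 2, 0.0));
    }
}
```

With p = 2 and eps = 0, the window {3, 4} pools to 5, the Euclidean norm; large p approaches max pooling.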
-
cudnnAllowFallback
protected boolean cudnnAllowFallback
-
cnn2dDataFormat
protected CNN2DFormat cnn2dDataFormat
-
DEFAULT_FORMAT
public static final CNN2DFormat DEFAULT_FORMAT
-
avgPoolIncludePadInDivisor
protected boolean avgPoolIncludePadInDivisor
-
-
Constructor Detail
-
SubsamplingLayer
protected SubsamplingLayer(SubsamplingLayer.BaseSubsamplingBuilder builder)
-
-
Method Detail
-
clone
public SubsamplingLayer clone()
-
instantiate
public Layer instantiate(NeuralNetConfiguration conf, Collection<TrainingListener> trainingListeners, int layerIndex, INDArray layerParamsView, boolean initializeParams, DataType networkDataType)
- Specified by:
  instantiate in class Layer
-
initializer
public ParamInitializer initializer()
- Overrides:
  initializer in class NoParamLayer
- Returns:
  The parameter initializer for this model
-
getOutputType
public InputType getOutputType(int layerIndex, InputType inputType)
Description copied from class: Layer
For a given type of input to this layer, what is the type of the output?
- Specified by:
  getOutputType in class Layer
- Parameters:
  layerIndex - Index of the layer
  inputType - Type of input for the layer
- Returns:
  Type of output from the layer
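The output type follows the standard pooling output-size arithmetic, which depends on the layer's convolutionMode. A self-contained sketch of that arithmetic (illustrative only, not DL4J's actual implementation):

```java
// Standard CNN pooling output-size formulas, per spatial dimension.
public class PoolSize {
    // ConvolutionMode.Truncate: floor((in + 2*pad - kernel) / stride) + 1
    static int truncate(int in, int kernel, int stride, int pad) {
        return (in + 2 * pad - kernel) / stride + 1;
    }

    // ConvolutionMode.Same: ceil(in / stride); padding is derived internally
    static int same(int in, int stride) {
        return (in + stride - 1) / stride;
    }

    public static void main(String[] args) {
        // 28x28 input, 2x2 pooling with stride 2: output is 14x14
        System.out.println(truncate(28, 2, 2, 0));
        System.out.println(same(28, 2));
    }
}
```

For a 28-wide input with a 2x2 kernel and stride 2, both modes yield 14; the modes differ when the stride does not divide the input size evenly.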
-
setNIn
public void setNIn(InputType inputType, boolean override)
Description copied from class: Layer
Set the nIn value (number of inputs, or input channels for CNNs) based on the given input type.
- Overrides:
  setNIn in class NoParamLayer
- Parameters:
  inputType - Input type for this layer
  override - If false: only set the nIn value if it's not already set. If true: set it regardless of whether it's already set or not.
-
getPreProcessorForInputType
public InputPreProcessor getPreProcessorForInputType(InputType inputType)
Description copied from class: Layer
For the given type of input to this layer, what preprocessor (if any) is required? Returns null if no preprocessor is required, otherwise returns an appropriate InputPreProcessor for this layer, such as a CnnToFeedForwardPreProcessor.
- Specified by:
  getPreProcessorForInputType in class Layer
- Parameters:
  inputType - InputType to this layer
- Returns:
  Null if no preprocessor is required, otherwise the type of preprocessor necessary for this layer/input combination
-
isPretrainParam
public boolean isPretrainParam(String paramName)
Description copied from class: Layer
Is the specified parameter a layerwise pretraining only parameter? For example, visible bias params in an autoencoder (or decoder params in a variational autoencoder) aren't used during supervised backprop. Layers (like DenseLayer, etc.) with no pretrainable parameters will return false for all (valid) inputs.
- Specified by:
  isPretrainParam in interface TrainingConfig
- Overrides:
  isPretrainParam in class NoParamLayer
- Parameters:
  paramName - Parameter name/key
- Returns:
  True if the parameter is for layerwise pretraining only, false otherwise
-
getMemoryReport
public LayerMemoryReport getMemoryReport(InputType inputType)
Description copied from class: Layer
This is a report of the estimated memory consumption for the given layer.
- Specified by:
  getMemoryReport in class Layer
- Parameters:
  inputType - Input type to the layer. Memory consumption is often a function of the input type.
- Returns:
  Memory report for the layer
-
getPnorm
public int getPnorm()
-
getEps
public double getEps()
-
-