public class SubsamplingLayer extends NoParamLayer
Modifier and Type | Class and Description |
---|---|
protected static class | SubsamplingLayer.BaseSubsamplingBuilder<T extends SubsamplingLayer.BaseSubsamplingBuilder<T>> |
static class | SubsamplingLayer.Builder |
static class | SubsamplingLayer.PoolingType |
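The nested Builder class is the intended way to configure this layer. A minimal sketch, assuming the standard org.deeplearning4j.nn.conf.layers package and the Builder's kernelSize/stride setters; the variable name pooling is illustrative:

```java
import org.deeplearning4j.nn.conf.layers.SubsamplingLayer;

// 2x2 max pooling with stride 2, configured via the nested Builder
SubsamplingLayer pooling = new SubsamplingLayer.Builder(SubsamplingLayer.PoolingType.MAX)
        .kernelSize(2, 2)
        .stride(2, 2)
        .build();
```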
Modifier and Type | Field and Description |
---|---|
protected boolean | avgPoolIncludePadInDivisor |
protected CNN2DFormat | cnn2dDataFormat |
protected ConvolutionMode | convolutionMode |
protected boolean | cudnnAllowFallback |
static CNN2DFormat | DEFAULT_FORMAT |
protected int[] | dilation |
protected double | eps |
protected int[] | kernelSize |
protected int[] | padding |
protected int | pnorm |
protected PoolingType | poolingType |
protected int[] | stride |
Fields inherited from class Layer: constraints, iDropout, layerName
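The pnorm and eps fields apply only to p-norm pooling. A minimal sketch, assuming the Builder exposes pnorm(int) and eps(double) setters matching the getPnorm()/getEps() accessors listed below:

```java
import org.deeplearning4j.nn.conf.layers.SubsamplingLayer;

// p-norm pooling: aggregates values as (sum |x|^p)^(1/p); eps is a numerical stability term
SubsamplingLayer pNormPool = new SubsamplingLayer.Builder(SubsamplingLayer.PoolingType.PNORM)
        .kernelSize(2, 2)
        .stride(2, 2)
        .pnorm(2)        // p = 2
        .eps(1e-8)       // guards against division by zero
        .build();
```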
Modifier | Constructor and Description |
---|---|
protected | SubsamplingLayer(SubsamplingLayer.BaseSubsamplingBuilder builder) |
Modifier and Type | Method and Description |
---|---|
SubsamplingLayer | clone() |
double | getEps() |
LayerMemoryReport | getMemoryReport(InputType inputType): This is a report of the estimated memory consumption for the given layer. |
InputType | getOutputType(int layerIndex, InputType inputType): For a given type of input to this layer, what is the type of the output? |
int | getPnorm() |
InputPreProcessor | getPreProcessorForInputType(InputType inputType): For the given type of input to this layer, what preprocessor (if any) is required? Returns null if no preprocessor is required, otherwise returns an appropriate InputPreProcessor for this layer, such as a CnnToFeedForwardPreProcessor. |
ParamInitializer | initializer() |
Layer | instantiate(NeuralNetConfiguration conf, Collection<TrainingListener> trainingListeners, int layerIndex, INDArray layerParamsView, boolean initializeParams, DataType networkDataType) |
boolean | isPretrainParam(String paramName): Is the specified parameter a layerwise pretraining only parameter? For example, visible bias params in an autoencoder (or, decoder params in a variational autoencoder) aren't used during supervised backprop. Layers (like DenseLayer, etc) with no pretrainable parameters will return false for all (valid) inputs. |
void | setNIn(InputType inputType, boolean override): Set the nIn value (number of inputs, or input channels for CNNs) based on the given input type. |
Methods inherited from class NoParamLayer: getGradientNormalization, getGradientNormalizationThreshold, getRegularizationByParam
Methods inherited from class Layer: getUpdaterByParam, initializeConstraints, resetLayerDefaultConfig, setDataType
Methods inherited from class java.lang.Object: equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Methods inherited from interface TrainingConfig: getLayerName
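In practice the configuration-side methods above (setNIn, getOutputType, getPreProcessorForInputType) are invoked by the framework when the layer is placed in a network configuration and setInputType is used. A minimal sketch of that usage, assuming the standard DL4J configuration API; the layer sizes and MNIST-style input shape are illustrative:

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.inputs.InputType;
import org.deeplearning4j.nn.conf.layers.ConvolutionLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.conf.layers.SubsamplingLayer;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
        .list()
        .layer(new ConvolutionLayer.Builder(5, 5)
                .nIn(1).nOut(20).activation(Activation.RELU).build())
        .layer(new SubsamplingLayer.Builder(SubsamplingLayer.PoolingType.MAX)
                .kernelSize(2, 2).stride(2, 2).build())
        .layer(new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                .nOut(10).activation(Activation.SOFTMAX).build())
        // setInputType drives setNIn/getOutputType/getPreProcessorForInputType for each layer
        .setInputType(InputType.convolutionalFlat(28, 28, 1))
        .build();
```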
protected ConvolutionMode convolutionMode
protected PoolingType poolingType
protected int[] kernelSize
protected int[] stride
protected int[] padding
protected int[] dilation
protected int pnorm
protected double eps
protected boolean cudnnAllowFallback
protected CNN2DFormat cnn2dDataFormat
public static final CNN2DFormat DEFAULT_FORMAT
protected boolean avgPoolIncludePadInDivisor
protected SubsamplingLayer(SubsamplingLayer.BaseSubsamplingBuilder builder)
public SubsamplingLayer clone()
public Layer instantiate(NeuralNetConfiguration conf, Collection<TrainingListener> trainingListeners, int layerIndex, INDArray layerParamsView, boolean initializeParams, DataType networkDataType)
Specified by: instantiate in class Layer
public ParamInitializer initializer()
Overrides: initializer in class NoParamLayer
public InputType getOutputType(int layerIndex, InputType inputType)
Description copied from class: Layer
For a given type of input to this layer, what is the type of the output?
Specified by: getOutputType in class Layer
Parameters:
layerIndex - Index of the layer
inputType - Type of input for the layer

public void setNIn(InputType inputType, boolean override)
Description copied from class: Layer
Set the nIn value (number of inputs, or input channels for CNNs) based on the given input type
Overrides: setNIn in class NoParamLayer
Parameters:
inputType - Input type for this layer
override - If false: only set the nIn value if it's not already set. If true: set it regardless of whether it's already set or not.

public InputPreProcessor getPreProcessorForInputType(InputType inputType)
Description copied from class: Layer
For the given type of input to this layer, what preprocessor (if any) is required? Returns null if no preprocessor is required, otherwise returns an appropriate InputPreProcessor for this layer, such as a CnnToFeedForwardPreProcessor
Specified by: getPreProcessorForInputType in class Layer
Parameters:
inputType - InputType to this layer

public boolean isPretrainParam(String paramName)
Description copied from class: Layer
Is the specified parameter a layerwise pretraining only parameter? For example, visible bias params in an autoencoder (or, decoder params in a variational autoencoder) aren't used during supervised backprop. Layers (like DenseLayer, etc) with no pretrainable parameters will return false for all (valid) inputs.
Specified by: isPretrainParam in interface TrainingConfig
Overrides: isPretrainParam in class NoParamLayer
Parameters:
paramName - Parameter name/key

public LayerMemoryReport getMemoryReport(InputType inputType)
Description copied from class: Layer
This is a report of the estimated memory consumption for the given layer
Specified by: getMemoryReport in class Layer
Parameters:
inputType - Input type to the layer. Memory consumption is often a function of the input type

public int getPnorm()

public double getEps()
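A minimal sketch of calling getOutputType and getMemoryReport directly; the 28x28 single-channel input and the expected 14x14 output for 2x2, stride-2 max pooling are illustrative assumptions:

```java
import org.deeplearning4j.nn.conf.inputs.InputType;
import org.deeplearning4j.nn.conf.layers.SubsamplingLayer;
import org.deeplearning4j.nn.conf.memory.LayerMemoryReport;

SubsamplingLayer pool = new SubsamplingLayer.Builder(SubsamplingLayer.PoolingType.MAX)
        .kernelSize(2, 2)
        .stride(2, 2)
        .build();

InputType in = InputType.convolutional(28, 28, 1);    // height, width, channels
InputType out = pool.getOutputType(0, in);            // expected: convolutional, 14 x 14 x 1
LayerMemoryReport report = pool.getMemoryReport(in);  // estimated memory use for this layer
```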