public abstract class BaseWrapperLayer extends Layer
Nested classes/interfaces inherited from class Layer: Layer.Builder<T extends Layer.Builder<T>>
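No class-level description survives on this page, but the shape of the API makes the intent clear: a BaseWrapperLayer holds another Layer configuration in its `underlying` field and delegates the whole Layer contract to it, so subclasses only override the behaviour they want to change. Below is a minimal sketch of using a concrete wrapper inside a network configuration; FrozenLayerWithBackprop is assumed here to be one of DL4J's BaseWrapperLayer subclasses (that relationship is not shown on this page), and the layer sizes are arbitrary.

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.conf.layers.misc.FrozenLayerWithBackprop;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class WrapperLayerExample {
    public static void main(String[] args) {
        // The wrapper is configured exactly like the layer it wraps; every
        // Layer method called on the wrapper delegates to the wrapped DenseLayer.
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .list()
                .layer(new FrozenLayerWithBackprop(
                        new DenseLayer.Builder().nIn(784).nOut(128)
                                .activation(Activation.RELU).build()))
                .layer(new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .activation(Activation.SOFTMAX).nIn(128).nOut(10).build())
                .build();
        System.out.println(conf.toJson());
    }
}
```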
Modifier and Type | Field and Description |
---|---|
protected Layer | underlying |
Fields inherited from class Layer: constraints, iDropout, layerName
Modifier | Constructor and Description |
---|---|
protected | BaseWrapperLayer() |
protected | BaseWrapperLayer(Layer.Builder builder) |
public | BaseWrapperLayer(Layer underlying) |
Modifier and Type | Method and Description |
---|---|
GradientNormalization | getGradientNormalization() |
double | getGradientNormalizationThreshold() |
LayerMemoryReport | getMemoryReport(InputType inputType) - This is a report of the estimated memory consumption for the given layer. |
InputType | getOutputType(int layerIndex, InputType inputType) - For a given type of input to this layer, what is the type of the output? |
InputPreProcessor | getPreProcessorForInputType(InputType inputType) - For the given type of input to this layer, what preprocessor (if any) is required? Returns null if no preprocessor is required; otherwise returns an appropriate InputPreProcessor for this layer, such as a CnnToFeedForwardPreProcessor. |
List<Regularization> | getRegularizationByParam(String paramName) - Get the regularization types (l1/l2/weight decay) for the given parameter. |
ParamInitializer | initializer() |
boolean | isPretrainParam(String paramName) - Is the specified parameter a layerwise pretraining only parameter? For example, visible bias params in an autoencoder (or decoder params in a variational autoencoder) aren't used during supervised backprop. Layers (like DenseLayer, etc.) with no pretrainable parameters return false for all (valid) inputs. |
void | setLayerName(String layerName) |
void | setNIn(InputType inputType, boolean override) - Set the nIn value (number of inputs, or input channels for CNNs) based on the given input type. |
Methods inherited from class Layer: clone, getUpdaterByParam, initializeConstraints, instantiate, resetLayerDefaultConfig, setDataType
Methods inherited from class java.lang.Object: equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Methods inherited from interface TrainingConfig: getLayerName
protected Layer underlying
The wrapped layer configuration to which this wrapper delegates.
protected BaseWrapperLayer(Layer.Builder builder)
protected BaseWrapperLayer()
public BaseWrapperLayer(Layer underlying)
public ParamInitializer initializer()
Specified by: initializer in class Layer
public InputType getOutputType(int layerIndex, InputType inputType)
Description copied from class: Layer. For a given type of input to this layer, what is the type of the output?
Specified by: getOutputType in class Layer
Parameters:
layerIndex - Index of the layer
inputType - Type of input for the layer
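A short sketch of getOutputType through a wrapper, again assuming FrozenLayerWithBackprop as the concrete subclass; under plain delegation the result mirrors what the wrapped DenseLayer would report.

```java
import org.deeplearning4j.nn.conf.inputs.InputType;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.misc.FrozenLayerWithBackprop;

public class OutputTypeExample {
    public static void main(String[] args) {
        FrozenLayerWithBackprop wrapper = new FrozenLayerWithBackprop(
                new DenseLayer.Builder().nIn(128).nOut(64).build());

        // Delegates to the wrapped DenseLayer: feed-forward input of size 128
        // is expected to produce feed-forward output of size 64.
        InputType out = wrapper.getOutputType(0, InputType.feedForward(128));
        System.out.println(out);
    }
}
```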
public void setNIn(InputType inputType, boolean override)
Description copied from class: Layer. Set the nIn value (number of inputs, or input channels for CNNs) based on the given input type.
Specified by: setNIn in class Layer
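And a sketch of setNIn, which lets the configuration infer nIn from the incoming InputType instead of setting it by hand. The same FrozenLayerWithBackprop assumption applies, and getNIn() is assumed from DL4J's usual FeedForwardLayer getters.

```java
import org.deeplearning4j.nn.conf.inputs.InputType;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.misc.FrozenLayerWithBackprop;

public class SetNInExample {
    public static void main(String[] args) {
        // nIn deliberately left unset on the wrapped layer
        DenseLayer dense = new DenseLayer.Builder().nOut(64).build();
        FrozenLayerWithBackprop wrapper = new FrozenLayerWithBackprop(dense);

        // override=true: set nIn even if a value is already present
        wrapper.setNIn(InputType.feedForward(128), true);
        System.out.println(dense.getNIn()); // expected: 128
    }
}
```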
public InputPreProcessor getPreProcessorForInputType(InputType inputType)
Description copied from class: Layer. For the given type of input to this layer, what preprocessor (if any) is required? Returns null if no preprocessor is required; otherwise returns an appropriate InputPreProcessor for this layer, such as a CnnToFeedForwardPreProcessor.
Specified by: getPreProcessorForInputType in class Layer
Parameters:
inputType - InputType to this layer
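A sketch of the preprocessor query: convolutional activations flowing into a wrapped dense layer would need flattening, so a CnnToFeedForwardPreProcessor is the expected answer (same FrozenLayerWithBackprop assumption as above, arbitrary sizes).

```java
import org.deeplearning4j.nn.conf.InputPreProcessor;
import org.deeplearning4j.nn.conf.inputs.InputType;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.misc.FrozenLayerWithBackprop;

public class PreProcessorExample {
    public static void main(String[] args) {
        FrozenLayerWithBackprop wrapper = new FrozenLayerWithBackprop(
                new DenseLayer.Builder().nOut(10).build());

        // Dense (feed-forward) layer fed 8x8x3 convolutional activations:
        // expected to return a CnnToFeedForwardPreProcessor to flatten them.
        InputPreProcessor pre =
                wrapper.getPreProcessorForInputType(InputType.convolutional(8, 8, 3));
        System.out.println(pre);

        // Feed-forward input needs no conversion: expected null.
        System.out.println(wrapper.getPreProcessorForInputType(InputType.feedForward(10)));
    }
}
```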
public List<Regularization> getRegularizationByParam(String paramName)
Description copied from class: Layer. Get the regularization types (l1/l2/weight decay) for the given parameter.
Specified by: getRegularizationByParam in interface TrainingConfig
Specified by: getRegularizationByParam in class Layer
Parameters:
paramName - Parameter name ("W", "b", etc.)
public GradientNormalization getGradientNormalization()

public double getGradientNormalizationThreshold()
public boolean isPretrainParam(String paramName)
Description copied from class: Layer. Is the specified parameter a layerwise pretraining only parameter? For example, visible bias params in an autoencoder (or decoder params in a variational autoencoder) aren't used during supervised backprop. Layers (like DenseLayer, etc.) with no pretrainable parameters return false for all (valid) inputs.
Specified by: isPretrainParam in interface TrainingConfig
Specified by: isPretrainParam in class Layer
Parameters:
paramName - Parameter name/key
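A sketch of isPretrainParam using the doc's own autoencoder example; the visible-bias key "vb" is assumed from DL4J's pretrain parameter conventions, and the results assume plain delegation to the wrapped layer.

```java
import org.deeplearning4j.nn.conf.layers.AutoEncoder;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.misc.FrozenLayerWithBackprop;

public class PretrainParamExample {
    public static void main(String[] args) {
        // Wrapped autoencoder: the visible bias ("vb", key assumed here) is
        // used only during layerwise pretraining, not supervised backprop.
        FrozenLayerWithBackprop ae = new FrozenLayerWithBackprop(
                new AutoEncoder.Builder().nIn(64).nOut(32).build());
        System.out.println(ae.isPretrainParam("vb")); // expected: true

        // Wrapped dense layer: no pretrainable parameters at all.
        FrozenLayerWithBackprop dense = new FrozenLayerWithBackprop(
                new DenseLayer.Builder().nIn(64).nOut(32).build());
        System.out.println(dense.isPretrainParam("W")); // expected: false
    }
}
```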
public LayerMemoryReport getMemoryReport(InputType inputType)
Description copied from class: Layer. This is a report of the estimated memory consumption for the given layer.
Specified by: getMemoryReport in class Layer
Parameters:
inputType - Input type to the layer. Memory consumption is often a function of the input type.

public void setLayerName(String layerName)
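Finally, a sketch combining getMemoryReport and setLayerName; the report's exact contents vary by layer and version, so this only prints it (same FrozenLayerWithBackprop assumption as the earlier sketches).

```java
import org.deeplearning4j.nn.conf.inputs.InputType;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.misc.FrozenLayerWithBackprop;
import org.deeplearning4j.nn.conf.memory.LayerMemoryReport;

public class MemoryReportExample {
    public static void main(String[] args) {
        FrozenLayerWithBackprop wrapper = new FrozenLayerWithBackprop(
                new DenseLayer.Builder().nIn(128).nOut(64).build());
        wrapper.setLayerName("wrapped_dense");

        // Memory consumption is estimated for a specific input type.
        LayerMemoryReport report = wrapper.getMemoryReport(InputType.feedForward(128));
        System.out.println(report);
    }
}
```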