public abstract class NoParamLayer extends Layer
Nested classes/interfaces inherited from class Layer: Layer.Builder<T extends Layer.Builder<T>>

Fields inherited from class Layer: constraints, iDropout, layerName
Modifier | Constructor and Description |
---|---|
protected | NoParamLayer(Layer.Builder builder) |
Modifier and Type | Method and Description |
---|---|
GradientNormalization | getGradientNormalization() |
double | getGradientNormalizationThreshold() |
List<Regularization> | getRegularizationByParam(String paramName) Get the regularization types (l1/l2/weight decay) for the given parameter. |
ParamInitializer | initializer() |
boolean | isPretrainParam(String paramName) Is the specified parameter a layerwise pretraining only parameter? For example, visible bias params in an autoencoder (or decoder params in a variational autoencoder) aren't used during supervised backprop. Layers (like DenseLayer, etc.) with no pretrainable parameters will return false for all (valid) inputs. |
void | setNIn(InputType inputType, boolean override) Set the nIn value (number of inputs, or input channels for CNNs) based on the given input type. |
Methods inherited from class Layer: clone, getMemoryReport, getOutputType, getPreProcessorForInputType, getUpdaterByParam, initializeConstraints, instantiate, resetLayerDefaultConfig, setDataType

Methods inherited from class java.lang.Object: equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface TrainingConfig: getLayerName
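For orientation, here is a minimal, hedged usage sketch of the methods summarized above. It assumes ActivationLayer is available as a concrete NoParamLayer subclass (a layer that defines no trainable parameters of its own); the printed values are illustrative rather than asserted.

```java
import org.deeplearning4j.nn.conf.layers.ActivationLayer;
import org.deeplearning4j.nn.conf.layers.NoParamLayer;
import org.nd4j.linalg.activations.Activation;

public class NoParamLayerSketch {
    public static void main(String[] args) {
        // Assumption: ActivationLayer is one of the concrete no-parameter layer types
        NoParamLayer layer = new ActivationLayer.Builder()
                .activation(Activation.RELU)
                .build();

        // Parameter initializer for a layer that defines no parameters of its own
        System.out.println(layer.initializer());

        // Gradient normalization settings reported by the layer configuration
        System.out.println(layer.getGradientNormalization());
        System.out.println(layer.getGradientNormalizationThreshold());

        // Regularization lookup by parameter name ("W", "b", etc.); a layer without
        // parameters has nothing to regularize
        System.out.println(layer.getRegularizationByParam("W"));
    }
}
```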
protected NoParamLayer(Layer.Builder builder)
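Because the constructor is protected, it is only reachable from subclasses. A hypothetical sketch follows; the class name MyNoOpLayer is made up for illustration, and the class is kept abstract so the remaining abstract methods of Layer do not have to be shown.

```java
import org.deeplearning4j.nn.conf.layers.Layer;
import org.deeplearning4j.nn.conf.layers.NoParamLayer;

// Hypothetical subclass: a no-parameter layer passes its builder up so that the
// shared configuration (layer name, dropout, constraints) is handled by Layer.
public abstract class MyNoOpLayer extends NoParamLayer {

    protected MyNoOpLayer(Layer.Builder<?> builder) {
        super(builder);
    }
}
```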
public ParamInitializer initializer()

Specified by: initializer in class Layer
public void setNIn(InputType inputType, boolean override)

Description copied from class: Layer
Set the nIn value (number of inputs, or input channels for CNNs) based on the given input type.
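A small sketch of how this might be called during configuration. InputType.feedForward(784) and the ActivationLayer instance are illustrative assumptions; for a layer without weights the call typically has nothing to resize.

```java
import org.deeplearning4j.nn.conf.inputs.InputType;
import org.deeplearning4j.nn.conf.layers.ActivationLayer;
import org.deeplearning4j.nn.conf.layers.NoParamLayer;
import org.nd4j.linalg.activations.Activation;

public class SetNInSketch {
    public static void main(String[] args) {
        NoParamLayer layer = new ActivationLayer.Builder()
                .activation(Activation.IDENTITY)
                .build();

        // Propagate the input size from a feed-forward input type; override = true
        // forces the value even if one was already set. For a layer with no weight
        // matrix there is usually nothing that depends on nIn.
        layer.setNIn(InputType.feedForward(784), true);
    }
}
```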
public List<Regularization> getRegularizationByParam(String paramName)

Description copied from class: Layer
Get the regularization types (l1/l2/weight decay) for the given parameter.
Specified by: getRegularizationByParam in interface TrainingConfig
Specified by: getRegularizationByParam in class Layer
Parameters:
paramName - Parameter name ("W", "b", etc.)

public GradientNormalization getGradientNormalization()
public double getGradientNormalizationThreshold()
public boolean isPretrainParam(String paramName)

Description copied from class: Layer
Is the specified parameter a layerwise pretraining only parameter? For example, visible bias params in an autoencoder (or decoder params in a variational autoencoder) aren't used during supervised backprop. Layers (like DenseLayer, etc.) with no pretrainable parameters will return false for all (valid) inputs.
Specified by: isPretrainParam in interface TrainingConfig
Specified by: isPretrainParam in class Layer
Parameters:
paramName - Parameter name/key
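As a concrete illustration of the contract described above (rather than of NoParamLayer itself, which defines no parameters at all), here is a hedged sketch using DenseLayer, whose "W" and "b" parameters are not pretraining-only.

```java
import org.deeplearning4j.nn.conf.layers.DenseLayer;

public class IsPretrainParamSketch {
    public static void main(String[] args) {
        // DenseLayer has "W" and "b" parameters, but neither is used only during
        // layerwise pretraining, so the documented contract implies false for both.
        DenseLayer dense = new DenseLayer.Builder()
                .nIn(10)
                .nOut(5)
                .build();

        System.out.println(dense.isPretrainParam("W")); // expected: false
        System.out.println(dense.isPretrainParam("b")); // expected: false
    }
}
```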