public abstract class NoParamLayer extends Layer

Nested classes/interfaces inherited from class Layer:
Layer.Builder<T extends Layer.Builder<T>>

Fields inherited from class Layer:
constraints, iDropout, layerName
Modifier | Constructor and Description
---|---
protected | NoParamLayer(Layer.Builder builder)
Modifier and Type | Method and Description
---|---
double | getL1ByParam(String paramName) - Get the L1 coefficient for the given parameter.
double | getL2ByParam(String paramName) - Get the L2 coefficient for the given parameter.
ParamInitializer | initializer()
boolean | isPretrainParam(String paramName) - Is the specified parameter a layerwise pretraining only parameter? For example, visible bias params in an autoencoder (or, decoder params in a variational autoencoder) aren't used during supervised backprop. Layers (like DenseLayer, etc) with no pretrainable parameters will return false for all (valid) inputs.
void | setNIn(InputType inputType, boolean override) - Set the nIn value (number of inputs, or input channels for CNNs) based on the given input type.
Methods inherited from class Layer:
clone, getMemoryReport, getOutputType, getPreProcessorForInputType, getUpdaterByParam, initializeConstraints, instantiate, resetLayerDefaultConfig
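The point of `NoParamLayer` is that a layer with no trainable parameters can answer every per-parameter query trivially. Below is a simplified, self-contained sketch of that contract; the class names (`SketchLayer`, `SketchNoParamLayer`) are illustrative stand-ins, not the real DL4J classes.

```java
// Simplified sketch (NOT the actual DL4J classes) of the contract that
// NoParamLayer provides to its subclasses: with no trainable parameters,
// per-parameter regularization is zero and nothing is pretrain-only.
abstract class SketchLayer {
    abstract double getL1ByParam(String paramName);
    abstract double getL2ByParam(String paramName);
    abstract boolean isPretrainParam(String paramName);
}

// Analogue of NoParamLayer: every per-parameter query has a trivial answer.
class SketchNoParamLayer extends SketchLayer {
    @Override
    double getL1ByParam(String paramName) {
        return 0.0;   // no parameters, so no L1 penalty can apply
    }

    @Override
    double getL2ByParam(String paramName) {
        return 0.0;   // likewise for L2
    }

    @Override
    boolean isPretrainParam(String paramName) {
        return false; // no parameters to pretrain
    }
}

public class NoParamLayerSketch {
    public static void main(String[] args) {
        SketchLayer layer = new SketchNoParamLayer();
        System.out.println(layer.getL1ByParam("W"));    // 0.0
        System.out.println(layer.isPretrainParam("W")); // false
    }
}
```

Subclasses (activation-style or reshaping layers, for example) then only need to implement the behavior-related methods, since all parameter bookkeeping is already answered.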
Constructor Detail

protected NoParamLayer(Layer.Builder builder)
Method Detail

public ParamInitializer initializer()
Overrides:
initializer in class Layer
public void setNIn(InputType inputType, boolean override)
Description copied from class: Layer
Set the nIn value (number of inputs, or input channels for CNNs) based on the given input type.
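The `override` flag controls whether an input type may replace an nIn value that was already configured. The following self-contained sketch illustrates that idea under stated assumptions; `SizedInputType` and `ConfigurableLayer` are hypothetical names, not DL4J's real `InputType` API.

```java
// Hypothetical sketch of setNIn(InputType, boolean) semantics: nIn is
// inferred from the input type's size, and an already-set value is only
// replaced when override is true. Names here are illustrative only.
class SizedInputType {
    final long size;           // e.g. feed-forward width, or CNN channels
    SizedInputType(long size) { this.size = size; }
}

class ConfigurableLayer {
    long nIn = -1;             // -1 means "not configured yet"

    void setNIn(SizedInputType inputType, boolean override) {
        if (nIn < 0 || override) {
            nIn = inputType.size;  // adopt the size implied by the input
        }
    }
}

public class SetNInSketch {
    public static void main(String[] args) {
        ConfigurableLayer layer = new ConfigurableLayer();
        layer.setNIn(new SizedInputType(128), false);
        System.out.println(layer.nIn);  // 128: inferred from input type
        layer.setNIn(new SizedInputType(64), false);
        System.out.println(layer.nIn);  // 128: kept, override is false
        layer.setNIn(new SizedInputType(64), true);
        System.out.println(layer.nIn);  // 64: override forces replacement
    }
}
```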
public double getL1ByParam(String paramName)
Description copied from class: Layer
Get the L1 coefficient for the given parameter.
Overrides:
getL1ByParam in class Layer
Parameters:
paramName - Parameter name

public double getL2ByParam(String paramName)
Description copied from class: Layer
Get the L2 coefficient for the given parameter.
Overrides:
getL2ByParam in class Layer
Parameters:
paramName - Parameter name

public boolean isPretrainParam(String paramName)
Description copied from class: Layer
Is the specified parameter a layerwise pretraining only parameter? For example, visible bias params in an autoencoder (or, decoder params in a variational autoencoder) aren't used during supervised backprop. Layers (like DenseLayer, etc) with no pretrainable parameters will return false for all (valid) inputs.
Overrides:
isPretrainParam in class Layer
Parameters:
paramName - Parameter name/key

Copyright © 2018. All rights reserved.