public abstract class BaseUpsamplingLayer extends NoParamLayer
| Modifier and Type | Class and Description |
|---|---|
| `protected static class` | `BaseUpsamplingLayer.UpsamplingBuilder<T extends BaseUpsamplingLayer.UpsamplingBuilder<T>>` |

Nested classes/interfaces inherited from class `Layer`: `Layer.Builder<T extends Layer.Builder<T>>`
| Modifier and Type | Field and Description |
|---|---|
| `protected int[]` | `size` |

Fields inherited from class `Layer`: `constraints`, `iDropout`, `layerName`
| Modifier | Constructor and Description |
|---|---|
| `protected` | `BaseUpsamplingLayer(BaseUpsamplingLayer.UpsamplingBuilder builder)` |
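The nested `UpsamplingBuilder<T extends UpsamplingBuilder<T>>` class uses the self-referential ("curiously recurring") generic pattern, so builder methods declared in the base class can return the concrete subclass type and fluent chains stay type-safe. The following is a minimal, self-contained sketch of that pattern only; the class bodies and the `Upsampling2DSketch` names are illustrative, not DL4J's actual implementation:

```java
// Sketch of the self-referential builder pattern used by
// BaseUpsamplingLayer.UpsamplingBuilder. Bodies are illustrative.
abstract class UpsamplingLayerSketch {
    private final int[] size;

    // Mirrors the protected BaseUpsamplingLayer(UpsamplingBuilder) constructor:
    // the layer copies its configuration out of the builder.
    protected UpsamplingLayerSketch(UpsamplingBuilderSketch<?> builder) {
        this.size = builder.size;
    }

    int[] getSize() {
        return size;
    }

    // Builder parameterized on its own concrete subtype, so size(...)
    // returns that subtype rather than the base builder.
    abstract static class UpsamplingBuilderSketch<T extends UpsamplingBuilderSketch<T>> {
        int[] size = {1};

        @SuppressWarnings("unchecked")
        public T size(int... size) {
            this.size = size;
            return (T) this;   // safe by convention: T is the concrete builder class
        }
    }
}

// A concrete layer and its builder, analogous to a concrete upsampling subclass.
class Upsampling2DSketch extends UpsamplingLayerSketch {
    private Upsampling2DSketch(Builder builder) {
        super(builder);
    }

    static class Builder extends UpsamplingLayerSketch.UpsamplingBuilderSketch<Builder> {
        public Upsampling2DSketch build() {
            return new Upsampling2DSketch(this);
        }
    }
}
```

With this shape, `new Upsampling2DSketch.Builder().size(2, 2).build()` compiles without casts, because `size(...)` already returns `Builder`.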
| Modifier and Type | Method and Description |
|---|---|
| `BaseUpsamplingLayer` | `clone()` |
| `double` | `getL1ByParam(String paramName)`<br>Get the L1 coefficient for the given parameter. |
| `double` | `getL2ByParam(String paramName)`<br>Get the L2 coefficient for the given parameter. |
| `InputPreProcessor` | `getPreProcessorForInputType(InputType inputType)`<br>For the given type of input to this layer, what preprocessor (if any) is required? Returns null if no preprocessor is required; otherwise returns an appropriate `InputPreProcessor` for this layer, such as a `CnnToFeedForwardPreProcessor`. |
| `ParamInitializer` | `initializer()` |
| `boolean` | `isPretrainParam(String paramName)`<br>Is the specified parameter a layerwise pretraining only parameter? For example, visible bias params in an autoencoder (or decoder params in a variational autoencoder) aren't used during supervised backprop. Layers (like `DenseLayer`, etc.) with no pretrainable parameters will return false for all (valid) inputs. |
| `void` | `setNIn(InputType inputType, boolean override)`<br>Set the nIn value (number of inputs, or input channels for CNNs) based on the given input type. |
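The documented contract of `setNIn(InputType, boolean)` is that with `override == false` the nIn value is set only when it is not already set, while `override == true` always replaces it. The following self-contained sketch illustrates just that override-flag contract; the `NInHolder` class and its "unset" convention are hypothetical, not DL4J code (and an upsampling layer, having no parameters, mainly derives nIn from the input shape):

```java
// Sketch of the documented setNIn contract: override == false sets the value
// only when it is still unset; override == true always replaces it.
class NInHolder {
    private long nIn = -1;   // -1 marks "not yet set" (illustrative convention)

    void setNIn(long nInFromInputType, boolean override) {
        if (override || nIn <= 0) {
            nIn = nInFromInputType;
        }
    }

    long getNIn() {
        return nIn;
    }
}
```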
Methods inherited from class `NoParamLayer`: `getGradientNormalization`, `getGradientNormalizationThreshold`, `isPretrain`

Methods inherited from class `Layer`: `getMemoryReport`, `getOutputType`, `getUpdaterByParam`, `initializeConstraints`, `instantiate`, `resetLayerDefaultConfig`, `setPretrain`

Methods inherited from class `java.lang.Object`: `equals`, `finalize`, `getClass`, `hashCode`, `notify`, `notifyAll`, `toString`, `wait`, `wait`, `wait`

Methods inherited from interface `TrainingConfig`: `getLayerName`
`protected BaseUpsamplingLayer(BaseUpsamplingLayer.UpsamplingBuilder builder)`
`public BaseUpsamplingLayer clone()`

`public ParamInitializer initializer()`

Overrides: `initializer` in class `NoParamLayer`
`public void setNIn(InputType inputType, boolean override)`

Description copied from class: `Layer`

Overrides: `setNIn` in class `NoParamLayer`

Parameters:
- `inputType` - Input type for this layer
- `override` - If false: only set the nIn value if it's not already set. If true: set it regardless of whether it's already set or not.

`public InputPreProcessor getPreProcessorForInputType(InputType inputType)`
Description copied from class: `Layer`

For the given type of input to this layer, what preprocessor (if any) is required? Returns null if no preprocessor is required; otherwise returns an appropriate `InputPreProcessor` for this layer, such as a `CnnToFeedForwardPreProcessor`.

Overrides: `getPreProcessorForInputType` in class `Layer`

Parameters:
- `inputType` - InputType to this layer

`public double getL1ByParam(String paramName)`
Description copied from class: `Layer`

Get the L1 coefficient for the given parameter.

Specified by: `getL1ByParam` in interface `TrainingConfig`

Overrides: `getL1ByParam` in class `NoParamLayer`

Parameters:
- `paramName` - Parameter name

`public double getL2ByParam(String paramName)`
Description copied from class: `Layer`

Get the L2 coefficient for the given parameter.

Specified by: `getL2ByParam` in interface `TrainingConfig`

Overrides: `getL2ByParam` in class `NoParamLayer`

Parameters:
- `paramName` - Parameter name

`public boolean isPretrainParam(String paramName)`
Description copied from class: `Layer`

Is the specified parameter a layerwise pretraining only parameter? For example, visible bias params in an autoencoder (or decoder params in a variational autoencoder) aren't used during supervised backprop. Layers (like `DenseLayer`, etc.) with no pretrainable parameters will return false for all (valid) inputs.

Specified by: `isPretrainParam` in interface `TrainingConfig`

Overrides: `isPretrainParam` in class `NoParamLayer`

Parameters:
- `paramName` - Parameter name/key

Copyright © 2018. All rights reserved.