Class AbstractSameDiffLayer

- java.lang.Object
- org.deeplearning4j.nn.conf.layers.Layer
- org.deeplearning4j.nn.conf.layers.samediff.AbstractSameDiffLayer

- All Implemented Interfaces:
  Serializable, Cloneable, TrainingConfig
- Direct Known Subclasses:
  SameDiffLayer, SameDiffOutputLayer

public abstract class AbstractSameDiffLayer extends Layer

- See Also:
  Serialized Form
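In practice this class is used by extending one of its concrete subclasses. The sketch below is a minimal, illustrative custom layer built on SameDiffLayer (a direct known subclass): it implements the two abstract methods documented on this page, defineParameters(SDLayerParams) and initializeParameters(Map), plus getOutputType (abstract in Layer) and SameDiffLayer's defineLayer. The class name MinimalSameDiffLayer and the fields nIn/nOut are hypothetical, and the defineLayer signature with a mask argument is assumed to match recent DL4J releases (older releases omit the mask parameter):

    import java.util.Map;

    import org.deeplearning4j.nn.conf.inputs.InputType;
    import org.deeplearning4j.nn.conf.layers.samediff.SDLayerParams;
    import org.deeplearning4j.nn.conf.layers.samediff.SameDiffLayer;
    import org.deeplearning4j.nn.weights.WeightInit;
    import org.nd4j.autodiff.samediff.SDVariable;
    import org.nd4j.autodiff.samediff.SameDiff;
    import org.nd4j.linalg.api.ndarray.INDArray;

    // Hypothetical minimal layer: out = tanh(in * W + b)
    public class MinimalSameDiffLayer extends SameDiffLayer {

        private final int nIn;
        private final int nOut;

        public MinimalSameDiffLayer(int nIn, int nOut) {
            this.nIn = nIn;
            this.nOut = nOut;
        }

        // Declare parameter names and shapes (see defineParameters on this page)
        @Override
        public void defineParameters(SDLayerParams params) {
            params.addWeightParam("W", nIn, nOut);
            params.addBiasParam("b", 1, nOut);
        }

        // Assign initial values in place; the arrays are views of the
        // network's flat parameter vector
        @Override
        public void initializeParameters(Map<String, INDArray> params) {
            initWeights(nIn, nOut, WeightInit.XAVIER, params.get("W"));
            params.get("b").assign(0.0);
        }

        // Build the forward pass as a SameDiff graph
        @Override
        public SDVariable defineLayer(SameDiff sd, SDVariable layerInput,
                                      Map<String, SDVariable> paramTable, SDVariable mask) {
            SDVariable z = layerInput.mmul(paramTable.get("W")).add(paramTable.get("b"));
            return sd.math.tanh(z);
        }

        @Override
        public InputType getOutputType(int layerIndex, InputType inputType) {
            return InputType.feedForward(nOut);
        }
    }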
Nested Class Summary

Nested Classes
- static class AbstractSameDiffLayer.Builder<T extends AbstractSameDiffLayer.Builder<T>>
-
Field Summary

Fields
- protected IUpdater biasUpdater
- protected GradientNormalization gradientNormalization
- protected double gradientNormalizationThreshold
- protected List<Regularization> regularization
- protected List<Regularization> regularizationBias
- protected IUpdater updater
-
Fields inherited from class org.deeplearning4j.nn.conf.layers.Layer
constraints, iDropout, layerName
-

Constructor Summary

Constructors
- protected AbstractSameDiffLayer()
- protected AbstractSameDiffLayer(AbstractSameDiffLayer.Builder builder)
-
Method Summary

- void applyGlobalConfig(NeuralNetConfiguration.Builder b)
- void applyGlobalConfigToLayer(NeuralNetConfiguration.Builder globalConfig)
- abstract void defineParameters(SDLayerParams params)
  Define the parameters for the network.
- SDLayerParams getLayerParams()
- LayerMemoryReport getMemoryReport(InputType inputType)
  This is a report of the estimated memory consumption for the given layer.
- InputPreProcessor getPreProcessorForInputType(InputType inputType)
  For the given type of input to this layer, what preprocessor (if any) is required? Returns null if no preprocessor is required, otherwise returns an appropriate InputPreProcessor for this layer, such as a CnnToFeedForwardPreProcessor.
- List<Regularization> getRegularizationByParam(String paramName)
  Get the regularization types (l1/l2/weight decay) for the given parameter.
- IUpdater getUpdaterByParam(String paramName)
  Get the updater for the given parameter.
- abstract void initializeParameters(Map<String,INDArray> params)
  Set the initial parameter values for this layer, if required.
- ParamInitializer initializer()
- protected void initWeights(int fanIn, int fanOut, WeightInit weightInit, INDArray array)
- abstract Layer instantiate(NeuralNetConfiguration conf, Collection<TrainingListener> trainingListeners, int layerIndex, INDArray layerParamsView, boolean initializeParams, DataType networkDataType)
- boolean isPretrainParam(String paramName)
  Is the specified parameter a layerwise pretraining only parameter? For example, visible bias params in an autoencoder (or decoder params in a variational autoencoder) aren't used during supervised backprop. Layers (like DenseLayer, etc.) with no pretrainable parameters will return false for all (valid) inputs.
- INDArray onesMaskForInput(INDArray input)
  This method generates an "all ones" mask array for use in the SameDiff model when none is provided.
- char paramReshapeOrder(String param)
  Returns the memory layout ('c' or 'f' order, i.e., row/column major) of the parameters.
- void setNIn(InputType inputType, boolean override)
  Set the nIn value (number of inputs, or input channels for CNNs) based on the given input type.
Methods inherited from class org.deeplearning4j.nn.conf.layers.Layer
clone, getOutputType, initializeConstraints, resetLayerDefaultConfig, setDataType
-
Methods inherited from class java.lang.Object
equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
-
Methods inherited from interface org.deeplearning4j.nn.api.TrainingConfig
getGradientNormalization, getGradientNormalizationThreshold, getLayerName
-
Field Detail
-
regularization
protected List<Regularization> regularization
-
regularizationBias
protected List<Regularization> regularizationBias
-
updater
protected IUpdater updater
-
biasUpdater
protected IUpdater biasUpdater
-
gradientNormalization
protected GradientNormalization gradientNormalization
-
gradientNormalizationThreshold
protected double gradientNormalizationThreshold
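These fields are normally populated through a layer Builder rather than set directly. A hedged sketch, assuming a hypothetical concrete subclass MyCustomLayer whose Builder extends AbstractSameDiffLayer.Builder and exposes setters mirroring the fields above (the setter names updater, biasUpdater, l2, gradientNormalization, and gradientNormalizationThreshold are assumptions, as is MyCustomLayer itself):

    import org.deeplearning4j.nn.conf.GradientNormalization;
    import org.nd4j.linalg.learning.config.Adam;
    import org.nd4j.linalg.learning.config.Sgd;

    // Each builder call corresponds to one of the protected fields
    // documented in this section. MyCustomLayer is hypothetical.
    MyCustomLayer layer = new MyCustomLayer.Builder()
            .updater(new Adam(1e-3))     // 'updater': applied to weight parameters
            .biasUpdater(new Sgd(1e-2))  // 'biasUpdater': overrides 'updater' for biases
            .l2(1e-4)                    // feeds the 'regularization' list
            .gradientNormalization(GradientNormalization.ClipL2PerLayer)
            .gradientNormalizationThreshold(1.0)
            .build();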
-
Constructor Detail
-
AbstractSameDiffLayer
protected AbstractSameDiffLayer(AbstractSameDiffLayer.Builder builder)
-
AbstractSameDiffLayer
protected AbstractSameDiffLayer()
-
Method Detail
-
getRegularizationByParam
public List<Regularization> getRegularizationByParam(String paramName)
Description copied from class: Layer
Get the regularization types (l1/l2/weight decay) for the given parameter. Different parameters may have different regularization types.
- Specified by:
  getRegularizationByParam in interface TrainingConfig
- Specified by:
  getRegularizationByParam in class Layer
- Parameters:
  paramName - Parameter name ("W", "b", etc.)
- Returns:
  Regularization types (if any) for the specified parameter
-
getLayerParams
public SDLayerParams getLayerParams()
-
setNIn
public void setNIn(InputType inputType, boolean override)
Description copied from class: Layer
Set the nIn value (number of inputs, or input channels for CNNs) based on the given input type.
-
getPreProcessorForInputType
public InputPreProcessor getPreProcessorForInputType(InputType inputType)
Description copied from class: Layer
For the given type of input to this layer, what preprocessor (if any) is required? Returns null if no preprocessor is required, otherwise returns an appropriate InputPreProcessor for this layer, such as a CnnToFeedForwardPreProcessor.
- Specified by:
  getPreProcessorForInputType in class Layer
- Parameters:
  inputType - InputType to this layer
- Returns:
  Null if no preprocessor is required, otherwise the type of preprocessor necessary for this layer/input combination
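As an illustration of this contract, a custom subclass that consumes flat activations might override the method roughly as follows. This is a hedged sketch, not the base-class behavior: the decision to flatten CNN input is an assumption about the custom layer, and the three-argument CnnToFeedForwardPreProcessor constructor is taken from the deeplearning4j preprocessor package:

    import org.deeplearning4j.nn.conf.InputPreProcessor;
    import org.deeplearning4j.nn.conf.inputs.InputType;
    import org.deeplearning4j.nn.conf.preprocessor.CnnToFeedForwardPreProcessor;

    // Flatten convolutional activations to [minibatch, h*w*c]; pass
    // feed-forward input through unchanged.
    @Override
    public InputPreProcessor getPreProcessorForInputType(InputType inputType) {
        if (inputType.getType() == InputType.Type.CNN) {
            InputType.InputTypeConvolutional c = (InputType.InputTypeConvolutional) inputType;
            return new CnnToFeedForwardPreProcessor(c.getHeight(), c.getWidth(), c.getChannels());
        }
        return null; // no preprocessor required
    }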
-
applyGlobalConfigToLayer
public void applyGlobalConfigToLayer(NeuralNetConfiguration.Builder globalConfig)
-
defineParameters
public abstract void defineParameters(SDLayerParams params)
Define the parameters for the network. Use SDLayerParams.addWeightParam(String, long...) and SDLayerParams.addBiasParam(String, long...).
- Parameters:
  params - Object used to set parameters for this layer
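A minimal sketch of an implementation for a fully connected layer; nIn and nOut are hypothetical fields of the custom layer, and "W"/"b" are arbitrary keys chosen by the layer author:

    @Override
    public void defineParameters(SDLayerParams params) {
        params.addWeightParam("W", nIn, nOut); // weight matrix, shape [nIn, nOut]
        params.addBiasParam("b", 1, nOut);     // bias row vector, shape [1, nOut]
    }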
-
initializeParameters
public abstract void initializeParameters(Map<String,INDArray> params)
Set the initial parameter values for this layer, if required.
- Parameters:
  params - Parameter arrays that may be initialized
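A matching sketch for the parameters declared above, using the protected initWeights helper documented further down this page (nIn, nOut, "W", and "b" are the same illustrative names):

    @Override
    public void initializeParameters(Map<String, INDArray> params) {
        // The map values are views of the network's parameter array, so
        // initialize them in place.
        initWeights(nIn, nOut, WeightInit.XAVIER, params.get("W"));
        params.get("b").assign(0.0);
    }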
-
instantiate
public abstract Layer instantiate(NeuralNetConfiguration conf, Collection<TrainingListener> trainingListeners, int layerIndex, INDArray layerParamsView, boolean initializeParams, DataType networkDataType)
- Specified by:
  instantiate in class Layer
-
initializer
public ParamInitializer initializer()
- Specified by:
  initializer in class Layer
- Returns:
  The parameter initializer for this model
-
getUpdaterByParam
public IUpdater getUpdaterByParam(String paramName)
Description copied from class: Layer
Get the updater for the given parameter. Typically the same updater will be used for all parameters, but this is not necessarily the case.
- Specified by:
  getUpdaterByParam in interface TrainingConfig
- Overrides:
  getUpdaterByParam in class Layer
- Parameters:
  paramName - Parameter name
- Returns:
  IUpdater for the parameter
-
isPretrainParam
public boolean isPretrainParam(String paramName)
Description copied from class: Layer
Is the specified parameter a layerwise pretraining only parameter? For example, visible bias params in an autoencoder (or decoder params in a variational autoencoder) aren't used during supervised backprop. Layers (like DenseLayer, etc.) with no pretrainable parameters will return false for all (valid) inputs.
- Specified by:
  isPretrainParam in interface TrainingConfig
- Specified by:
  isPretrainParam in class Layer
- Parameters:
  paramName - Parameter name/key
- Returns:
  True if the parameter is for layerwise pretraining only, false otherwise
-
getMemoryReport
public LayerMemoryReport getMemoryReport(InputType inputType)
Description copied from class: Layer
This is a report of the estimated memory consumption for the given layer.
- Specified by:
  getMemoryReport in class Layer
- Parameters:
  inputType - Input type to the layer. Memory consumption is often a function of the input type.
- Returns:
  Memory report for the layer
-
paramReshapeOrder
public char paramReshapeOrder(String param)
Returns the memory layout ('c' or 'f' order, i.e., row/column major) of the parameters. In most cases, this can/should be left as the default.
- Parameters:
  param - Name of the parameter
- Returns:
  Memory layout ('c' or 'f') of the parameter
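A subclass that lays its parameters out in column-major order could override this accordingly; a purely illustrative sketch:

    @Override
    public char paramReshapeOrder(String param) {
        return 'f'; // interpret this layer's parameters in column-major ('f') order
    }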
-
initWeights
protected void initWeights(int fanIn, int fanOut, WeightInit weightInit, INDArray array)
-
applyGlobalConfig
public void applyGlobalConfig(NeuralNetConfiguration.Builder b)
-