Class VariationalAutoencoder

java.lang.Object
  org.deeplearning4j.nn.conf.layers.Layer
    org.deeplearning4j.nn.conf.layers.BaseLayer
      org.deeplearning4j.nn.conf.layers.FeedForwardLayer
        org.deeplearning4j.nn.conf.layers.BasePretrainNetwork
          org.deeplearning4j.nn.conf.layers.variational.VariationalAutoencoder

- All Implemented Interfaces:
Serializable, Cloneable, TrainingConfig

public class VariationalAutoencoder
extends BasePretrainNetwork

- See Also:
- Serialized Form
-
Nested Class Summary

static class  VariationalAutoencoder.Builder
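The nested Builder class is the usual way to construct this layer as part of a network configuration. The following is a minimal sketch, assuming a typical DL4J setup; the layer sizes, activations, seed, and reconstruction distribution shown here are illustrative choices, not defaults:

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.variational.GaussianReconstructionDistribution;
import org.deeplearning4j.nn.conf.layers.variational.VariationalAutoencoder;
import org.nd4j.linalg.activations.Activation;

public class VaeConfigExample {
    public static void main(String[] args) {
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .seed(12345)
                .list()
                .layer(new VariationalAutoencoder.Builder()
                        .nIn(784)                    // input size (e.g. 28x28 pixel images)
                        .nOut(32)                    // size of the latent space z
                        .encoderLayerSizes(256, 128) // encoder hidden layer sizes
                        .decoderLayerSizes(128, 256) // decoder hidden layer sizes
                        .activation(Activation.LEAKYRELU)
                        // distribution p(x|z) used to score reconstructions
                        .reconstructionDistribution(
                                new GaussianReconstructionDistribution(Activation.TANH))
                        .build())
                .build();
        System.out.println(conf.toJson());
    }
}
```

The decoder parameters configured here are used only during unsupervised (layerwise) pretraining; see isPretrainParam below.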
-
Field Summary
-
Fields inherited from class org.deeplearning4j.nn.conf.layers.BasePretrainNetwork
lossFunction, visibleBiasInit
-
Fields inherited from class org.deeplearning4j.nn.conf.layers.FeedForwardLayer
nIn, nOut, timeDistributedFormat
-
Fields inherited from class org.deeplearning4j.nn.conf.layers.BaseLayer
activationFn, biasInit, biasUpdater, gainInit, gradientNormalization, gradientNormalizationThreshold, iUpdater, regularization, regularizationBias, weightInitFn, weightNoise
-
Fields inherited from class org.deeplearning4j.nn.conf.layers.Layer
constraints, iDropout, layerName
-
-
Method Summary
LayerMemoryReport  getMemoryReport(InputType inputType)
    This is a report of the estimated memory consumption for the given layer.

ParamInitializer   initializer()

Layer              instantiate(NeuralNetConfiguration conf, Collection<TrainingListener> trainingListeners, int layerIndex, INDArray layerParamsView, boolean initializeParams, DataType networkDataType)

boolean            isPretrainParam(String paramName)
    Is the specified parameter a layerwise pretraining-only parameter? For example, visible bias params in an autoencoder (or decoder params in a variational autoencoder) aren't used during supervised backprop. Layers (like DenseLayer, etc.) with no pretrainable parameters will return false for all (valid) inputs.
-
Methods inherited from class org.deeplearning4j.nn.conf.layers.FeedForwardLayer
getOutputType, getPreProcessorForInputType, setNIn
-
Methods inherited from class org.deeplearning4j.nn.conf.layers.BaseLayer
clone, getGradientNormalization, getRegularizationByParam, getUpdaterByParam, resetLayerDefaultConfig
-
Methods inherited from class org.deeplearning4j.nn.conf.layers.Layer
initializeConstraints, setDataType
-
Methods inherited from class java.lang.Object
equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
-
Methods inherited from interface org.deeplearning4j.nn.api.TrainingConfig
getGradientNormalizationThreshold, getLayerName
-
-
Method Detail
-
instantiate
public Layer instantiate(NeuralNetConfiguration conf, Collection<TrainingListener> trainingListeners, int layerIndex, INDArray layerParamsView, boolean initializeParams, DataType networkDataType)
- Specified by:
instantiate in class Layer
-
initializer
public ParamInitializer initializer()
- Specified by:
initializer in class Layer
- Returns:
- The parameter initializer for this model
-
isPretrainParam
public boolean isPretrainParam(String paramName)
Description copied from class: Layer
Is the specified parameter a layerwise pretraining-only parameter?
For example, visible bias params in an autoencoder (or decoder params in a variational autoencoder) aren't used during supervised backprop.
Layers (like DenseLayer, etc.) with no pretrainable parameters will return false for all (valid) inputs.
- Specified by:
isPretrainParam in interface TrainingConfig
- Overrides:
isPretrainParam in class BasePretrainNetwork
- Parameters:
paramName - Parameter name/key
- Returns:
- True if the parameter is for layerwise pretraining only, false otherwise
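A short usage sketch of this distinction. The parameter keys queried here ("e0W" for the first encoder weight matrix, "d0W" for the first decoder weight matrix) are assumptions based on the naming pattern of the VAE parameter initializer, not values documented on this page:

```java
import org.deeplearning4j.nn.conf.layers.variational.VariationalAutoencoder;

public class PretrainParamCheck {
    public static void main(String[] args) {
        VariationalAutoencoder vae = new VariationalAutoencoder.Builder()
                .nIn(784).nOut(32)
                .encoderLayerSizes(256)
                .decoderLayerSizes(256)
                .build();

        // Encoder parameters participate in supervised backprop;
        // decoder parameters exist only for layerwise pretraining.
        // Key names ("e0W", "d0W") are assumed, per the note above.
        System.out.println(vae.isPretrainParam("e0W"));
        System.out.println(vae.isPretrainParam("d0W"));
    }
}
```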
-
getMemoryReport
public LayerMemoryReport getMemoryReport(InputType inputType)
Description copied from class: Layer
This is a report of the estimated memory consumption for the given layer.
- Specified by:
getMemoryReport in class Layer
- Parameters:
inputType- Input type to the layer. Memory consumption is often a function of the input type- Returns:
- Memory report for the layer
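A brief sketch of querying the memory report for a feed-forward input, assuming the same illustrative layer sizes used earlier; the printed format of LayerMemoryReport is not specified on this page:

```java
import org.deeplearning4j.nn.conf.inputs.InputType;
import org.deeplearning4j.nn.conf.layers.variational.VariationalAutoencoder;
import org.deeplearning4j.nn.conf.memory.LayerMemoryReport;

public class VaeMemoryReportExample {
    public static void main(String[] args) {
        VariationalAutoencoder vae = new VariationalAutoencoder.Builder()
                .nIn(784).nOut(32)
                .encoderLayerSizes(256)
                .decoderLayerSizes(256)
                .build();

        // Memory consumption is a function of the input type,
        // here a feed-forward input of width 784
        LayerMemoryReport report = vae.getMemoryReport(InputType.feedForward(784));
        System.out.println(report);
    }
}
```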