public class DummyConfig extends Object implements TrainingConfig
| Constructor and Description |
| --- |
| DummyConfig() |
| Modifier and Type | Method and Description |
| --- | --- |
| GradientNormalization | getGradientNormalization() |
| double | getGradientNormalizationThreshold() |
| String | getLayerName() |
| List<Regularization> | getRegularizationByParam(String paramName): Get the regularization types (l1/l2/weight decay) for the given parameter. |
| IUpdater | getUpdaterByParam(String paramName): Get the updater for the given parameter. |
| boolean | isPretrainParam(String paramName): Is the specified parameter a layerwise pretraining-only parameter? For example, visible bias params in an autoencoder (or decoder params in a variational autoencoder) aren't used during supervised backprop. Layers (like DenseLayer, etc.) with no pretrainable parameters will return false for all (valid) inputs. |
| void | setDataType(org.nd4j.linalg.api.buffer.DataType dataType) |
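Taken together, the summary above means a DummyConfig can stand in wherever a TrainingConfig is expected. Below is a minimal sketch of exercising the accessors, assuming standard DL4J/ND4J import locations; the parameter name "W" is illustrative only, and this page does not document what DummyConfig actually returns from these calls:

```java
import java.util.List;

import org.deeplearning4j.nn.api.TrainingConfig;
import org.nd4j.linalg.learning.config.IUpdater;
import org.nd4j.linalg.learning.regularization.Regularization;

public class DummyConfigExample {
    public static void main(String[] args) {
        // DummyConfig is constructed without arguments and used through the
        // TrainingConfig interface it implements.
        TrainingConfig config = new DummyConfig();

        // Per-parameter queries; "W" is an illustrative parameter name.
        String layerName = config.getLayerName();
        IUpdater updater = config.getUpdaterByParam("W");
        List<Regularization> regularization = config.getRegularizationByParam("W");
        boolean pretrainOnly = config.isPretrainParam("W");

        System.out.println(layerName + ": updater=" + updater
                + ", regularization=" + regularization
                + ", pretrainOnly=" + pretrainOnly);
    }
}
```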
public String getLayerName()

Specified by:
getLayerName in interface TrainingConfig
public List<Regularization> getRegularizationByParam(String paramName)

Specified by:
getRegularizationByParam in interface TrainingConfig
Parameters:
paramName - Parameter name ("W", "b" etc)
public boolean isPretrainParam(String paramName)

Specified by:
isPretrainParam in interface TrainingConfig
Parameters:
paramName - Parameter name/key
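The pretrain/backprop distinction this method draws is what callers typically use it for: when iterating a model's parameters during supervised training, pretrain-only parameters (such as an autoencoder's visible bias) should be skipped. A sketch of that filtering, where paramTable is a hypothetical caller-supplied map from parameter names to arrays, not part of this API:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

import org.deeplearning4j.nn.api.TrainingConfig;
import org.nd4j.linalg.api.ndarray.INDArray;

public final class PretrainParamFilter {

    // Return the parameter names that participate in supervised backprop,
    // i.e. those for which isPretrainParam(...) is false.
    public static List<String> backpropParams(TrainingConfig config,
                                              Map<String, INDArray> paramTable) {
        List<String> result = new ArrayList<>();
        for (String paramName : paramTable.keySet()) {
            if (!config.isPretrainParam(paramName)) {
                result.add(paramName);
            }
        }
        return result;
    }
}
```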
public IUpdater getUpdaterByParam(String paramName)

Specified by:
getUpdaterByParam in interface TrainingConfig
Parameters:
paramName - Parameter name
public GradientNormalization getGradientNormalization()

Specified by:
getGradientNormalization in interface TrainingConfig
public double getGradientNormalizationThreshold()

Specified by:
getGradientNormalizationThreshold in interface TrainingConfig
public void setDataType(org.nd4j.linalg.api.buffer.DataType dataType)

Specified by:
setDataType in interface TrainingConfig
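For completeness, a short sketch of the setter, assuming the standard ND4J DataType constants:

```java
import org.nd4j.linalg.api.buffer.DataType;

public class SetDataTypeExample {
    public static void main(String[] args) {
        DummyConfig config = new DummyConfig();
        // Switch the configuration to single-precision floats.
        config.setDataType(DataType.FLOAT);
    }
}
```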