Modifier and Type | Field and Description |
---|---|
`protected Distribution` | `NeuralNetConfiguration.Builder.dist` |
Modifier and Type | Method and Description |
---|---|
`NeuralNetConfiguration.Builder` | `NeuralNetConfiguration.Builder.dist(Distribution dist)` - Distribution to sample initial weights from. |
`NeuralNetConfiguration.Builder` | `NeuralNetConfiguration.Builder.weightInit(Distribution distribution)` - Set weight initialization scheme to random sampling via the specified distribution. |
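As an illustration of the two builder methods above, here is a minimal sketch of distribution-based weight initialization via `NeuralNetConfiguration.Builder`; the surrounding network (layer sizes, activations, loss function) is assumed purely for illustration.

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.distribution.NormalDistribution;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class DistributionWeightInitSketch {
    public static void main(String[] args) {
        // weightInit(Distribution) makes sampling from the given distribution the
        // default initialization for all layers that do not override it.
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .weightInit(new NormalDistribution(0.0, 0.01)) // mean 0.0, stdev 0.01
                .list()
                .layer(0, new DenseLayer.Builder().nIn(784).nOut(256)
                        .activation(Activation.RELU).build())
                .layer(1, new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .nIn(256).nOut(10).activation(Activation.SOFTMAX).build())
                .build();
        System.out.println(conf.toJson());
    }
}
```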
Modifier and Type | Class and Description |
---|---|
`class` | `BinomialDistribution` - A binomial distribution, with two parameters: number of trials and probability of success. |
`class` | `ConstantDistribution` - Constant distribution: a "distribution" where all values are set to the specified constant. |
`class` | `GaussianDistribution` - Deprecated. Use `NormalDistribution`, which is identical to this implementation. |
`class` | `LogNormalDistribution` - A log-normal distribution, with two parameters: mean and standard deviation. |
`class` | `NormalDistribution` - A normal (Gaussian) distribution, with two parameters: mean and standard deviation. |
`class` | `OrthogonalDistribution` - Orthogonal distribution, with a gain parameter. See http://arxiv.org/abs/1312.6120 for details. |
`class` | `TruncatedNormalDistribution` - A truncated normal distribution, with two parameters: mean and standard deviation. This is a standard normal/Gaussian distribution, except that any values falling outside the range [mean - 2 * stdev, mean + 2 * stdev] are re-sampled. |
`class` | `UniformDistribution` - A uniform distribution, with two parameters: lower and upper, i.e. U(lower, upper). |
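The classes above live in the `org.deeplearning4j.nn.conf.distribution` package. The following is a brief sketch of constructing each one, assuming the constructor arguments follow the parameter order given in the descriptions above; the numeric values are illustrative only.

```java
import org.deeplearning4j.nn.conf.distribution.BinomialDistribution;
import org.deeplearning4j.nn.conf.distribution.ConstantDistribution;
import org.deeplearning4j.nn.conf.distribution.Distribution;
import org.deeplearning4j.nn.conf.distribution.LogNormalDistribution;
import org.deeplearning4j.nn.conf.distribution.NormalDistribution;
import org.deeplearning4j.nn.conf.distribution.OrthogonalDistribution;
import org.deeplearning4j.nn.conf.distribution.TruncatedNormalDistribution;
import org.deeplearning4j.nn.conf.distribution.UniformDistribution;

public class DistributionConstructionSketch {
    public static void main(String[] args) {
        Distribution[] examples = {
                new BinomialDistribution(1, 0.5),          // 1 trial, probability of success 0.5
                new ConstantDistribution(0.01),            // every sampled value is 0.01
                new NormalDistribution(0.0, 0.05),         // mean 0.0, stdev 0.05
                new LogNormalDistribution(0.0, 1.0),       // mean 0.0, stdev 1.0
                new TruncatedNormalDistribution(0.0, 1.0), // values beyond 2 stdev from the mean are re-sampled
                new OrthogonalDistribution(1.0),           // gain 1.0
                new UniformDistribution(-0.08, 0.08)       // U(-0.08, 0.08)
        };
        for (Distribution d : examples) {
            System.out.println(d);
        }
    }
}
```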
Modifier and Type | Method and Description |
---|---|
`Distribution` | `Distribution.clone()` |
Modifier and Type | Method and Description |
---|---|
`static Distribution` | `Distributions.createDistribution(Distribution dist)` |
Modifier and Type | Class and Description |
---|---|
`class` | `LegacyDistributionHelper` - A dummy helper "distribution" for deserializing distributions in legacy/different JSON format. |
Modifier and Type | Method and Description |
---|---|
`Distribution` | `LegacyDistributionDeserializer.deserialize(org.nd4j.shade.jackson.core.JsonParser jp, org.nd4j.shade.jackson.databind.DeserializationContext deserializationContext)` |
Modifier and Type | Field and Description |
---|---|
`protected Distribution` | `BaseLayer.dist` |
`protected Distribution` | `BaseLayer.Builder.dist` |
`protected Distribution` | `BaseRecurrentLayer.distRecurrent` |
`protected Distribution` | `BaseRecurrentLayer.Builder.distRecurrent` |
Modifier and Type | Method and Description |
---|---|
`T` | `BaseLayer.Builder.dist(Distribution dist)` - Distribution to sample initial weights from. |
`static void` | `LayerValidation.generalValidation(String layerName, Layer layer, IDropout iDropout, double l2, double l2Bias, double l1, double l1Bias, Distribution dist, List<LayerConstraint> allParamConstraints, List<LayerConstraint> weightConstraints, List<LayerConstraint> biasConstraints)` |
`static void` | `LayerValidation.generalValidation(String layerName, Layer layer, IDropout iDropOut, Double l2, Double l2Bias, Double l1, Double l1Bias, Distribution dist, List<LayerConstraint> allParamConstraints, List<LayerConstraint> weightConstraints, List<LayerConstraint> biasConstraints)` |
`T` | `BaseLayer.Builder.weightInit(Distribution distribution)` - Set weight initialization scheme to random sampling via the specified distribution. |
`T` | `BaseRecurrentLayer.Builder.weightInitRecurrent(Distribution dist)` - Set the weight initialization for the recurrent weights, based on the specified distribution. |
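As a sketch of how the layer-level builder methods above combine, the following assumes `LSTM` (a `BaseRecurrentLayer` subclass), so that `weightInit(Distribution)` covers the input-to-hidden weights while `weightInitRecurrent(Distribution)` overrides only the recurrent weights; the layer sizes are illustrative.

```java
import org.deeplearning4j.nn.conf.distribution.NormalDistribution;
import org.deeplearning4j.nn.conf.distribution.OrthogonalDistribution;
import org.deeplearning4j.nn.conf.layers.LSTM;

public class RecurrentWeightInitSketch {
    public static void main(String[] args) {
        LSTM lstm = new LSTM.Builder()
                .nIn(128)
                .nOut(256)
                // Input-to-hidden weights: sampled from N(0, 0.05)
                .weightInit(new NormalDistribution(0.0, 0.05))
                // Recurrent (hidden-to-hidden) weights: orthogonal initialization, gain 1.0
                .weightInitRecurrent(new OrthogonalDistribution(1.0))
                .build();
        System.out.println(lstm);
    }
}
```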
Constructor and Description |
---|
`WeightNoise(Distribution distribution)` |
`WeightNoise(Distribution distribution, boolean additive)` |
`WeightNoise(Distribution distribution, boolean applyToBias, boolean additive)` |
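Below is a minimal sketch of wiring `WeightNoise` into a configuration, assuming the three-argument constructor order shown above (distribution, applyToBias, additive) and assuming the configuration builder exposes a `weightNoise(...)` setter.

```java
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.distribution.NormalDistribution;
import org.deeplearning4j.nn.conf.weightnoise.WeightNoise;

public class WeightNoiseSketch {
    public static void main(String[] args) {
        // Additive Gaussian weight noise: noise sampled from N(0, 0.01) is added to the
        // weights (additive = true); applyToBias = false leaves bias parameters untouched.
        WeightNoise noise = new WeightNoise(new NormalDistribution(0.0, 0.01), false, true);

        NeuralNetConfiguration.Builder builder = new NeuralNetConfiguration.Builder()
                .weightNoise(noise);
    }
}
```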
Modifier and Type | Field and Description |
---|---|
`protected Distribution` | `FineTuneConfiguration.dist` |
Modifier and Type | Method and Description |
---|---|
`FineTuneConfiguration.Builder` | `FineTuneConfiguration.Builder.dist(Distribution dist)` - Distribution to sample initial weights from. |
`TransferLearning.Builder` | `TransferLearning.Builder.nInReplace(int layerNum, int nIn, WeightInit scheme, Distribution dist)` - Modify the architecture of a vertex layer by changing nIn of the specified layer. Note that only the specified layer is modified; all other layers are left unchanged by this call. |
`TransferLearning.GraphBuilder` | `TransferLearning.GraphBuilder.nInReplace(String layerName, int nIn, WeightInit scheme, Distribution dist)` - Modify the architecture of a vertex layer by changing nIn of the specified layer. Note that only the specified layer is modified; all other layers are left unchanged by this call. |
`TransferLearning.Builder` | `TransferLearning.Builder.nOutReplace(int layerNum, int nOut, Distribution dist)` - Modify the architecture of a layer by changing nOut. Note that this also affects the layer that follows the specified layer, unless it is the output layer. |
`TransferLearning.Builder` | `TransferLearning.Builder.nOutReplace(int layerNum, int nOut, Distribution dist, Distribution distNext)` - Modify the architecture of a layer by changing nOut. Note that this also affects the layer that follows the specified layer, unless it is the output layer. Different weight initialization schemes can be specified for the specified layer and the layer that follows it. |
`TransferLearning.Builder` | `TransferLearning.Builder.nOutReplace(int layerNum, int nOut, Distribution dist, WeightInit schemeNext)` - Modify the architecture of a layer by changing nOut. Note that this also affects the layer that follows the specified layer, unless it is the output layer. Different weight initialization schemes can be specified for the specified layer and the layer that follows it. |
`TransferLearning.Builder` | `TransferLearning.Builder.nOutReplace(int layerNum, int nOut, WeightInit scheme, Distribution distNext)` - Modify the architecture of a layer by changing nOut. Note that this also affects the layer that follows the specified layer, unless it is the output layer. Different weight initialization schemes can be specified for the specified layer and the layer that follows it. |
`TransferLearning.GraphBuilder` | `TransferLearning.GraphBuilder.nOutReplace(String layerName, int nOut, Distribution dist)` - Modify the architecture of a vertex layer by changing nOut. Note that this also affects the vertex layer that follows the specified layer, unless it is the output layer. Currently does not support modifying nOut of layers that feed into non-layer vertices such as merge or subset; to modify nOut for such vertices, remove the vertex and then add a new one. Different weight initialization schemes can be specified for the specified layer and the layer that follows it. |
`TransferLearning.GraphBuilder` | `TransferLearning.GraphBuilder.nOutReplace(String layerName, int nOut, Distribution dist, Distribution distNext)` - Modify nOut of the specified layer. |
`TransferLearning.GraphBuilder` | `TransferLearning.GraphBuilder.nOutReplace(String layerName, int nOut, Distribution dist, WeightInit scheme)` |
`TransferLearning.GraphBuilder` | `TransferLearning.GraphBuilder.nOutReplace(String layerName, int nOut, WeightInit scheme, Distribution dist)` |
`FineTuneConfiguration.Builder` | `FineTuneConfiguration.Builder.weightInit(Distribution distribution)` - Set weight initialization scheme to random sampling via the specified distribution. Equivalent to: `.weightInit(WeightInit.DISTRIBUTION).dist(distribution)`. |
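Putting the transfer-learning methods above together, here is a minimal sketch that assumes an existing pretrained `MultiLayerNetwork` with at least three layers (so that layer index 2 exists); the target sizes and distributions are illustrative only.

```java
import org.deeplearning4j.nn.conf.distribution.NormalDistribution;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.deeplearning4j.nn.transferlearning.FineTuneConfiguration;
import org.deeplearning4j.nn.transferlearning.TransferLearning;

public class TransferLearningDistributionSketch {

    /** Returns a copy of the pretrained network with layer 2 replaced to have 10 outputs. */
    public static MultiLayerNetwork adapt(MultiLayerNetwork pretrained) {
        FineTuneConfiguration fineTune = new FineTuneConfiguration.Builder()
                // Equivalent to .weightInit(WeightInit.DISTRIBUTION).dist(...)
                .weightInit(new NormalDistribution(0.0, 0.01))
                .build();

        return new TransferLearning.Builder(pretrained)
                .fineTuneConfiguration(fineTune)
                // nOutReplace with a Distribution: new weights for layer 2 (and the layer
                // that follows it, unless it is the output layer) are sampled from N(0, 0.1).
                .nOutReplace(2, 10, new NormalDistribution(0.0, 0.1))
                .build();
    }
}
```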