Modifier and Type | Method and Description
---|---
NeuralNetConfiguration.Builder | NeuralNetConfiguration.Builder.dist(Distribution dist) Deprecated.
NeuralNetConfiguration.Builder | NeuralNetConfiguration.Builder.weightInit(Distribution distribution) Set weight initialization scheme to random sampling via the specified distribution.
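The non-deprecated path above can be illustrated with a network-wide, distribution-based weight initialization. This is a minimal configuration sketch, assuming a DL4J 1.0.0-beta-era API with the usual deeplearning4j-core and nd4j dependencies on the classpath:

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.distribution.NormalDistribution;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class DistributionInitExample {
    public static void main(String[] args) {
        // weightInit(Distribution) replaces the deprecated dist(Distribution):
        // all layer weights are sampled from N(0, 0.01) unless overridden per layer.
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .weightInit(new NormalDistribution(0.0, 0.01))
                .list()
                .layer(new DenseLayer.Builder().nIn(784).nOut(128)
                        .activation(Activation.RELU).build())
                .layer(new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .nIn(128).nOut(10).activation(Activation.SOFTMAX).build())
                .build();
        System.out.println(conf.toJson());
    }
}
```

The layer sizes and activation functions here are placeholders; the relevant part is the single `weightInit(new NormalDistribution(...))` call on the builder.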
Modifier and Type | Class and Description
---|---
class | BinomialDistribution
class | ConstantDistribution
class | GaussianDistribution Deprecated.
class | LogNormalDistribution A log-normal distribution, with two parameters: mean and standard deviation.
class | NormalDistribution A normal (Gaussian) distribution, with two parameters: mean and standard deviation.
class | OrthogonalDistribution Orthogonal distribution, with gain parameter. See https://arxiv.org/abs/1312.6120 for details.
class | TruncatedNormalDistribution
class | UniformDistribution A uniform distribution, with two parameters: lower and upper, i.e. U(lower, upper).
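The concrete subclasses above are constructed directly from their parameters. A minimal sketch, assuming the constructor signatures match the parameter descriptions in the table (the specific argument values are arbitrary examples):

```java
import org.deeplearning4j.nn.conf.distribution.BinomialDistribution;
import org.deeplearning4j.nn.conf.distribution.ConstantDistribution;
import org.deeplearning4j.nn.conf.distribution.Distribution;
import org.deeplearning4j.nn.conf.distribution.LogNormalDistribution;
import org.deeplearning4j.nn.conf.distribution.NormalDistribution;
import org.deeplearning4j.nn.conf.distribution.OrthogonalDistribution;
import org.deeplearning4j.nn.conf.distribution.TruncatedNormalDistribution;
import org.deeplearning4j.nn.conf.distribution.UniformDistribution;

public class DistributionZoo {
    public static void main(String[] args) {
        // Each subclass is parameterized as described in the class table:
        Distribution normal     = new NormalDistribution(0.0, 1.0);          // mean, std
        Distribution logNormal  = new LogNormalDistribution(0.0, 1.0);       // mean, std
        Distribution truncated  = new TruncatedNormalDistribution(0.0, 1.0); // mean, std
        Distribution uniform    = new UniformDistribution(-0.5, 0.5);        // lower, upper -> U(-0.5, 0.5)
        Distribution constant   = new ConstantDistribution(0.1);             // fixed value
        Distribution orthogonal = new OrthogonalDistribution(1.0);           // gain
        Distribution binomial   = new BinomialDistribution(10, 0.5);         // trials, probability
        System.out.println(normal + " " + uniform + " " + orthogonal);
    }
}
```

GaussianDistribution is omitted: per the table it is deprecated in favor of NormalDistribution.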
Modifier and Type | Method and Description
---|---
Distribution | Distribution.clone()
Modifier and Type | Method and Description
---|---
static Distribution | Distributions.createDistribution(Distribution dist)
Modifier and Type | Class and Description
---|---
class | LegacyDistributionHelper
Modifier and Type | Method and Description
---|---
Distribution | LegacyDistributionDeserializer.deserialize(org.nd4j.shade.jackson.core.JsonParser jp, org.nd4j.shade.jackson.databind.DeserializationContext deserializationContext)
Modifier and Type | Method and Description
---|---
T | BaseLayer.Builder.dist(Distribution dist) Deprecated.
T | BaseLayer.Builder.weightInit(Distribution distribution) Set weight initialization scheme to random sampling via the specified distribution.
T | BaseRecurrentLayer.Builder.weightInitRecurrent(Distribution dist) Set the weight initialization for the recurrent weights, based on the specified distribution.
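weightInitRecurrent lets the recurrent weights use a different distribution from the input weights. A sketch of a layer configuration, assuming a DL4J beta-era LSTM builder (the sizes and gain value are illustrative):

```java
import org.deeplearning4j.nn.conf.distribution.NormalDistribution;
import org.deeplearning4j.nn.conf.distribution.OrthogonalDistribution;
import org.deeplearning4j.nn.conf.layers.LSTM;
import org.nd4j.linalg.activations.Activation;

public class RecurrentInitExample {
    public static void main(String[] args) {
        // Input-to-hidden weights sampled from a Gaussian; hidden-to-hidden
        // (recurrent) weights use an orthogonal initialization, set separately
        // via weightInitRecurrent(Distribution).
        LSTM lstm = new LSTM.Builder()
                .nIn(64).nOut(128)
                .activation(Activation.TANH)
                .weightInit(new NormalDistribution(0.0, 0.05))
                .weightInitRecurrent(new OrthogonalDistribution(1.0))
                .build();
        System.out.println(lstm);
    }
}
```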
Constructor and Description
---
WeightNoise(Distribution distribution)
WeightNoise(Distribution distribution, boolean additive)
WeightNoise(Distribution distribution, boolean applyToBias, boolean additive)
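The three-argument constructor above gives full control over where and how the noise is applied. A configuration sketch, assuming WeightNoise lives in org.deeplearning4j.nn.conf.weightnoise and that BaseLayer.Builder exposes a weightNoise(...) setter, as in DL4J beta releases:

```java
import org.deeplearning4j.nn.conf.distribution.NormalDistribution;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.weightnoise.WeightNoise;

public class WeightNoiseExample {
    public static void main(String[] args) {
        // Additive Gaussian noise on the weights only (applyToBias = false,
        // additive = true), matching WeightNoise(distribution, applyToBias, additive).
        WeightNoise noise = new WeightNoise(new NormalDistribution(0.0, 0.01), false, true);
        DenseLayer layer = new DenseLayer.Builder()
                .nIn(100).nOut(50)
                .weightNoise(noise)
                .build();
        System.out.println(layer);
    }
}
```

With additive = false the sampled values multiply the weights instead of being added to them.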
Modifier and Type | Method and Description
---|---
FineTuneConfiguration.Builder | FineTuneConfiguration.Builder.dist(Distribution dist) Deprecated.
TransferLearning.Builder | TransferLearning.Builder.nInReplace(int layerNum, int nIn, WeightInit scheme, Distribution dist) Modify the architecture of a layer by changing its nIn. Note that only the specified layer will be modified; all other layers are unchanged by this call.
TransferLearning.GraphBuilder | TransferLearning.GraphBuilder.nInReplace(String layerName, int nIn, WeightInit scheme, Distribution dist) Modify the architecture of a vertex layer by changing its nIn. Note that only the specified layer will be modified; all other layers are unchanged by this call.
TransferLearning.Builder | TransferLearning.Builder.nOutReplace(int layerNum, int nOut, Distribution dist) Modify the architecture of a layer by changing its nOut. Note this will also affect the layer that follows the specified layer, unless it is the output layer.
TransferLearning.Builder | TransferLearning.Builder.nOutReplace(int layerNum, int nOut, Distribution dist, Distribution distNext) Modify the architecture of a layer by changing its nOut. Note this will also affect the layer that follows the specified layer, unless it is the output layer. Different weight init schemes can be specified for the specified layer and the layer that follows it.
TransferLearning.Builder | TransferLearning.Builder.nOutReplace(int layerNum, int nOut, Distribution dist, WeightInit schemeNext) Modify the architecture of a layer by changing its nOut. Note this will also affect the layer that follows the specified layer, unless it is the output layer. Different weight init schemes can be specified for the specified layer and the layer that follows it.
TransferLearning.Builder | TransferLearning.Builder.nOutReplace(int layerNum, int nOut, WeightInit scheme, Distribution distNext) Modify the architecture of a layer by changing its nOut. Note this will also affect the layer that follows the specified layer, unless it is the output layer. Different weight init schemes can be specified for the specified layer and the layer that follows it.
TransferLearning.GraphBuilder | TransferLearning.GraphBuilder.nOutReplace(String layerName, int nOut, Distribution dist) Modify the architecture of a vertex layer by changing its nOut. Note this will also affect the vertex layer that follows the specified layer, unless it is the output layer. Currently does not support modifying nOut of layers that feed into non-layer vertices such as merge, subset, etc.; to modify nOut for such vertices, remove the vertex and then add it back. Different weight init schemes can be specified for the specified layer and the layer that follows it.
TransferLearning.GraphBuilder | TransferLearning.GraphBuilder.nOutReplace(String layerName, int nOut, Distribution dist, Distribution distNext) Modify nOut of the specified layer.
TransferLearning.GraphBuilder | TransferLearning.GraphBuilder.nOutReplace(String layerName, int nOut, Distribution dist, WeightInit scheme)
TransferLearning.GraphBuilder | TransferLearning.GraphBuilder.nOutReplace(String layerName, int nOut, WeightInit scheme, Distribution dist)
FineTuneConfiguration.Builder | FineTuneConfiguration.Builder.weightInit(Distribution distribution) Set weight initialization scheme to random sampling via the specified distribution.
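The mixed-argument nOutReplace overloads let the replaced layer and the layer that follows it be re-initialized differently. A sketch under the assumption that a trained MultiLayerNetwork is available; the layer index, new nOut, and init choices are illustrative:

```java
import org.deeplearning4j.nn.conf.distribution.NormalDistribution;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.deeplearning4j.nn.transferlearning.TransferLearning;
import org.deeplearning4j.nn.weights.WeightInit;

public class TransferLearningExample {
    // Assumes 'pretrained' is an existing MultiLayerNetwork whose layer 2 feeds layer 3.
    public static MultiLayerNetwork widenLayer(MultiLayerNetwork pretrained) {
        // Change nOut of layer 2 to 256. The replaced layer's new weights are
        // sampled from N(0, 0.01); the nIn-adjusted weights of the following
        // layer are re-initialized with the XAVIER scheme instead.
        return new TransferLearning.Builder(pretrained)
                .nOutReplace(2, 256, new NormalDistribution(0.0, 0.01), WeightInit.XAVIER)
                .build();
    }
}
```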
Modifier and Type | Method and Description
---|---
IWeightInit | WeightInit.getWeightInitFunction(Distribution distribution) Create an instance of the weight initialization function.
Constructor and Description
---
WeightInitDistribution(Distribution distribution)
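Both entry points above produce an IWeightInit backed by a Distribution. A sketch, assuming the org.deeplearning4j.nn.weights package layout of DL4J beta releases, where WeightInit is an enum with a DISTRIBUTION constant:

```java
import org.deeplearning4j.nn.conf.distribution.UniformDistribution;
import org.deeplearning4j.nn.weights.IWeightInit;
import org.deeplearning4j.nn.weights.WeightInit;
import org.deeplearning4j.nn.weights.WeightInitDistribution;

public class WeightInitFunctionExample {
    public static void main(String[] args) {
        // Two routes to the same kind of IWeightInit: via the enum's factory
        // method, or by constructing WeightInitDistribution directly.
        IWeightInit viaEnum = WeightInit.DISTRIBUTION
                .getWeightInitFunction(new UniformDistribution(-0.1, 0.1));
        IWeightInit direct = new WeightInitDistribution(new UniformDistribution(-0.1, 0.1));
        System.out.println(viaEnum.getClass() + " / " + direct.getClass());
    }
}
```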
Copyright © 2022. All rights reserved.