Modifier and Type | Field and Description
---|---
protected IWeightInit | NeuralNetConfiguration.Builder.weightInitFn

Modifier and Type | Method and Description
---|---
NeuralNetConfiguration.Builder | NeuralNetConfiguration.Builder.weightInit(IWeightInit weightInit): Weight initialization scheme to use for initial weight values. Note: values set by this method are applied to all applicable layers in the network, unless a different value is explicitly set on a given layer.
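
For orientation, a minimal sketch of how the network-wide setting interacts with a per-layer override, using the standard DL4J builder API; the layer sizes, activations, and loss function are arbitrary illustration values:

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.weights.WeightInitRelu;
import org.deeplearning4j.nn.weights.WeightInitXavier;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class GlobalWeightInitExample {
    public static void main(String[] args) {
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                // Network-wide default, applied to all applicable layers
                .weightInit(new WeightInitXavier())
                .list()
                // The per-layer value set here overrides the network-wide default
                .layer(new DenseLayer.Builder().nIn(784).nOut(256)
                        .activation(Activation.RELU)
                        .weightInit(new WeightInitRelu())
                        .build())
                // No per-layer value: this layer uses the Xavier default
                .layer(new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .nIn(256).nOut(10)
                        .activation(Activation.SOFTMAX)
                        .build())
                .build();
        System.out.println(conf.toJson());
    }
}
```
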
Modifier and Type | Field and Description
---|---
protected IWeightInit | BaseLayer.weightInitFn
protected IWeightInit | BaseLayer.Builder.weightInitFn: Weight initialization scheme to use for initial weight values.
protected IWeightInit | BaseRecurrentLayer.weightInitFnRecurrent
protected IWeightInit | BaseRecurrentLayer.Builder.weightInitFnRecurrent: Weight initialization scheme for the recurrent weights.

Modifier and Type | Method and Description
---|---
void | EmbeddingSequenceLayer.Builder.setWeightInitFn(IWeightInit weightInit)
EmbeddingLayer.Builder | EmbeddingLayer.Builder.weightInit(IWeightInit weightInit)
EmbeddingSequenceLayer.Builder | EmbeddingSequenceLayer.Builder.weightInit(IWeightInit weightInit)
T | BaseLayer.Builder.weightInit(IWeightInit weightInit): Weight initialization scheme to use for initial weight values.
T | BaseRecurrentLayer.Builder.weightInitRecurrent(IWeightInit weightInit): Set the weight initialization scheme for the recurrent weights.
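
weightInitRecurrent lets the recurrent (hidden-to-hidden) weights use a different scheme than the input weights. A minimal sketch using LSTM, which extends BaseRecurrentLayer; the sizes and the choice of schemes are arbitrary illustration values:

```java
import org.deeplearning4j.nn.conf.layers.LSTM;
import org.deeplearning4j.nn.weights.WeightInitNormal;
import org.deeplearning4j.nn.weights.WeightInitXavier;

public class RecurrentInitExample {
    public static void main(String[] args) {
        LSTM lstm = new LSTM.Builder()
                .nIn(128).nOut(256)
                .weightInit(new WeightInitXavier())          // input-to-hidden weights
                .weightInitRecurrent(new WeightInitNormal()) // hidden-to-hidden weights
                .build();
        System.out.println(lstm);
    }
}
```
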
Modifier and Type | Field and Description
---|---
protected Map<String,IWeightInit> | SameDiffLayer.paramWeightInit
protected Map<String,IWeightInit> | SameDiffLayer.Builder.paramWeightInit

Modifier and Type | Method and Description
---|---
T | SameDiffLayer.Builder.weightInit(@NonNull String param, @NonNull IWeightInit weightInit)
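
A hedged sketch of the per-parameter overload. MyCustomLayer is hypothetical, not part of DL4J: it stands in for any concrete SameDiffLayer subclass, whose builder inherits weightInit(String, IWeightInit) from SameDiffLayer.Builder. The parameter names "W" and "b" are likewise assumptions:

```java
import org.deeplearning4j.nn.weights.WeightInitConstant;
import org.deeplearning4j.nn.weights.WeightInitXavier;

// MyCustomLayer is a HYPOTHETICAL SameDiffLayer subclass used only to
// illustrate the inherited builder call.
MyCustomLayer layer = new MyCustomLayer.Builder()
        .weightInit("W", new WeightInitXavier())      // scheme for parameter "W"
        .weightInit("b", new WeightInitConstant(0.0)) // constant-zero biases
        .build();
```
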
Modifier and Type | Method and Description
---|---
protected INDArray | ElementWiseParamInitializer.createWeightMatrix(long nIn, long nOut, IWeightInit weightInit, INDArray weightParamView, boolean initializeParameters)
protected INDArray | EmbeddingLayerParamInitializer.createWeightMatrix(long nIn, long nOut, IWeightInit weightInit, INDArray weightParamView, boolean initializeParameters)
protected INDArray | DefaultParamInitializer.createWeightMatrix(long nIn, long nOut, IWeightInit weightInit, INDArray weightParamView, boolean initializeParameters)
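
These param initializers consume an IWeightInit when filling a layer's weight matrix. A sketch of a custom scheme; the init signature shown matches the IWeightInit interface in recent DL4J releases, but verify it against the version you build against:

```java
import org.deeplearning4j.nn.weights.IWeightInit;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

// A minimal custom scheme: uniform values in [-0.01, 0.01].
public class SmallUniformInit implements IWeightInit {
    @Override
    public INDArray init(double fanIn, double fanOut, long[] shape, char order, INDArray paramView) {
        // Fill the flattened parameter view in place, then return it
        // reshaped to the layer's weight shape.
        paramView.assign(Nd4j.rand(paramView.shape()).muli(0.02).subi(0.01));
        return paramView.reshape(order, shape);
    }
}
```
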
Modifier and Type | Field and Description
---|---
protected IWeightInit | FineTuneConfiguration.weightInitFn

Modifier and Type | Method and Description
---|---
TransferLearning.Builder | TransferLearning.Builder.nInReplace(int layerNum, int nIn, IWeightInit scheme): Modify the architecture of a layer by changing the nIn of the specified layer. Note that only the specified layer is modified; all other layers are left unchanged by this call.
TransferLearning.GraphBuilder | TransferLearning.GraphBuilder.nInReplace(String layerName, int nIn, IWeightInit scheme): Modify the architecture of a vertex by changing the nIn of the specified layer. Note that only the specified layer is modified; all other layers are left unchanged by this call.
TransferLearning.Builder | TransferLearning.Builder.nOutReplace(int layerNum, int nOut, IWeightInit scheme, IWeightInit schemeNext): Modify the architecture of a layer by changing its nOut. Note that this also affects the layer that follows the specified layer, unless it is the output layer. Different weight init schemes can be specified for the specified layer and the layer that follows it.
FineTuneConfiguration.Builder | FineTuneConfiguration.Builder.weightInit(IWeightInit weightInit): Weight initialization scheme to use for initial weight values.
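
A sketch of transfer learning with explicit re-initialization, built only from the methods listed above; the layer index and new nOut are arbitrary illustration values, and pretrained is assumed to be an already-trained network:

```java
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.deeplearning4j.nn.transferlearning.FineTuneConfiguration;
import org.deeplearning4j.nn.transferlearning.TransferLearning;
import org.deeplearning4j.nn.weights.WeightInitXavier;

public class TransferInitExample {
    // 'pretrained' is assumed to be an already-trained MultiLayerNetwork.
    public static MultiLayerNetwork adaptOutputs(MultiLayerNetwork pretrained) {
        FineTuneConfiguration fineTune = new FineTuneConfiguration.Builder()
                .weightInit(new WeightInitXavier()) // default for fine-tuned layers
                .build();
        return new TransferLearning.Builder(pretrained)
                .fineTuneConfiguration(fineTune)
                // Change nOut of layer 2 to 10: layer 2 is re-initialized with
                // the first scheme, the layer that follows it with the second
                .nOutReplace(2, 10, new WeightInitXavier(), new WeightInitXavier())
                .build();
    }
}
```
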
Modifier and Type | Class and Description
---|---
class | WeightInitConstant
class | WeightInitDistribution
class | WeightInitIdentity
class | WeightInitLecunUniform: Uniform U[-a,a] with a = 3/sqrt(fanIn).
class | WeightInitNormal: Normal/Gaussian distribution, with mean 0 and standard deviation 1/sqrt(fanIn).
class | WeightInitRelu
class | WeightInitReluUniform
class | WeightInitSigmoidUniform
class | WeightInitUniform
class | WeightInitVarScalingNormalFanAvg
class | WeightInitVarScalingNormalFanIn
class | WeightInitVarScalingNormalFanOut
class | WeightInitVarScalingUniformFanAvg: Uniform U[-a,a] with a = 3/sqrt((fanIn + fanOut)/2).
class | WeightInitVarScalingUniformFanIn
class | WeightInitVarScalingUniformFanOut
class | WeightInitXavier
class | WeightInitXavierLegacy: Xavier weight init as implemented in DL4J up to version 0.6.0.
class | WeightInitXavierUniform: As per Glorot and Bengio (2010): uniform distribution U(-s,s) with s = sqrt(6/(fanIn + fanOut)).
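
Most of these classes can be constructed directly and passed anywhere an IWeightInit is accepted. A sketch of the distribution-backed variant; the mean and standard deviation are arbitrary illustration values:

```java
import org.deeplearning4j.nn.conf.distribution.NormalDistribution;
import org.deeplearning4j.nn.weights.IWeightInit;
import org.deeplearning4j.nn.weights.WeightInitDistribution;

public class DistributionInitExample {
    public static void main(String[] args) {
        // Draw initial weights from N(0, 0.05^2); NormalDistribution(mean, std)
        IWeightInit init = new WeightInitDistribution(new NormalDistribution(0.0, 0.05));
        System.out.println(init);
    }
}
```
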
Modifier and Type | Method and Description
---|---
IWeightInit | WeightInit.getWeightInitFunction(): Create an instance of the weight initialization function.
IWeightInit | WeightInit.getWeightInitFunction(Distribution distribution): Create an instance of the weight initialization function.
Modifier and Type | Class and Description
---|---
class | WeightInitEmbedding