Modifier and Type | Field and Description
---|---
protected IWeightInit | NeuralNetConfiguration.Builder.weightInitFn
Modifier and Type | Method and Description
---|---
NeuralNetConfiguration.Builder | NeuralNetConfiguration.Builder.weightInit(IWeightInit weightInit): Weight initialization scheme to use for initial weight values. Note: values set by this method will be applied to all applicable layers in the network, unless a different value is explicitly set on a given layer.
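For orientation, a minimal sketch of how this overload might be used; the class name, layer sizes, and topology below are illustrative, not part of the listed API. The per-layer `weightInit` call on the output layer shows the override behavior described above.

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.weights.WeightInitNormal;
import org.deeplearning4j.nn.weights.WeightInitXavier;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class GlobalWeightInitExample {
    public static void main(String[] args) {
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                // Network-wide default, applied to all applicable layers
                .weightInit(new WeightInitXavier())
                .list()
                // This layer inherits the Xavier default
                .layer(new DenseLayer.Builder().nIn(784).nOut(100)
                        .activation(Activation.RELU).build())
                // An explicit per-layer setting overrides the network default
                .layer(new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .nIn(100).nOut(10).activation(Activation.SOFTMAX)
                        .weightInit(new WeightInitNormal())
                        .build())
                .build();
        System.out.println(conf.toJson());
    }
}
```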
Modifier and Type | Field and Description
---|---
protected IWeightInit | BaseLayer.weightInitFn
protected IWeightInit | BaseLayer.Builder.weightInitFn: Weight initialization scheme to use for initial weight values.
protected IWeightInit | BaseRecurrentLayer.weightInitFnRecurrent
protected IWeightInit | BaseRecurrentLayer.Builder.weightInitFnRecurrent: Weight initialization scheme for the recurrent weights.
Modifier and Type | Method and Description
---|---
void | EmbeddingSequenceLayer.Builder.setWeightInitFn(IWeightInit weightInit)
EmbeddingSequenceLayer.Builder | EmbeddingSequenceLayer.Builder.weightInit(IWeightInit weightInit)
EmbeddingLayer.Builder | EmbeddingLayer.Builder.weightInit(IWeightInit weightInit)
T | BaseLayer.Builder.weightInit(IWeightInit weightInit): Weight initialization scheme to use for initial weight values.
T | BaseRecurrentLayer.Builder.weightInitRecurrent(IWeightInit weightInit): Set the weight initialization for the recurrent weights.
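A short sketch of configuring the input and recurrent weight initializations separately on a recurrent layer; the sizes and the choice of schemes are illustrative. If `weightInitRecurrent` is not set, the recurrent weights reuse the layer's input-weight scheme.

```java
import org.deeplearning4j.nn.conf.layers.LSTM;
import org.deeplearning4j.nn.weights.WeightInitNormal;
import org.deeplearning4j.nn.weights.WeightInitXavier;

public class RecurrentWeightInitExample {
    public static void main(String[] args) {
        LSTM lstm = new LSTM.Builder()
                .nIn(64)
                .nOut(128)
                // Initialization for the input weights
                .weightInit(new WeightInitXavier())
                // Separate initialization for the recurrent (hidden-to-hidden) weights
                .weightInitRecurrent(new WeightInitNormal())
                .build();
        System.out.println(lstm);
    }
}
```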
Modifier and Type | Field and Description
---|---
protected Map<String,IWeightInit> | SameDiffLayer.paramWeightInit
protected Map<String,IWeightInit> | SameDiffLayer.Builder.paramWeightInit
Modifier and Type | Method and Description
---|---
T | SameDiffLayer.Builder.weightInit(String param, IWeightInit weightInit)
Modifier and Type | Method and Description
---|---
protected INDArray | DefaultParamInitializer.createWeightMatrix(long nIn, long nOut, IWeightInit weightInit, INDArray weightParamView, boolean initializeParameters)
protected INDArray | ElementWiseParamInitializer.createWeightMatrix(long nIn, long nOut, IWeightInit weightInit, INDArray weightParamView, boolean initializeParameters)
Modifier and Type | Field and Description
---|---
protected IWeightInit | FineTuneConfiguration.weightInitFn
Modifier and Type | Method and Description
---|---
TransferLearning.Builder | TransferLearning.Builder.nInReplace(int layerNum, int nIn, IWeightInit scheme): Modify the architecture of a layer by changing its nIn. Note that only the specified layer is modified; all other layers are left unchanged by this call.
TransferLearning.GraphBuilder | TransferLearning.GraphBuilder.nInReplace(String layerName, int nIn, IWeightInit scheme): Modify the architecture of a layer vertex by changing its nIn. Note that only the specified layer is modified; all other layers are left unchanged by this call.
TransferLearning.Builder | TransferLearning.Builder.nOutReplace(int layerNum, int nOut, IWeightInit scheme, IWeightInit schemeNext): Modify the architecture of a layer by changing its nOut. Note this also affects the layer that follows the specified layer, unless it is the output layer. Different weight init schemes can be specified for the modified layer and the layer that follows it.
FineTuneConfiguration.Builder | FineTuneConfiguration.Builder.weightInit(IWeightInit weightInit): Weight initialization scheme to use for initial weight values.
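A hedged sketch of how nOutReplace and FineTuneConfiguration.weightInit might fit together; the `pretrained` network, the layer index, and the sizes are placeholders supplied by the caller, not part of the listed API.

```java
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.deeplearning4j.nn.transferlearning.FineTuneConfiguration;
import org.deeplearning4j.nn.transferlearning.TransferLearning;
import org.deeplearning4j.nn.weights.WeightInitXavier;
import org.nd4j.linalg.learning.config.Adam;

public class TransferWeightInitExample {
    // 'pretrained' is assumed to be a previously trained network with at
    // least two layers; the index and sizes below are illustrative.
    public static MultiLayerNetwork resizeOutput(MultiLayerNetwork pretrained) {
        FineTuneConfiguration ftc = new FineTuneConfiguration.Builder()
                .updater(new Adam(1e-4))
                .weightInit(new WeightInitXavier())
                .build();

        return new TransferLearning.Builder(pretrained)
                .fineTuneConfiguration(ftc)
                // Change nOut of layer 1 to 256. The modified weights of layer 1
                // use the first scheme; the now-mismatched input weights of the
                // following layer are re-initialized with the second scheme.
                .nOutReplace(1, 256, new WeightInitXavier(), new WeightInitXavier())
                .build();
    }
}
```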
Modifier and Type | Class and Description
---|---
class | WeightInitConstant: Initialize to a constant value (default 0).
class | WeightInitDistribution: Sample weights from a provided Distribution. Note that Distribution is not extendable, as it is interpreted through Distributions.createDistribution(Distribution).
class | WeightInitIdentity: Weights are set to an identity matrix.
class | WeightInitLecunUniform: Uniform U[-a,a] with a=3/sqrt(fanIn).
class | WeightInitNormal: Normal/Gaussian distribution with mean 0 and standard deviation 1/sqrt(fanIn).
class | WeightInitRelu: He et al. (2015): Gaussian distribution with mean 0 and variance 2.0/fanIn.
class | WeightInitReluUniform: He et al. (2015): Uniform distribution U(-s,s) with s = sqrt(6/fanIn).
class | WeightInitSigmoidUniform: A version of WeightInitXavierUniform for sigmoid activation functions.
class | WeightInitUniform: Uniform U[-a,a] with a=1/sqrt(fanIn).
class | WeightInitVarScalingNormalFanAvg: Truncated Gaussian distribution with mean 0, variance 1.0/((fanIn + fanOut)/2).
class | WeightInitVarScalingNormalFanIn: Gaussian distribution with mean 0, variance 1.0/fanIn. If a scale is provided, the variance is scale/fanIn instead.
class | WeightInitVarScalingNormalFanOut: Truncated normal distribution with mean 0, variance 1.0/fanOut. If a scale is provided, the variance is scale/fanOut instead.
class | WeightInitVarScalingUniformFanAvg: Uniform U[-a,a] with a=3.0/((fanIn + fanOut)/2).
class | WeightInitVarScalingUniformFanIn: Uniform U[-a,a] with a=3.0/fanIn. If a scale is provided, a = 3.0*scale/fanIn instead.
class | WeightInitVarScalingUniformFanOut: Uniform U[-a,a] with a=3.0/fanOut. If a scale is provided, a = 3.0*scale/fanOut instead.
class | WeightInitXavier: As per Glorot and Bengio 2010: Gaussian distribution with mean 0, variance 2.0/(fanIn + fanOut).
class | WeightInitXavierLegacy: Xavier weight initialization as implemented in DL4J up to version 0.6.0.
class | WeightInitXavierUniform: As per Glorot and Bengio 2010: Uniform distribution U(-s,s) with s = sqrt(6/(fanIn + fanOut)).
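Any of these classes can be passed directly to a layer or network builder. A brief sketch, assuming the no-arg and scale/value-accepting constructors implied by the descriptions above; the layer sizes are illustrative.

```java
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.weights.WeightInitConstant;
import org.deeplearning4j.nn.weights.WeightInitVarScalingNormalFanIn;

public class WeightInitClassesExample {
    public static void main(String[] args) {
        // Variance-scaling init with an explicit scale: variance = 2.0/fanIn
        DenseLayer hidden = new DenseLayer.Builder()
                .nIn(256).nOut(128)
                .weightInit(new WeightInitVarScalingNormalFanIn(2.0))
                .build();

        // Constant init: every weight starts at 0.01
        DenseLayer small = new DenseLayer.Builder()
                .nIn(128).nOut(64)
                .weightInit(new WeightInitConstant(0.01))
                .build();

        System.out.println(hidden + "\n" + small);
    }
}
```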
Modifier and Type | Method and Description
---|---
IWeightInit | WeightInit.getWeightInitFunction(): Create an instance of the weight initialization function.
IWeightInit | WeightInit.getWeightInitFunction(Distribution distribution): Create an instance of the weight initialization function.
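These methods bridge the legacy WeightInit enum to IWeightInit instances. A minimal sketch; the NormalDistribution parameters are illustrative.

```java
import org.deeplearning4j.nn.conf.distribution.NormalDistribution;
import org.deeplearning4j.nn.weights.IWeightInit;
import org.deeplearning4j.nn.weights.WeightInit;

public class WeightInitFunctionExample {
    public static void main(String[] args) {
        // Enum constant -> IWeightInit instance
        IWeightInit xavier = WeightInit.XAVIER.getWeightInitFunction();

        // The DISTRIBUTION variant needs a Distribution to sample from
        IWeightInit custom = WeightInit.DISTRIBUTION
                .getWeightInitFunction(new NormalDistribution(0.0, 0.05));

        System.out.println(xavier.getClass().getSimpleName());
        System.out.println(custom.getClass().getSimpleName());
    }
}
```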
Modifier and Type | Class and Description
---|---
class | WeightInitEmbedding: Weight initialization for initializing the parameters of an EmbeddingLayer from an EmbeddingInitializer. Note: WeightInitEmbedding supports both JSON-serializable and non-JSON-serializable initializations.
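A hedged sketch of loading pre-trained vectors into an EmbeddingLayer. It assumes ArrayEmbeddingInitializer (an EmbeddingInitializer backed by an in-memory INDArray) is available alongside WeightInitEmbedding; the random array stands in for real pre-trained vectors, and the vocabulary/embedding sizes are illustrative.

```java
import org.deeplearning4j.nn.conf.layers.EmbeddingLayer;
import org.deeplearning4j.nn.weights.embeddings.ArrayEmbeddingInitializer;
import org.deeplearning4j.nn.weights.embeddings.WeightInitEmbedding;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

public class EmbeddingWeightInitExample {
    public static void main(String[] args) {
        // Stand-in for pre-trained vectors: vocabulary 1000, embedding size 100
        INDArray vectors = Nd4j.rand(1000, 100);

        EmbeddingLayer embedding = new EmbeddingLayer.Builder()
                .nIn(1000).nOut(100)   // matches the initializer's shape
                .weightInit(new WeightInitEmbedding(
                        new ArrayEmbeddingInitializer(vectors)))
                .build();

        System.out.println(embedding);
    }
}
```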