Class Layer.Builder<T extends Layer.Builder<T>>
- java.lang.Object
  - org.deeplearning4j.nn.conf.layers.Layer.Builder<T>
-
- Direct Known Subclasses:
  AbstractSameDiffLayer.Builder, ActivationLayer.Builder, BaseLayer.Builder, BaseUpsamplingLayer.UpsamplingBuilder, Bidirectional.Builder, Cropping1D.Builder, Cropping2D.Builder, Cropping3D.Builder, FrozenLayer.Builder, GlobalPoolingLayer.Builder, LocalResponseNormalization.Builder, MaskZeroLayer.Builder, SpaceToBatchLayer.Builder, SpaceToDepthLayer.Builder, Subsampling3DLayer.BaseSubsamplingBuilder, SubsamplingLayer.BaseSubsamplingBuilder, Yolo2OutputLayer.Builder, ZeroPadding1DLayer.Builder, ZeroPadding3DLayer.Builder, ZeroPaddingLayer.Builder
- Enclosing class:
- Layer
public abstract static class Layer.Builder<T extends Layer.Builder<T>> extends Object
-
-
Field Summary
Fields
- protected List<LayerConstraint> allParamConstraints
- protected List<LayerConstraint> biasConstraints
- protected IDropout iDropout
- protected String layerName
- protected List<LayerConstraint> weightConstraints
-
Constructor Summary
Constructors
- Builder()
-
Method Summary
- abstract <E extends Layer> E build()
- T constrainAllParameters(LayerConstraint... constraints)
  Set constraints to be applied to this layer.
- T constrainBias(LayerConstraint... constraints)
  Set constraints to be applied to bias parameters of this layer.
- T constrainWeights(LayerConstraint... constraints)
  Set constraints to be applied to the weight parameters of this layer.
- T dropOut(double inputRetainProbability)
  Dropout probability.
- T dropOut(IDropout dropout)
  Set the dropout for all layers in this network.
- T name(String layerName)
  Set the layer name.
-
-
-
Field Detail
-
layerName
protected String layerName
-
allParamConstraints
protected List<LayerConstraint> allParamConstraints
-
weightConstraints
protected List<LayerConstraint> weightConstraints
-
biasConstraints
protected List<LayerConstraint> biasConstraints
-
iDropout
protected IDropout iDropout
-
-
Method Detail
-
name
public T name(String layerName)
Set the layer name. A distinct name makes it easier to differentiate between layers.
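As a minimal sketch (assuming deeplearning4j-nn is on the classpath), name(String) is inherited by every concrete layer builder, e.g. DenseLayer.Builder:

```java
import org.deeplearning4j.nn.conf.layers.DenseLayer;

public class NameExample {
    public static void main(String[] args) {
        // name(...) comes from Layer.Builder, which DenseLayer.Builder extends
        DenseLayer layer = new DenseLayer.Builder()
                .nIn(784)
                .nOut(256)
                .name("hidden1")
                .build();
        System.out.println(layer.getLayerName());
    }
}
```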
-
dropOut
public T dropOut(double inputRetainProbability)
Dropout probability. This is the probability of retaining each input activation value for a layer: dropOut(x) will keep an input activation with probability x, and set it to 0 with probability 1-x.
dropOut(0.0) is a special value / special case: when set to 0.0, dropout is disabled (not applied). Note that a dropout value of 1.0 is functionally equivalent to no dropout: i.e., 100% probability of retaining each input activation.
When useDropConnect(boolean) is set to true (false by default), this method sets the drop connect probability instead.
Note 1: Dropout is applied at training time only, and is automatically not applied at test time (for evaluation, etc.).
Note 2: This sets the probability per layer. Care should be taken when setting lower values for complex networks (too much information may be lost with aggressive (very low) dropout values).
Note 3: Frequently, dropout is not applied to (or has a higher retain probability for) input (first) layers, and is also often not applied to output layers. This needs to be handled MANUALLY by the user: set .dropOut(0) on those layers when using a global dropout setting.
Note 4: Implementation detail (most users can ignore): DL4J uses inverted dropout, as described here: http://cs231n.github.io/neural-networks-2/
- Parameters:
  inputRetainProbability - Dropout probability (probability of retaining each input activation value for a layer)
- See Also:
  dropOut(IDropout)
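A minimal usage sketch, assuming deeplearning4j-nn on the classpath and using DenseLayer as one concrete subclass:

```java
import org.deeplearning4j.nn.conf.layers.DenseLayer;

public class DropoutExample {
    public static void main(String[] args) {
        // Retain each input activation with probability 0.8
        // (i.e., drop with probability 0.2)
        DenseLayer layer = new DenseLayer.Builder()
                .nIn(784)
                .nOut(256)
                .dropOut(0.8)
                .build();
        // The double overload is stored internally as an IDropout instance
        System.out.println(layer.getIDropout());
    }
}
```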
-
dropOut
public T dropOut(IDropout dropout)
Set the dropout for all layers in this network.
- Parameters:
  dropout - Dropout, such as Dropout, GaussianDropout, GaussianNoise, etc.
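A sketch of passing IDropout implementations explicitly (assumes deeplearning4j-nn on the classpath; Dropout and GaussianNoise are two of the implementations named above):

```java
import org.deeplearning4j.nn.conf.dropout.Dropout;
import org.deeplearning4j.nn.conf.dropout.GaussianNoise;
import org.deeplearning4j.nn.conf.layers.DenseLayer;

public class IDropoutExample {
    public static void main(String[] args) {
        // Standard (inverted) dropout with retain probability 0.5
        DenseLayer a = new DenseLayer.Builder().nIn(100).nOut(50)
                .dropOut(new Dropout(0.5))
                .build();

        // Additive Gaussian noise instead of standard dropout
        DenseLayer b = new DenseLayer.Builder().nIn(100).nOut(50)
                .dropOut(new GaussianNoise(0.1))
                .build();

        System.out.println(a.getIDropout() + " / " + b.getIDropout());
    }
}
```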
-
constrainAllParameters
public T constrainAllParameters(LayerConstraint... constraints)
Set constraints to be applied to this layer. Default: no constraints.
Constraints can be used to enforce certain conditions (non-negativity of parameters, max-norm regularization, etc.). These constraints are applied at each iteration, after the parameters have been updated.
- Parameters:
  constraints - Constraints to apply to all parameters of this layer
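For example, a max-norm constraint on all parameters of a dense layer (a sketch assuming deeplearning4j-nn on the classpath; the dimension argument 1 is a typical choice for dense-layer weights, not something mandated by this API):

```java
import org.deeplearning4j.nn.conf.constraint.MaxNormConstraint;
import org.deeplearning4j.nn.conf.layers.DenseLayer;

public class ConstraintExample {
    public static void main(String[] args) {
        // After each parameter update, rescale any parameter vector
        // whose L2 norm exceeds 2.0 back down to norm 2.0
        DenseLayer layer = new DenseLayer.Builder()
                .nIn(100)
                .nOut(50)
                .constrainAllParameters(new MaxNormConstraint(2.0, 1))
                .build();
        System.out.println(layer.getConstraints());
    }
}
```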
-
constrainBias
public T constrainBias(LayerConstraint... constraints)
Set constraints to be applied to bias parameters of this layer. Default: no constraints.
Constraints can be used to enforce certain conditions (non-negativity of parameters, max-norm regularization, etc.). These constraints are applied at each iteration, after the parameters have been updated.
- Parameters:
  constraints - Constraints to apply to all bias parameters of this layer
-
constrainWeights
public T constrainWeights(LayerConstraint... constraints)
Set constraints to be applied to the weight parameters of this layer. Default: no constraints.
Constraints can be used to enforce certain conditions (non-negativity of parameters, max-norm regularization, etc.). These constraints are applied at each iteration, after the parameters have been updated.
- Parameters:
  constraints - Constraints to apply to all weight parameters of this layer
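Weight and bias constraints can also be set separately; a sketch assuming deeplearning4j-nn on the classpath, using two of the bundled LayerConstraint implementations:

```java
import org.deeplearning4j.nn.conf.constraint.NonNegativeConstraint;
import org.deeplearning4j.nn.conf.constraint.UnitNormConstraint;
import org.deeplearning4j.nn.conf.layers.DenseLayer;

public class WeightBiasConstraintExample {
    public static void main(String[] args) {
        DenseLayer layer = new DenseLayer.Builder()
                .nIn(100)
                .nOut(50)
                // Rescale each weight vector to unit L2 norm after each update
                .constrainWeights(new UnitNormConstraint(1))
                // Force bias parameters to remain non-negative
                .constrainBias(new NonNegativeConstraint())
                .build();
        System.out.println(layer);
    }
}
```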
-
build
public abstract <E extends Layer> E build()
-
-