public abstract static class Layer.Builder<T extends Layer.Builder<T>> extends Object
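The self-referential type parameter `T extends Layer.Builder<T>` is the classic self-typed (curiously recurring) builder pattern: each fluent setter returns the concrete subclass type, so chained calls never lose access to subclass-specific methods. A minimal standalone sketch of the pattern (class and method names here are illustrative, not DL4J's actual classes):

```java
import java.util.Objects;

// Minimal sketch of the self-typed builder pattern used by Layer.Builder.
abstract class LayerSketch {
    final String name;
    final double retainProbability;

    LayerSketch(Builder<?> b) {
        this.name = b.layerName;
        this.retainProbability = b.retainProbability;
    }

    // T is the concrete builder subclass, so fluent setters can return
    // `this` as T and chained calls keep the subclass type.
    abstract static class Builder<T extends Builder<T>> {
        String layerName;
        double retainProbability = 1.0;

        @SuppressWarnings("unchecked")
        T name(String layerName) {
            this.layerName = Objects.requireNonNull(layerName);
            return (T) this;
        }

        @SuppressWarnings("unchecked")
        T dropOut(double retainProbability) {
            this.retainProbability = retainProbability;
            return (T) this;
        }

        abstract LayerSketch build();
    }
}

class DenseSketch extends LayerSketch {
    final int nOut;

    DenseSketch(Builder b) {
        super(b);
        this.nOut = b.nOut;
    }

    static class Builder extends LayerSketch.Builder<Builder> {
        int nOut;

        // Subclass-specific setter; still chainable after base-class setters.
        Builder nOut(int nOut) {
            this.nOut = nOut;
            return this;
        }

        @Override
        DenseSketch build() {
            return new DenseSketch(this);
        }
    }
}
```

Because `name(...)` returns the subclass builder type, a chain such as `new DenseSketch.Builder().name("hidden").nOut(100).build()` compiles without casts; DL4J's concrete layer builders rely on the same trick.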
Modifier and Type | Field and Description |
---|---|
protected List<LayerConstraint> | allParamConstraints |
protected List<LayerConstraint> | biasConstraints |
protected IDropout | iDropout |
protected String | layerName |
protected List<LayerConstraint> | weightConstraints |
Constructor and Description |
---|
Builder() |
Modifier and Type | Method and Description |
---|---|
abstract <E extends Layer> E | build() |
T | constrainAllParameters(LayerConstraint... constraints) Set constraints to be applied to all parameters of this layer. |
T | constrainBias(LayerConstraint... constraints) Set constraints to be applied to the bias parameters of this layer. |
T | constrainWeights(LayerConstraint... constraints) Set constraints to be applied to the weight parameters of this layer. |
T | dropOut(double inputRetainProbability) Set the dropout probability for this layer (probability of retaining each input activation value). |
T | dropOut(IDropout dropout) Set the dropout for this layer. |
T | name(String layerName) Set the name of this layer. |
protected String layerName
protected List<LayerConstraint> allParamConstraints
protected List<LayerConstraint> weightConstraints
protected List<LayerConstraint> biasConstraints
protected IDropout iDropout
public T name(String layerName)
public T dropOut(double inputRetainProbability)
Note 1: Dropout is applied at training time only, and is automatically not applied at test time (for evaluation, etc).
Note 2: This sets the probability per layer. Care should be taken when setting lower values for complex networks (too much information may be lost with aggressive (very low) dropout values).
Note 3: Frequently, dropout is not applied to (or has a higher retain probability for) input (first) layers. Dropout is also often not applied to output layers. This needs to be handled MANUALLY by the user: set .dropOut(0) on those layers when using a global dropout setting.
Note 4: Implementation detail (most users can ignore): DL4J uses inverted dropout, as described here: http://cs231n.github.io/neural-networks-2/
inputRetainProbability - Dropout probability (probability of retaining each input activation value for a layer)
See also: dropOut(IDropout)
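The inverted dropout mentioned in Note 4 can be illustrated with a short standalone sketch (this mirrors the idea from the linked cs231n notes, not DL4J's actual implementation): at training time each retained activation is scaled by 1/p, so the expected activation already matches the unscaled value and no rescaling is needed at test time.

```java
// Sketch of inverted dropout: a mask is applied and surviving activations are
// scaled by 1/retainProbability at training time; the test-time forward pass
// is then just the identity.
final class InvertedDropoutSketch {
    static double[] applyTrain(double[] activations, boolean[] keepMask, double retainProbability) {
        double[] out = new double[activations.length];
        for (int i = 0; i < activations.length; i++) {
            // Dropped units emit 0; kept units are scaled up so the expected
            // value of each unit equals its unscaled activation.
            out[i] = keepMask[i] ? activations[i] / retainProbability : 0.0;
        }
        return out;
    }

    static double[] applyTest(double[] activations) {
        return activations.clone(); // no scaling needed with inverted dropout
    }
}
```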
public T dropOut(IDropout dropout)
dropout - Dropout, such as Dropout, GaussianDropout, GaussianNoise, etc.

public T constrainAllParameters(LayerConstraint... constraints)
constraints - Constraints to apply to all parameters of this layer

public T constrainBias(LayerConstraint... constraints)
constraints - Constraints to apply to all bias parameters of this layer

public T constrainWeights(LayerConstraint... constraints)
constraints - Constraints to apply to all weight parameters of this layer

public abstract <E extends Layer> E build()
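The constrain* methods attach LayerConstraint instances that restrict parameter values during training. As a rough standalone illustration of what such a constraint can do, here is a hypothetical max-norm rule (an illustrative sketch, not DL4J's actual constraint implementation): if the L2 norm of a weight vector exceeds a bound, the vector is rescaled so its norm equals that bound.

```java
// Hypothetical max-norm constraint sketch: rescale a weight vector whenever
// its L2 norm exceeds maxNorm, leaving it unchanged otherwise.
final class MaxNormSketch {
    static double[] apply(double[] weights, double maxNorm) {
        double sumSq = 0.0;
        for (double w : weights) sumSq += w * w;
        double norm = Math.sqrt(sumSq);
        if (norm <= maxNorm) return weights.clone(); // already within bound
        double scale = maxNorm / norm;
        double[] out = new double[weights.length];
        for (int i = 0; i < weights.length; i++) out[i] = weights[i] * scale;
        return out;
    }
}
```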
Copyright © 2018. All rights reserved.