Class Layer.Builder<T extends Layer.Builder<T>>

    • Constructor Detail

      • Builder

        public Builder()
    • Method Detail

      • name

        public T name​(String layerName)
        Sets the layer name. Assigning a name makes it easier to distinguish between layers in a network.
      • dropOut

        public T dropOut​(double inputRetainProbability)
        Dropout probability. This is the probability of retaining each input activation value for a layer. dropOut(x) will keep an input activation with probability x, and set to 0 with probability 1-x.
        dropOut(0.0) is a special value / special case - when set to 0.0, dropout is disabled (not applied). Note that a dropout value of 1.0 is functionally equivalent to no dropout: i.e., 100% probability of retaining each input activation.
        When useDropConnect(boolean) is set to true (false by default), this method sets the drop connect probability instead.

        Note 1: Dropout is applied at training time only - and is automatically not applied at test time (for evaluation, etc)
        Note 2: This sets the probability per-layer. Care should be taken when setting lower values for complex networks (too much information may be lost with aggressive (very low) dropout values).
        Note 3: Frequently, dropout is not applied to (or uses a higher retain probability for) the input (first) layer. Dropout is also often not applied to output layers. This needs to be handled MANUALLY by the user - set .dropOut(0.0) on those layers when using a global dropout setting.
        Note 4: Implementation detail (most users can ignore): DL4J uses inverted dropout, as described here: http://cs231n.github.io/neural-networks-2/

        Parameters:
        inputRetainProbability - Dropout probability (probability of retaining each input activation value for a layer)
        See Also:
        dropOut(IDropout)
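The inverted-dropout scheme referenced in Note 4 can be sketched in plain Java. This is an illustrative sketch only, not the DL4J implementation (which operates on NDArrays): each activation is retained with probability `retainProb` and scaled up by `1/retainProb` at training time, so no rescaling is needed at test time.

```java
import java.util.Random;

public class InvertedDropoutSketch {
    /**
     * Applies inverted dropout to a vector of activations.
     * Each value is kept with probability retainProb and scaled by
     * 1/retainProb; dropped values are set to 0. With retainProb = 1.0
     * the input passes through unchanged (equivalent to no dropout).
     */
    public static double[] apply(double[] activations, double retainProb, Random rng) {
        double[] out = new double[activations.length];
        for (int i = 0; i < activations.length; i++) {
            if (rng.nextDouble() < retainProb) {
                out[i] = activations[i] / retainProb; // keep and up-scale
            } else {
                out[i] = 0.0;                         // drop this activation
            }
        }
        return out;
    }

    public static void main(String[] args) {
        double[] acts = {1.0, 2.0, 3.0, 4.0};
        // retainProb = 1.0: functionally no dropout, values unchanged
        double[] kept = apply(acts, 1.0, new Random(42));
        for (double d : kept) System.out.print(d + " ");
        System.out.println();
    }
}
```

Note how the up-scaling at training time is what makes the "no change at test time" behavior in Note 1 possible.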
      • constrainAllParameters

        public T constrainAllParameters​(LayerConstraint... constraints)
        Set constraints to be applied to this layer. Default: no constraints.
        Constraints can be used to enforce certain conditions (non-negativity of parameters, max-norm regularization, etc). These constraints are applied at each iteration, after the parameters have been updated.
        Parameters:
        constraints - Constraints to apply to all parameters of this layer
      • constrainBias

        public T constrainBias​(LayerConstraint... constraints)
        Set constraints to be applied to bias parameters of this layer. Default: no constraints.
        Constraints can be used to enforce certain conditions (non-negativity of parameters, max-norm regularization, etc). These constraints are applied at each iteration, after the parameters have been updated.
        Parameters:
        constraints - Constraints to apply to all bias parameters of this layer
      • constrainWeights

        public T constrainWeights​(LayerConstraint... constraints)
        Set constraints to be applied to the weight parameters of this layer. Default: no constraints.
        Constraints can be used to enforce certain conditions (non-negativity of parameters, max-norm regularization, etc). These constraints are applied at each iteration, after the parameters have been updated.
        Parameters:
        constraints - Constraints to apply to all weight parameters of this layer
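The max-norm regularization mentioned above can be illustrated with a minimal plain-Java sketch. This is not DL4J's LayerConstraint API (DL4J provides concrete implementations operating on NDArrays); it only shows the idea of a constraint applied after a parameter update: if the L2 norm of a weight vector exceeds a maximum, rescale the vector back onto the norm ball.

```java
public class MaxNormSketch {
    /**
     * Applies a max-norm constraint: if the L2 norm of weights exceeds
     * maxNorm, the vector is rescaled so its norm equals maxNorm;
     * otherwise it is returned unchanged (as a copy).
     */
    public static double[] constrain(double[] weights, double maxNorm) {
        double sumSq = 0.0;
        for (double w : weights) sumSq += w * w;
        double norm = Math.sqrt(sumSq);
        if (norm <= maxNorm) {
            return weights.clone(); // within the limit: no change
        }
        double scale = maxNorm / norm; // shrink onto the norm ball
        double[] out = new double[weights.length];
        for (int i = 0; i < weights.length; i++) {
            out[i] = weights[i] * scale;
        }
        return out;
    }
}
```

In a training loop, such a constraint would run once per iteration, immediately after the optimizer updates the parameters, mirroring the "applied at each iteration, after the parameters have been updated" behavior described above.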
      • build

        public abstract <E extends Layer> E build()