Class LeakyReLU

java.lang.Object
smile.deep.activation.LeakyReLU
All Implemented Interfaces:
Serializable, ActivationFunction

public class LeakyReLU extends Object implements ActivationFunction
The leaky rectifier activation function max(x, ax), where 0 <= a < 1 and a = 0.01 by default. Unlike the standard rectifier, a leaky ReLU allows a small, positive gradient when the unit is not active. It is related to "maxout" networks.
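
For illustration, a minimal usage sketch based only on the constructor and methods documented below. It assumes that f updates its argument in place, which the void return type suggests but this page does not state explicitly:

    import smile.deep.activation.LeakyReLU;

    import java.util.Arrays;

    public class LeakyReLUForwardExample {
        public static void main(String[] args) {
            // Leaky parameter a = 0.01, the documented default.
            LeakyReLU relu = new LeakyReLU(0.01);
            System.out.println(relu.name());

            // Assumption: f overwrites x element-wise with max(x, a*x).
            double[] x = {-2.0, -0.5, 0.0, 0.5, 2.0};
            relu.f(x);

            // Positive entries pass through; negative entries are scaled by a:
            // [-0.02, -0.005, 0.0, 0.5, 2.0]
            System.out.println(Arrays.toString(x));
        }
    }
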
  • Constructor Details

    • LeakyReLU

      public LeakyReLU(double a)
      Constructor.
      Parameters:
      a - the leaky parameter, 0 <= a < 1.
  • Method Details

    • name

      public String name()
      Description copied from interface: ActivationFunction
      Returns the name of the activation function.
      Specified by:
      name in interface ActivationFunction
      Returns:
      the name of the activation function.
    • f

      public void f(double[] x)
      Description copied from interface: ActivationFunction
      The output function.
      Specified by:
      f in interface ActivationFunction
      Parameters:
      x - the input vector.
    • g

      public void g(double[] g, double[] y)
      Description copied from interface: ActivationFunction
      The gradient function.
      Specified by:
      g in interface ActivationFunction
      Parameters:
      g - the gradient vector. On input, it holds W' * g, where W and g are the weight matrix and the gradient of the upper layer, respectively. On output, it holds the gradient of this layer.
      y - the output vector.
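
For illustration, a minimal backpropagation sketch of the contract described above, using made-up values for the layer output and the incoming gradient. The in-place behavior of g follows the documentation; the in-place update of y by f is an assumption:

    import smile.deep.activation.LeakyReLU;

    import java.util.Arrays;

    public class LeakyReLUBackwardExample {
        public static void main(String[] args) {
            LeakyReLU relu = new LeakyReLU(0.01);

            // Forward pass: y now holds this layer's activation output.
            double[] y = {-1.5, 0.5, 2.0};
            relu.f(y);

            // Backward pass: on input, g holds W' * g from the upper layer;
            // after the call, g holds the gradient of this layer.
            double[] g = {0.3, -0.1, 0.5};
            relu.g(g, y);
            System.out.println(Arrays.toString(g));
        }
    }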