Package smile.deep.activation

Class LeakyReLU

java.lang.Object
    smile.deep.activation.LeakyReLU

All Implemented Interfaces:
    Serializable, ActivationFunction
The leaky rectifier activation function max(x, a*x), where 0 <= a < 1. By default, a = 0.01. Leaky ReLUs allow a small, positive gradient when the unit is not active. It has a relation to "maxout" networks.
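For illustration only, a minimal standalone sketch of the elementwise rule (not the library's own implementation):

    // Elementwise leaky rectifier: f(x) = max(x, a*x) with 0 <= a < 1.
    public class LeakyReLUFormulaDemo {
        public static void main(String[] args) {
            double a = 0.01; // the documented default slope
            double[] inputs = {-2.0, -0.5, 0.0, 1.5};
            for (double x : inputs) {
                // Negative inputs are scaled by a; non-negative inputs pass through unchanged.
                double y = Math.max(x, a * x);
                System.out.printf("f(%5.2f) = %7.4f%n", x, y);
            }
        }
    }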
Constructor Details

LeakyReLU

public LeakyReLU(double a)

Constructor.

Parameters:
    a - the leaky parameter, 0 <= a < 1.
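A hedged usage sketch; it assumes only the constructor above and the name() accessor described under Method Details:

    import smile.deep.activation.LeakyReLU;

    public class LeakyReLUConstructionDemo {
        public static void main(String[] args) {
            // a = 0.01 reproduces the documented default slope.
            LeakyReLU relu = new LeakyReLU(0.01);
            // name() is assumed to take no arguments, per the method details below.
            System.out.println(relu.name());
        }
    }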
Method Details

name

Description copied from interface: ActivationFunction
Returns the name of the activation function.

Specified by:
    name in interface ActivationFunction
Returns:
    the name of the activation function.
f

public void f(double[] x)

Description copied from interface: ActivationFunction
The output function.

Specified by:
    f in interface ActivationFunction
Parameters:
    x - the input vector.
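Continuing the construction sketch above, and assuming f transforms its argument in place (the method returns void and receives only the input vector):

    double[] x = {-2.0, -0.5, 0.0, 1.5};
    relu.f(x);
    // With a = 0.01, x is now approximately {-0.02, -0.005, 0.0, 1.5}.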
g

public void g(double[] g, double[] y)

Description copied from interface: ActivationFunction
The gradient function.

Specified by:
    g in interface ActivationFunction
Parameters:
    g - the gradient vector. On input, it holds W'*g, where W and g are the weight matrix and the gradient of the upper layer, respectively. On output, it is the gradient of this layer.
    y - the output vector.
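A sketch of the gradient call under the same assumptions; the in-place update and the exact masking rule are presumed from the parameter description, not confirmed here:

    double[] y    = {-0.02, 1.5};  // output of f for inputs {-2.0, 1.5}
    double[] grad = {1.0, 1.0};    // on input: W'*g from the upper layer
    relu.g(grad, y);
    // Presumably grad is overwritten in place: entries where y <= 0 are
    // scaled by a (0.01 here), entries where y > 0 are left unchanged.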