Package smile.deep.activation
Interface ActivationFunction
- All Superinterfaces:
Serializable
The activation function. An activation function defines how the weighted
sum of the input is transformed into an output from a node or nodes in
a layer of the network.
The choice of activation function has a large impact on the capability and performance of the neural network, and different activation functions may be used in different parts of the model, although typically the same activation function is used for all nodes within a layer.
There are many different types of activation functions, although only a small number are used in practice.
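For orientation, here is a minimal usage sketch based only on the signatures documented on this page. It assumes that f(double[] x) transforms the input vector in place, which is implied by the void return type but should be verified against the implementation you are using.

    import smile.deep.activation.ActivationFunction;

    public class ActivationDemo {
        public static void main(String[] args) {
            // Obtain an activation function from one of the static factories documented below.
            ActivationFunction relu = ActivationFunction.relu();

            // Forward pass: apply the activation to a vector of weighted sums.
            double[] x = {-2.0, -0.5, 0.0, 1.5};
            relu.f(x); // x now holds {0.0, 0.0, 0.0, 1.5}, i.e. max(0, x) element-wise

            // The name of the activation function (the exact string is implementation-defined).
            System.out.println(relu.name());
        }
    }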
-
Method Summary
Modifier and Type   Method                      Description
void                f(double[] x)               The output function.
void                g(double[] g, double[] y)   The gradient function.
static LeakyReLU    leaky()                     Returns the leaky rectifier activation function max(x, 0.01x).
static LeakyReLU    leaky(double a)             Returns the leaky rectifier activation function max(x, ax) where 0 <= a < 1.
String              name()                      Returns the name of the activation function.
static ReLU         relu()                      Returns the rectifier activation function max(0, x).
static Sigmoid      sigmoid()                   Returns the logistic sigmoid function sigmoid(v) = 1 / (1 + exp(-v)).
static Softmax      softmax()                   Returns the softmax activation function for the multi-class output layer.
static Tanh         tanh()                      Returns the hyperbolic tangent activation function.
-
Method Details
-
name
String name()
Returns the name of the activation function.
- Returns:
- the name of the activation function.
-
f
void f(double[] x)
The output function.
- Parameters:
- x - the input vector.
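If the original input must be preserved, a small wrapper (not part of this API, shown only as a sketch under the same in-place assumption as above) can apply any activation to a copy:

    // Hypothetical helper: applies an activation to a copy of the input.
    static double[] apply(ActivationFunction act, double[] x) {
        double[] y = x.clone();
        act.f(y); // assumed to transform y in place
        return y;
    }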
-
g
void g(double[] g, double[] y)
The gradient function.
- Parameters:
- g - the gradient vector. On input, it holds W'*g, where W and g are the weight matrix and the gradient of the upper layer, respectively. On output, it is the gradient of this layer.
- y - the output vector.
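A hedged sketch of where g fits in backpropagation, using the sigmoid as an example; the variable names delta and output are illustrative, not part of the API:

    ActivationFunction act = ActivationFunction.sigmoid();

    double[] output = {0.2, 0.7, 0.9};   // y: this layer's activations from the forward pass
    double[] delta  = {0.1, -0.3, 0.05}; // on input: W' * (gradient of the upper layer)

    act.g(delta, output);                // on output: the gradient of this layer
    // For the sigmoid, each component is expected to become delta[i] * output[i] * (1 - output[i]),
    // since the derivative sigmoid'(v) = sigmoid(v) * (1 - sigmoid(v)) can be written in terms of the output.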
-
relu
static ReLU relu()
Returns the rectifier activation function max(0, x).
- Returns:
- the rectifier activation function.
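A quick numeric illustration (a sketch; it assumes f mutates its argument in place, as discussed above):

    double[] x = {-3.0, 0.0, 2.5};
    ActivationFunction.relu().f(x); // x -> {0.0, 0.0, 2.5}, i.e. max(0, x) element-wise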
-
leaky
static LeakyReLU leaky()
Returns the leaky rectifier activation function max(x, 0.01x).
- Returns:
- the leaky rectifier activation function.
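For example, under the same in-place assumption, negative inputs are scaled by the default slope 0.01:

    double[] x = {-5.0, 5.0};
    ActivationFunction.leaky().f(x); // x -> {-0.05, 5.0}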
-
leaky
static LeakyReLU leaky(double a)
Returns the leaky rectifier activation function max(x, ax) where 0 <= a < 1.
- Parameters:
- a - the parameter of leaky ReLU.
- Returns:
- the leaky rectifier activation function.
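For example, with a = 0.1 (same in-place assumption as above):

    double[] x = {-4.0, 3.0};
    ActivationFunction.leaky(0.1).f(x); // x -> {-0.4, 3.0}: negative inputs are scaled by a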
-
sigmoid
static Sigmoid sigmoid()
Returns the logistic sigmoid function: sigmoid(v) = 1 / (1 + exp(-v)).
- Returns:
- the logistic sigmoid activation function.
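A quick numeric illustration under the same in-place assumption; the outputs lie in the open interval (0, 1):

    double[] x = {-2.0, 0.0, 2.0};
    ActivationFunction.sigmoid().f(x); // x -> approximately {0.119, 0.5, 0.881}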
-
tanh
static Tanh tanh()
Returns the hyperbolic tangent activation function. The tanh function is a rescaling of the logistic sigmoid such that its outputs range from -1 to 1.
- Returns:
- the hyperbolic tangent activation function.
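For example (same in-place assumption); note the identity tanh(v) = 2*sigmoid(2v) - 1, which is the rescaling mentioned above:

    double[] x = {-1.0, 0.0, 1.0};
    ActivationFunction.tanh().f(x); // x -> approximately {-0.762, 0.0, 0.762}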
-
softmax
static Softmax softmax()
Returns the softmax activation function for the multi-class output layer.
- Returns:
- the softmax activation function.
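A quick numeric illustration under the same in-place assumption; softmax(x)_i = exp(x_i) / sum_j exp(x_j), so the outputs are positive and sum to 1:

    double[] x = {1.0, 2.0, 3.0};
    ActivationFunction.softmax().f(x); // x -> approximately {0.090, 0.245, 0.665}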
-