Package smile.deep.optimizer
Class RMSProp

java.lang.Object
    smile.deep.optimizer.RMSProp

All Implemented Interfaces:
    Serializable, Optimizer
RMSProp optimizer with adaptive learning rate. RMSProp uses a moving
average of squared gradients to normalize the gradient. This
normalization balances the step size (momentum), decreasing the step
for large gradients to avoid exploding, and increasing the step for
small gradients to avoid vanishing.
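The moving-average normalization described above can be sketched in plain Java. This is an illustrative re-implementation of the update rule, not Smile's internal code; the class and method names here are hypothetical:

```java
import java.util.Locale;

/**
 * Minimal sketch of the RMSProp update rule (illustrative only, not
 * Smile's implementation). Minimizes f(w) = w^2 as a toy objective.
 */
public class RMSPropSketch {

    /** Runs n RMSProp steps minimizing f(w) = w^2, starting from w0. */
    static double minimize(double w0, int n) {
        double learningRate = 0.01;
        double rho = 0.9;       // discounting factor for the squared-gradient history
        double epsilon = 1e-7;  // small constant for numerical stability

        double w = w0;
        double v = 0.0;         // moving average of squared gradients

        for (int step = 0; step < n; step++) {
            double g = 2.0 * w;                 // gradient of f(w) = w^2
            v = rho * v + (1.0 - rho) * g * g;  // update the moving average
            // Normalize the step by the root of the averaged squared gradient.
            w -= learningRate * g / (Math.sqrt(v) + epsilon);
        }
        return w;
    }

    public static void main(String[] args) {
        System.out.println(String.format(Locale.ROOT, "w = %.4f", minimize(1.0, 100)));
    }
}
```

Because the step is divided by the root of the averaged squared gradient, each update has roughly constant magnitude near `learningRate`, regardless of the raw gradient scale.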
Constructor Details

RMSProp
public RMSProp()
Constructor.

RMSProp
public RMSProp(smile.math.TimeFunction learningRate)
Constructor.
Parameters:
    learningRate - the learning rate.

RMSProp
public RMSProp(smile.math.TimeFunction learningRate, double rho, double epsilon)
Constructor.
Parameters:
    learningRate - the learning rate.
    rho - the discounting factor for the history/coming gradient.
    epsilon - a small constant for numerical stability.
Method Details