Interface Regularization
- All Superinterfaces:
  Serializable
- All Known Implementing Classes:
  L1Regularization, L2Regularization, WeightDecay

public interface Regularization extends Serializable
Nested Class Summary

- static class Regularization.ApplyStep
  ApplyStep determines how the regularization interacts with the optimization process - i.e., when it is applied relative to updaters such as Adam, Nesterov momentum, SGD, etc.
Method Summary

- void apply(INDArray param, INDArray gradView, double lr, int iteration, int epoch)
  Apply the regularization by modifying the gradient array in-place.
- Regularization.ApplyStep applyStep()
- Regularization clone()
- double score(INDArray param, int iteration, int epoch)
  Calculate the loss function score component for the regularization.
  For example, in L2 regularization, this would return L = 0.5 * sum_i param[i]^2.
  For regularization types that don't have a score component, this method can return 0.
Method Detail
applyStep
Regularization.ApplyStep applyStep()
- Returns:
  The step at which the regularization should be applied, as defined by Regularization.ApplyStep
-
apply
void apply(INDArray param, INDArray gradView, double lr, int iteration, int epoch)
Apply the regularization by modifying the gradient array in-place.
- Parameters:
  param - Input array (usually parameters)
  gradView - Gradient view array (should be modified/updated). Same shape and type as the input array.
  lr - Current learning rate
  iteration - Current network training iteration
  epoch - Current network training epoch
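As a concrete illustration of the in-place contract, the sketch below applies an L2-style penalty to a gradient view. This is a simplified stand-in, not the library's implementation: plain double[] arrays replace INDArray, the coefficient name l2 is assumed for illustration, and lr/iteration/epoch are omitted since this particular penalty does not use them.

```java
// Simplified sketch of the apply(...) contract for an L2 penalty,
// L = 0.5 * l2 * sum_i param[i]^2, so dL/dparam[i] = l2 * param[i].
// Plain double[] arrays stand in for INDArray (assumption for illustration).
public class L2ApplySketch {
    static void apply(double[] param, double[] gradView, double l2) {
        // The contract: modify the gradient view in-place, same shape as param
        for (int i = 0; i < param.length; i++) {
            gradView[i] += l2 * param[i];
        }
    }

    public static void main(String[] args) {
        double[] param = {1.0, -2.0};
        double[] grad  = {0.1, 0.1};
        apply(param, grad, 0.01);
        // grad[i] is now the original gradient plus l2 * param[i]
        System.out.println(grad[0] + " " + grad[1]);
    }
}
```

Because the gradient is modified in-place before (or after, depending on applyStep()) the updater runs, no extra gradient buffer is allocated per parameter array.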
-
score
double score(INDArray param, int iteration, int epoch)
Calculate the loss function score component for the regularization.
For example, in L2 regularization, this would return L = 0.5 * sum_i param[i]^2.
For regularization types that don't have a score component, this method can return 0. However, note that this may make the regularization type not gradient checkable.
- Parameters:
  param - Input array (usually parameters)
  iteration - Current network training iteration
  epoch - Current network training epoch
- Returns:
  Loss function score component based on the input/parameters array
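The score/apply pair is what makes a regularization type gradient checkable: the analytic gradient contributed by apply(...) should match the numerical derivative of score(...). A minimal sketch of such a finite-difference check, again using plain double[] arrays as a stand-in for INDArray and the L2 score L = 0.5 * l2 * sum_i param[i]^2:

```java
// Finite-difference check that the gradient contributed by apply(...)
// matches the derivative of score(...), for a simple L2 penalty.
// Plain double[] arrays stand in for INDArray (assumption for illustration).
public class ScoreGradCheck {
    static double score(double[] param, double l2) {
        double sum = 0.0;
        for (double p : param) sum += p * p;
        return 0.5 * l2 * sum;          // L = 0.5 * l2 * sum_i param[i]^2
    }

    static double analyticGrad(double[] param, double l2, int i) {
        return l2 * param[i];           // dL/dparam[i] = l2 * param[i]
    }

    // Central-difference estimate of dL/dparam[i]
    static double numericGrad(double[] param, double l2, int i, double eps) {
        double orig = param[i];
        param[i] = orig + eps;
        double plus = score(param, l2);
        param[i] = orig - eps;
        double minus = score(param, l2);
        param[i] = orig;                // restore the parameter
        return (plus - minus) / (2 * eps);
    }

    public static void main(String[] args) {
        double[] param = {0.5, -1.5, 2.0};
        double l2 = 0.01, eps = 1e-6;
        for (int i = 0; i < param.length; i++) {
            System.out.printf("param[%d]: analytic=%.8f numeric=%.8f%n",
                    i, analyticGrad(param, l2, i), numericGrad(param, l2, i, eps));
        }
    }
}
```

This also shows why returning 0 from score(...) for a penalty that does modify gradients breaks gradient checking: the numerical derivative of a constant score is zero, while the analytic gradient is not.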
-
clone
Regularization clone()
- Returns:
  An independent copy of the regularization instance