Package org.nd4j.linalg.learning
Class AdaGradUpdater
- java.lang.Object
  - org.nd4j.linalg.learning.AdaGradUpdater

All Implemented Interfaces:
GradientUpdater<AdaGrad>
public class AdaGradUpdater extends Object implements GradientUpdater<AdaGrad>
-
-
Field Summary
Fields:
- static String GRAD_STATE
- INDArray historicalGradient
- protected double learningRate
- protected int numIterations
- int[] shape
-
Constructor Summary
Constructors:
- AdaGradUpdater(AdaGrad config)
-
Method Summary
All Methods / Instance Methods / Concrete Methods:
- void applyUpdater(INDArray gradient, int iteration, int epoch)
  Gets feature-specific learning rates. AdaGrad keeps a history of gradients being passed in.
- Map<String,INDArray> getState()
- void setState(Map<String,INDArray> stateMap, boolean initialize)
- void setStateViewArray(INDArray viewArray, long[] gradientShape, char gradientOrder, boolean initialize)
  For the internal updater state (if any): set this to use the provided array.
-
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
-
Methods inherited from interface org.nd4j.linalg.learning.GradientUpdater
getConfig
-
Field Detail
-
GRAD_STATE
public static final String GRAD_STATE
- See Also:
- Constant Field Values
-
historicalGradient
public INDArray historicalGradient
-
shape
public int[] shape
-
learningRate
protected double learningRate
-
numIterations
protected int numIterations
-
-
Constructor Detail
-
AdaGradUpdater
public AdaGradUpdater(AdaGrad config)
-
-
Method Detail
-
setState
public void setState(Map<String,INDArray> stateMap, boolean initialize)
- Specified by:
  setState in interface GradientUpdater<AdaGrad>
-
getState
public Map<String,INDArray> getState()
- Specified by:
  getState in interface GradientUpdater<AdaGrad>
-
setStateViewArray
public void setStateViewArray(INDArray viewArray, long[] gradientShape, char gradientOrder, boolean initialize)
Description copied from interface: GradientUpdater
For the internal updater state (if any): set this to use the provided array. Used during initialization, and when restoring the updater state (after serialization, for example).
- Specified by:
  setStateViewArray in interface GradientUpdater<AdaGrad>
- Parameters:
  viewArray - Array (that is a view of a larger array) to use for the state.
  initialize - If true: the updater must initialize the view array. If false: no change to view array contents.
-
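The view-array pattern described above can be sketched in plain Java. This is a hypothetical illustration of the technique, not the ND4J implementation: the updater's per-parameter state lives inside a slice of one larger flat array, so saving or restoring the whole model's updater state is a single array copy.

```java
import java.util.Arrays;

// Hypothetical sketch of the state-view pattern: updater state is a
// window into a larger flat array owned by the training loop.
public class StateViewSketch {
    static class Updater {
        double[] backing;   // the larger shared array
        int offset, length; // this updater's window within it

        // Analogue of setStateViewArray: adopt a region of the backing array.
        void setStateView(double[] backing, int offset, int length, boolean initialize) {
            this.backing = backing;
            this.offset = offset;
            this.length = length;
            if (initialize) {
                // Fresh training run: zero out this updater's slice.
                Arrays.fill(backing, offset, offset + length, 0.0);
            }
            // If initialize is false, existing contents are kept,
            // e.g. when restoring state after serialization.
        }

        double get(int i)           { return backing[offset + i]; }
        void   set(int i, double v) { backing[offset + i] = v; }
    }

    public static void main(String[] args) {
        double[] flatState = new double[10];   // shared by several updaters
        Updater u = new Updater();
        u.setStateView(flatState, 4, 3, true); // this updater owns slots 4..6
        u.set(0, 2.5);
        // The write lands directly in the backing array:
        System.out.println(flatState[4]);
    }
}
```

Because the state is only a view, no copy is made on `setStateView`; writes through the updater are immediately visible in the backing array.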
applyUpdater
public void applyUpdater(INDArray gradient, int iteration, int epoch)
Gets feature-specific learning rates. AdaGrad keeps a history of gradients being passed in. Note that each gradient passed in becomes adapted over time, hence the opName "adagrad".
- Specified by:
  applyUpdater in interface GradientUpdater<AdaGrad>
- Parameters:
  gradient - the gradient to get learning rates for
  iteration -
-
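The per-parameter adaptation applyUpdater performs can be sketched with plain arrays. This is a minimal illustration of the AdaGrad rule under common assumptions (the epsilon stability term is assumed), not the ND4J implementation: squared gradients are accumulated per parameter, and each parameter's effective learning rate shrinks as its accumulated history grows.

```java
import java.util.Locale;

// Minimal sketch of the AdaGrad update rule:
//   update_i = lr * g_i / (sqrt(sum of squared g_i so far) + eps)
public class AdaGradSketch {
    public static void main(String[] args) {
        double learningRate = 0.1;
        double epsilon = 1e-6;                       // stability term (assumed)
        double[] historicalGradient = new double[2]; // per-parameter sum of g^2
        double[] gradient = {1.0, 0.5};

        double[] update = new double[gradient.length];
        for (int i = 0; i < gradient.length; i++) {
            // Accumulate the squared gradient, then scale by the
            // per-parameter adaptive rate.
            historicalGradient[i] += gradient[i] * gradient[i];
            update[i] = learningRate * gradient[i]
                      / (Math.sqrt(historicalGradient[i]) + epsilon);
        }
        // On the very first step sqrt(g^2) == |g|, so every parameter's
        // update magnitude is approximately the base learning rate.
        System.out.printf(Locale.US, "%.4f %.4f%n", update[0], update[1]);
    }
}
```

After many iterations, `historicalGradient` for frequently large gradients grows quickly, shrinking their effective learning rate relative to rarely updated parameters; this is the "history of gradients" the description refers to.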