public class AdaGrad extends Object implements Serializable
Modifier and Type | Field and Description
---|---
org.jblas.DoubleMatrix | adjustedGradient
int | cols
double | fudgeFactor
org.jblas.DoubleMatrix | gradient
org.jblas.DoubleMatrix | historicalGradient
int | rows
Constructor and Description
---
AdaGrad(int rows, int cols)
AdaGrad(int rows, int cols, double gamma)
Modifier and Type | Method and Description
---|---
org.jblas.DoubleMatrix | getLearningRates(org.jblas.DoubleMatrix gradient) Gets feature-specific learning rates. AdaGrad keeps a history of the gradients passed in.
double | getMasterStepSize()
boolean | isDecayLr()
void | setDecayLr(boolean decayLr)
void | setMasterStepSize(double masterStepSize)
public org.jblas.DoubleMatrix historicalGradient
public org.jblas.DoubleMatrix adjustedGradient
public double fudgeFactor
public org.jblas.DoubleMatrix gradient
public int rows
public int cols
public AdaGrad(int rows, int cols, double gamma)
public AdaGrad(int rows, int cols)
public org.jblas.DoubleMatrix getLearningRates(org.jblas.DoubleMatrix gradient)
Parameters:
gradient - the gradient to get learning rates for
public double getMasterStepSize()
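The per-feature learning-rate computation can be sketched as follows. This is a minimal illustration of the standard AdaGrad rule that the fields above (historicalGradient, fudgeFactor, a master step size) suggest, not code taken from this class; plain double[] arrays stand in for org.jblas.DoubleMatrix, and the class and method names here are hypothetical.

```java
// Sketch of the standard AdaGrad per-feature learning-rate rule, assuming
// getLearningRates accumulates squared gradients in historicalGradient and
// scales the master step size by 1 / (fudgeFactor + sqrt(history)).
public class AdaGradSketch {
    private final double masterStepSize;        // base learning rate
    private final double fudgeFactor;           // small constant for numerical stability
    private final double[] historicalGradient;  // running sum of squared gradients

    public AdaGradSketch(int size, double masterStepSize, double fudgeFactor) {
        this.masterStepSize = masterStepSize;
        this.fudgeFactor = fudgeFactor;
        this.historicalGradient = new double[size];
    }

    // For each feature: accumulate the squared gradient, then shrink the
    // base step size by the square root of the accumulated history.
    public double[] getLearningRates(double[] gradient) {
        double[] rates = new double[gradient.length];
        for (int i = 0; i < gradient.length; i++) {
            historicalGradient[i] += gradient[i] * gradient[i];
            rates[i] = masterStepSize / (fudgeFactor + Math.sqrt(historicalGradient[i]));
        }
        return rates;
    }

    public static void main(String[] args) {
        AdaGradSketch ada = new AdaGradSketch(2, 0.1, 1e-6);
        double[] rates = ada.getLearningRates(new double[] {1.0, 0.5});
        // The feature with the larger accumulated gradient gets the smaller rate.
        System.out.println(rates[0] < rates[1]);
    }
}
```

The effect is that frequently or strongly updated features receive progressively smaller steps, while rarely updated features keep learning rates close to the master step size.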
public void setMasterStepSize(double masterStepSize)
public boolean isDecayLr()
public void setDecayLr(boolean decayLr)
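The decayLr flag presumably switches on decay of the master step size over iterations; the exact schedule this class uses is not documented here, so the 1/t decay below is purely an assumption, and the helper name is hypothetical.

```java
// Hypothetical illustration of a decayed master step size gated by a
// decayLr flag; the 1/t schedule is an assumption, not this class's API.
public class DecaySketch {
    static double decayedStepSize(double masterStepSize, boolean decayLr, int iteration) {
        if (!decayLr) {
            return masterStepSize;          // decay disabled: constant step size
        }
        return masterStepSize / (1.0 + iteration);  // simple 1/t decay
    }

    public static void main(String[] args) {
        System.out.println(decayedStepSize(0.1, true, 9));   // decayed
        System.out.println(decayedStepSize(0.1, false, 9));  // unchanged
    }
}
```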
Copyright © 2014. All Rights Reserved.