Package | Description |
---|---|
org.deeplearning4j.optimize.api | |
org.deeplearning4j.optimize.solvers | |
org.deeplearning4j.optimize.solvers.accumulation | |
org.deeplearning4j.optimize.stepfunctions | |
Modifier and Type | Method and Description |
---|---|
StepFunction | ConvexOptimizer.getStepFunction(): returns the StepFunction defined within this Optimizer instance. |
Modifier and Type | Field and Description |
---|---|
protected StepFunction | BaseOptimizer.stepFunction |
Modifier and Type | Method and Description |
---|---|
static StepFunction | BaseOptimizer.getDefaultStepFunctionForOptimizer(Class<? extends ConvexOptimizer> optimizerClass) |
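A hedged sketch of the dispatch pattern behind a method like `getDefaultStepFunctionForOptimizer`: choose a default StepFunction based on the optimizer class. The names mirror the API above, but the types, bodies, and the exact mapping are stand-ins, not the DL4J source.

```java
// Stand-in types; the real API's StepFunction and ConvexOptimizer live in
// org.deeplearning4j.optimize.api and .stepfunctions.
interface SketchStepFunction {}
class SketchNegativeGradientStep implements SketchStepFunction {}
class SketchNegativeDefaultStep implements SketchStepFunction {}
interface SketchOptimizer {}
class SketchSGD implements SketchOptimizer {}
class SketchLBFGS implements SketchOptimizer {}

public class DefaultStepFunctionSketch {
    // Assumed mapping for illustration: plain gradient descent steps straight
    // down the negative gradient; other optimizers fall back to a negated
    // default step.
    static SketchStepFunction forOptimizer(Class<? extends SketchOptimizer> c) {
        if (c == SketchSGD.class) {
            return new SketchNegativeGradientStep();
        }
        return new SketchNegativeDefaultStep();
    }
}
```

Keying the factory on `Class<? extends ConvexOptimizer>` (rather than an instance) lets callers ask for a sensible default before any optimizer has been constructed.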
Constructor and Description |
---|
BackTrackLineSearch(Model layer, StepFunction stepFunction, ConvexOptimizer optimizer) |
BaseOptimizer(NeuralNetConfiguration conf, StepFunction stepFunction, Collection<TrainingListener> trainingListeners, Model model) |
ConjugateGradient(NeuralNetConfiguration conf, StepFunction stepFunction, Collection<TrainingListener> trainingListeners, Model model) |
LBFGS(NeuralNetConfiguration conf, StepFunction stepFunction, Collection<TrainingListener> trainingListeners, Model model) |
LineGradientDescent(NeuralNetConfiguration conf, StepFunction stepFunction, Collection<TrainingListener> trainingListeners, Model model) |
StochasticGradientDescent(NeuralNetConfiguration conf, StepFunction stepFunction, Collection<TrainingListener> trainingListeners, Model model) |
Modifier and Type | Method and Description |
---|---|
void | GradientsAccumulator.applyUpdate(StepFunction function, INDArray params, INDArray updates): applies accumulated updates via the given StepFunction. |
void | BasicGradientsAccumulator.applyUpdate(StepFunction function, INDArray params, INDArray grad): applies accumulated updates via the given StepFunction. |
void | EncodedGradientsAccumulator.applyUpdate(StepFunction function, INDArray params, INDArray updates): applies accumulated updates via the given StepFunction. |
void | GradientsAccumulator.applyUpdate(StepFunction function, INDArray params, INDArray updates, double alpha): applies accumulated updates via the given StepFunction. |
void | BasicGradientsAccumulator.applyUpdate(StepFunction function, INDArray params, INDArray grad, double alpha): applies accumulated updates via the given StepFunction. |
void | EncodedGradientsAccumulator.applyUpdate(StepFunction function, INDArray params, INDArray updates, double alpha): applies accumulated updates via the given StepFunction. |
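A conceptual sketch of the `applyUpdate(function, params, updates, alpha)` contract, using plain `double[]` in place of `INDArray`. The names and behavior (a subtracting step scaled by `alpha`, draining the accumulator after applying) are illustrative assumptions, not the DL4J implementation.

```java
public class AccumulatorSketch {
    // A step function folds the accumulated updates into the parameters.
    interface Step {
        void step(double[] params, double[] updates, double alpha);
    }

    // Illustrative step: subtract, as a negative-gradient step would,
    // scaled by alpha.
    static final Step NEGATIVE_STEP = (params, updates, alpha) -> {
        for (int i = 0; i < params.length; i++) {
            params[i] -= alpha * updates[i];
        }
    };

    // applyUpdate: hand the accumulated updates to the step function,
    // then reset the accumulator (assumed drain-after-apply semantics).
    static void applyUpdate(Step fn, double[] params, double[] updates, double alpha) {
        fn.step(params, updates, alpha);
        java.util.Arrays.fill(updates, 0.0);
    }
}
```

Separating "how to step" (the StepFunction) from "what to step with" (the accumulated gradients) is what lets the same accumulator serve optimizers with different update rules.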
Modifier and Type | Class and Description |
---|---|
class | DefaultStepFunction: the default step function. |
class | GradientStepFunction: a normal gradient step function. |
class | NegativeDefaultStepFunction: the inverse (negated) default step function. |
class | NegativeGradientStepFunction: subtracts the line (search direction) from the parameters. |
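A minimal sketch contrasting the two gradient step variants listed above, on plain arrays standing in for `INDArray`. The assumed semantics: a gradient step adds the line (search direction) to the parameters, while the negative variant subtracts it.

```java
public class StepVariantsSketch {
    // GradientStepFunction-style step: params + line (assumed semantics).
    static double[] gradientStep(double[] params, double[] line) {
        double[] out = params.clone();
        for (int i = 0; i < out.length; i++) {
            out[i] += line[i];
        }
        return out;
    }

    // NegativeGradientStepFunction-style step: params - line, i.e. a descent
    // step when `line` is the gradient (assumed semantics).
    static double[] negativeGradientStep(double[] params, double[] line) {
        double[] out = params.clone();
        for (int i = 0; i < out.length; i++) {
            out[i] -= line[i];
        }
        return out;
    }
}
```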
Modifier and Type | Method and Description |
---|---|
static StepFunction | StepFunctions.createStepFunction(StepFunction stepFunction) |
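A hedged sketch of one common shape for a `createStepFunction`-style factory: normalize a possibly missing (null) configured step function to a concrete default. This illustrates the pattern only; it is not the DL4J implementation.

```java
public class StepFunctionFactorySketch {
    interface StepFunction {}
    static class DefaultStep implements StepFunction {}

    // Reuse the configured instance when present, otherwise fall back to a
    // default (assumed behavior for illustration).
    static StepFunction createStepFunction(StepFunction configured) {
        return configured != null ? configured : new DefaultStep();
    }
}
```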
Copyright © 2018. All rights reserved.