public interface ConvexOptimizer extends Serializable
Modifier and Type | Method and Description |
---|---|
int | batchSize(): The batch size for the optimizer |
boolean | checkTerminalConditions(org.nd4j.linalg.api.ndarray.INDArray gradient, double oldScore, double score, int iteration): Check termination conditions; set up a search state |
ComputationGraphUpdater | getComputationGraphUpdater() |
NeuralNetConfiguration | getConf() |
GradientsAccumulator | getGradientsAccumulator(): Returns the GradientsAccumulator instance used by this optimizer |
StepFunction | getStepFunction(): Returns the StepFunction defined within this optimizer instance |
Updater | getUpdater() |
org.nd4j.linalg.primitives.Pair<Gradient,Double> | gradientAndScore(LayerWorkspaceMgr workspaceMgr): The gradient and score for this optimizer |
boolean | optimize(LayerWorkspaceMgr workspaceMgr): Calls optimize |
void | postStep(org.nd4j.linalg.api.ndarray.INDArray line): After the step has been made, do an action |
void | preProcessLine(): Pre-process a line before an iteration |
double | score(): The score for the optimizer so far |
void | setBatchSize(int batchSize): Set the batch size for the optimizer |
void | setGradientsAccumulator(GradientsAccumulator accumulator): Specifies the GradientsAccumulator instance to be used for sharing updates across multiple models |
void | setListeners(Collection<TrainingListener> listeners) |
void | setUpdater(Updater updater) |
void | setUpdaterComputationGraph(ComputationGraphUpdater updater) |
void | setupSearchState(org.nd4j.linalg.primitives.Pair<Gradient,Double> pair): Based on the gradient and score, set up a search state |
void | updateGradientAccordingToParams(Gradient gradient, Model model, int batchSize, LayerWorkspaceMgr workspaceMgr): Update the gradient according to the configuration, such as AdaGrad, momentum, and sparsity |
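The methods above are usually exercised together during fitting. Below is a minimal, hypothetical sketch (not DL4J source) of how a caller might combine gradientAndScore, checkTerminalConditions, and optimize; the class name OptimizerDriverSketch, the loop structure, and the supplied ConvexOptimizer and LayerWorkspaceMgr instances are assumptions for illustration only.

```java
import org.deeplearning4j.nn.gradient.Gradient;
import org.deeplearning4j.nn.workspace.LayerWorkspaceMgr;
import org.deeplearning4j.optimize.api.ConvexOptimizer;
import org.nd4j.linalg.primitives.Pair;

public class OptimizerDriverSketch {

    /**
     * Runs up to maxIterations optimization steps, stopping early once the
     * optimizer reports that its termination conditions are met.
     * (Illustrative only; not the library's own training loop.)
     */
    public static void run(ConvexOptimizer optimizer, LayerWorkspaceMgr mgr, int maxIterations) {
        double oldScore = Double.MAX_VALUE;
        for (int i = 0; i < maxIterations; i++) {
            // Current gradient and score for the model being optimized
            Pair<Gradient, Double> gs = optimizer.gradientAndScore(mgr);
            double score = gs.getSecond();

            // Stop if the optimizer's termination conditions are satisfied
            if (optimizer.checkTerminalConditions(gs.getFirst().gradient(), oldScore, score, i)) {
                break;
            }

            // Take one optimization step (line search, parameter update, ...)
            optimizer.optimize(mgr);
            oldScore = score;
        }
    }
}
```

In DL4J implementations, optimize(LayerWorkspaceMgr) generally drives this cycle itself; the sketch only shows which interface methods participate and in what order a caller could reason about them.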
double score()
The score for the optimizer so far.

Updater getUpdater()

ComputationGraphUpdater getComputationGraphUpdater()

void setUpdater(Updater updater)

void setUpdaterComputationGraph(ComputationGraphUpdater updater)

void setListeners(Collection<TrainingListener> listeners)

void setGradientsAccumulator(GradientsAccumulator accumulator)
Specifies the GradientsAccumulator instance to be used for sharing updates across multiple models.
Parameters:
accumulator

StepFunction getStepFunction()
Returns the StepFunction defined within this optimizer instance.

GradientsAccumulator getGradientsAccumulator()
Returns the GradientsAccumulator instance used by this optimizer.

NeuralNetConfiguration getConf()

org.nd4j.linalg.primitives.Pair<Gradient,Double> gradientAndScore(LayerWorkspaceMgr workspaceMgr)
The gradient and score for this optimizer.

boolean optimize(LayerWorkspaceMgr workspaceMgr)
Calls optimize.

int batchSize()
The batch size for the optimizer.

void setBatchSize(int batchSize)
Set the batch size for the optimizer.
Parameters:
batchSize

void preProcessLine()
Pre-process a line before an iteration.

void postStep(org.nd4j.linalg.api.ndarray.INDArray line)
After the step has been made, do an action.
Parameters:
line

void setupSearchState(org.nd4j.linalg.primitives.Pair<Gradient,Double> pair)
Based on the gradient and score, set up a search state.
Parameters:
pair - the gradient and score

void updateGradientAccordingToParams(Gradient gradient, Model model, int batchSize, LayerWorkspaceMgr workspaceMgr)
Update the gradient according to the configuration, such as AdaGrad, momentum, and sparsity.
Parameters:
gradient - the gradient to modify
model - the model with the parameters to update
batchSize - batchSize for update

boolean checkTerminalConditions(org.nd4j.linalg.api.ndarray.INDArray gradient, double oldScore, double score, int iteration)
Check termination conditions; set up a search state.
Parameters:
gradient - layer gradients
iteration - what iteration the optimizer is on
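As a companion to the detail entries for setupSearchState, preProcessLine, and postStep, the following is a minimal, hypothetical sketch (again, not DL4J source; the class name SearchStateSketch and the surrounding method are assumptions) of the order in which those hooks are intended to be called around one search step.

```java
import org.deeplearning4j.nn.gradient.Gradient;
import org.deeplearning4j.optimize.api.ConvexOptimizer;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.primitives.Pair;

public class SearchStateSketch {

    /** Walks an optimizer through the setup/pre/post hooks around one search step. */
    public static void oneSearchPass(ConvexOptimizer optimizer,
                                     Pair<Gradient, Double> gradientAndScore,
                                     INDArray searchLine) {
        // Seed the optimizer's internal search state from the current gradient and score
        optimizer.setupSearchState(gradientAndScore);

        // Hook invoked before the iteration
        optimizer.preProcessLine();

        // ... a step along searchLine would be taken here ...

        // Hook invoked after the step has been made
        optimizer.postStep(searchLine);
    }
}
```

The movement along searchLine itself would normally be delegated to the StepFunction returned by getStepFunction(); the sketch only illustrates the ordering of the lifecycle hooks around that step.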