public class LBFGS extends BaseOptimizer

An optimizer implementing L-BFGS (limited-memory BFGS), which approximates the inverse Hessian from a short history of gradient and parameter updates.

Fields inherited from class BaseOptimizer:
computationGraphUpdater, conf, GRADIENT_KEY, iteration, iterationListeners, lineMaximizer, log, model, oldScore, PARAMS_KEY, score, SCORE_KEY, SEARCH_DIR, searchState, step, stepFunction, stepMax, terminationConditions, updater
Constructor Summary

| Constructor and Description |
|---|
| `LBFGS(NeuralNetConfiguration conf, StepFunction stepFunction, Collection<IterationListener> iterationListeners, Model model)` |
| `LBFGS(NeuralNetConfiguration conf, StepFunction stepFunction, Collection<IterationListener> iterationListeners, Collection<TerminationCondition> terminationConditions, Model model)` |
Method Summary

| Modifier and Type | Method and Description |
|---|---|
| `void` | `postStep(org.nd4j.linalg.api.ndarray.INDArray gradient)`: post-step hook that updates the search direction using new gradient and parameter information |
| `void` | `preProcessLine()`: pre-processing that sets up the initial search-direction approximation |
| `void` | `setupSearchState(Pair<Gradient,Double> pair)`: sets up the initial search state |
Methods inherited from class BaseOptimizer:
batchSize, checkTerminalConditions, getComputationGraphUpdater, getConf, getDefaultStepFunctionForOptimizer, getUpdater, gradientAndScore, optimize, postFirstStep, score, setBatchSize, setListeners, setUpdater, setUpdaterComputationGraph, updateGradientAccordingToParams
Constructor Detail

public LBFGS(NeuralNetConfiguration conf, StepFunction stepFunction, Collection<IterationListener> iterationListeners, Model model)

public LBFGS(NeuralNetConfiguration conf, StepFunction stepFunction, Collection<IterationListener> iterationListeners, Collection<TerminationCondition> terminationConditions, Model model)
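In typical DL4J usage these constructors are not invoked directly; the solver builds the optimizer from the network configuration. A minimal configuration sketch, assuming the 0.x-era `NeuralNetConfiguration.Builder` API and the `OptimizationAlgorithm.LBFGS` enum value (verify against the version you are using):

```java
import org.deeplearning4j.nn.api.OptimizationAlgorithm;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;

// Configuration fragment: selecting L-BFGS as the optimization algorithm.
// The solver instantiates LBFGS internally from this configuration.
NeuralNetConfiguration conf = new NeuralNetConfiguration.Builder()
        .optimizationAlgo(OptimizationAlgorithm.LBFGS)
        .maxNumLineSearchIterations(10)   // line-search budget per L-BFGS step
        .build();
```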
Method Detail

public void setupSearchState(Pair<Gradient,Double> pair)

Description copied from class: BaseOptimizer
Sets up the initial search state.
Specified by: setupSearchState in interface ConvexOptimizer
Overrides: setupSearchState in class BaseOptimizer
Parameters: pair - the gradient and score

public void preProcessLine()

Description copied from class: BaseOptimizer
Pre-processing that sets up the initial search-direction approximation.
Specified by: preProcessLine in interface ConvexOptimizer
Overrides: preProcessLine in class BaseOptimizer
public void postStep(org.nd4j.linalg.api.ndarray.INDArray gradient)

Description copied from class: BaseOptimizer
Post-step hook that updates the search direction with new gradient and parameter information.
Specified by: postStep in interface ConvexOptimizer
Overrides: postStep in class BaseOptimizer
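The search-direction update that `postStep` maintains is the heart of L-BFGS: the classic two-loop recursion approximates the product of the inverse Hessian with the gradient from a short history of parameter deltas (s) and gradient deltas (y). The sketch below uses plain `double[]` arrays for clarity; DL4J's actual implementation operates on `INDArray`s and internal search state, so every name here (`LbfgsSketch`, `searchDirection`) is illustrative, not part of the library API.

```java
import java.util.ArrayList;
import java.util.List;

public class LbfgsSketch {

    static double dot(double[] a, double[] b) {
        double s = 0;
        for (int i = 0; i < a.length; i++) s += a[i] * b[i];
        return s;
    }

    /**
     * Two-loop recursion: approximates -H^{-1} * grad from the stored
     * (s, y) pairs, oldest first. With an empty history it reduces to
     * steepest descent, i.e. it returns -grad.
     */
    static double[] searchDirection(double[] grad, List<double[]> sHist, List<double[]> yHist) {
        int m = sHist.size();
        double[] q = grad.clone();
        double[] alpha = new double[m];

        // Backward pass: newest pair first.
        for (int i = m - 1; i >= 0; i--) {
            double rho = 1.0 / dot(yHist.get(i), sHist.get(i));
            alpha[i] = rho * dot(sHist.get(i), q);
            for (int j = 0; j < q.length; j++) q[j] -= alpha[i] * yHist.get(i)[j];
        }

        // Initial inverse-Hessian scaling gamma * I (identity if no history yet).
        double gamma = m > 0
                ? dot(sHist.get(m - 1), yHist.get(m - 1)) / dot(yHist.get(m - 1), yHist.get(m - 1))
                : 1.0;
        double[] r = new double[q.length];
        for (int j = 0; j < r.length; j++) r[j] = gamma * q[j];

        // Forward pass: oldest pair first.
        for (int i = 0; i < m; i++) {
            double rho = 1.0 / dot(yHist.get(i), sHist.get(i));
            double beta = rho * dot(yHist.get(i), r);
            for (int j = 0; j < r.length; j++) r[j] += (alpha[i] - beta) * sHist.get(i)[j];
        }

        // Negate: descend rather than ascend.
        for (int j = 0; j < r.length; j++) r[j] = -r[j];
        return r;
    }

    public static void main(String[] args) {
        double[] g = {2.0, -4.0};
        // Empty history: the recursion reduces to steepest descent, d = -g.
        double[] d = searchDirection(g, new ArrayList<>(), new ArrayList<>());
        System.out.println(d[0] + " " + d[1]); // prints: -2.0 4.0
    }
}
```

After each accepted line-search step, a real implementation would append the new (s, y) pair and evict the oldest one once the history exceeds its fixed window.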
Copyright © 2016. All Rights Reserved.