Solve the convex optimization problem.
$$
A \;=\; \begin{bmatrix} K + \gamma I & \mathbf{1} \\ \mathbf{1}^{\top} & 0 \end{bmatrix},
\qquad
b \;=\; \begin{bmatrix} y \\ 0 \end{bmatrix}
$$
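As an illustration, the following sketch assembles this block system with Breeze. The helper name `buildSystem` and its arguments (`K`, `y`, `gamma`) are hypothetical and not part of the library's API.

```scala
import breeze.linalg.{DenseMatrix, DenseVector}

// Hypothetical helper (not the library's API): assemble the (n+1) x (n+1)
// KKT matrix A and right-hand side b from a precomputed kernel matrix K,
// the targets y, and the regularization parameter gamma.
def buildSystem(K: DenseMatrix[Double],
                y: DenseVector[Double],
                gamma: Double): (DenseMatrix[Double], DenseVector[Double]) = {
  val n = K.rows
  val A = DenseMatrix.tabulate[Double](n + 1, n + 1) { (i, j) =>
    if (i < n && j < n) K(i, j) + (if (i == j) gamma else 0.0) // K + gamma * I
    else if (i < n && j == n) 1.0                              // last column: 1
    else if (i == n && j < n) 1.0                              // last row: 1^T
    else 0.0                                                   // bottom-right: 0
  }
  val b = DenseVector.vertcat(y, DenseVector.zeros[Double](1)) // [y; 0]
  (A, b)
}
```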
The number of data points, which also determines the size of the matrix A.
The components of the linear system (A, b) as a tuple.
An initial estimate of the linear system solution; this parameter is redundant for LSSVMLinearSolver because the exact solution is computed.
The exact solution $A^{-1}b$.
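A minimal sketch of computing this return value with Breeze's dense solver; the names are illustrative, and, as noted above, no initial estimate or iteration is needed.

```scala
import breeze.linalg._

// Illustrative only: given the assembled system (A, b), the exact solution
// A^{-1} b is obtained with a dense solve of A x = b.
def solveExactly(A: DenseMatrix[Double], b: DenseVector[Double]): DenseVector[Double] =
  A \ b // Breeze's backslash operator solves the linear system
```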
Set the fraction of data to be used for each SGD iteration. Default 1.0 (corresponding to deterministic/classical gradient descent).
Set the number of iterations for SGD. Default 100.
Set the regularization parameter. Default 0.0.
Set the initial step size of SGD for the first step. Default 1.0. In subsequent steps, the step size decreases as stepSize / sqrt(t).
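For intuition, here is a minimal sketch of how these parameters could drive a mini-batch SGD solver for the same linear system, assuming a squared-residual objective 0.5 * ||Ax - b||^2 plus an L2 penalty. This is not the library's implementation; the function name and its defaults merely mirror the parameters documented above.

```scala
import breeze.linalg.{DenseMatrix, DenseVector}
import scala.util.Random

// Illustrative sketch (not the library's implementation): mini-batch SGD on
// 0.5 * ||A x - b||^2 + 0.5 * regParam * ||x||^2 with a decaying step size.
def sgdSolve(A: DenseMatrix[Double],
             b: DenseVector[Double],
             numIterations: Int = 100,
             miniBatchFraction: Double = 1.0,
             regParam: Double = 0.0,
             stepSize: Double = 1.0,
             seed: Long = 42L): DenseVector[Double] = {
  val rng = new Random(seed)
  val n = A.rows
  val x = DenseVector.zeros[Double](A.cols)   // start from the zero vector
  for (t <- 1 to numIterations) {
    // Sample a fraction of the rows of (A, b) for this iteration.
    val rows = (0 until n).filter(_ => rng.nextDouble() < miniBatchFraction)
    if (rows.nonEmpty) {
      val grad = DenseVector.zeros[Double](A.cols)
      for (i <- rows) {
        val ai = A(i, ::).t               // i-th row of A as a column vector
        val residual = (ai dot x) - b(i)  // a_i^T x - b_i
        grad += ai * residual             // gradient of 0.5 * residual^2
      }
      grad :*= 1.0 / rows.size
      grad += x * regParam                               // L2 regularization term
      x -= grad * (stepSize / math.sqrt(t.toDouble))     // stepSize / sqrt(t) decay
    }
  }
  x
}
```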
Solves the linear problem resulting from applying the Karush-Kuhn-Tucker conditions to the dual Least Squares SVM optimization problem.
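For reference, a sketch of how this KKT system arises, assuming the LSSVM primal in function-estimation form with the error penalty weighted by $1/(2\gamma)$ (this matches the $K + \gamma I$ block above; other conventions place the regularization as $I/\gamma$ instead):

$$
\min_{w,\,b,\,e}\ \tfrac{1}{2} w^{\top} w + \tfrac{1}{2\gamma} \sum_{i=1}^{n} e_i^{2}
\quad \text{s.t.} \quad y_i = w^{\top}\varphi(x_i) + b + e_i,\ \ i = 1,\dots,n .
$$

Setting the derivatives of the Lagrangian $\mathcal{L} = \tfrac{1}{2} w^{\top} w + \tfrac{1}{2\gamma}\sum_i e_i^{2} - \sum_i \alpha_i \bigl(w^{\top}\varphi(x_i) + b + e_i - y_i\bigr)$ to zero gives the KKT conditions

$$
w = \sum_i \alpha_i \varphi(x_i), \qquad \sum_i \alpha_i = 0, \qquad e_i = \gamma \alpha_i, \qquad w^{\top}\varphi(x_i) + b + e_i = y_i .
$$

Eliminating $w$ and $e$ with $K_{ij} = \varphi(x_i)^{\top}\varphi(x_j)$ yields $(K + \gamma I)\alpha + b\,\mathbf{1} = y$ and $\mathbf{1}^{\top}\alpha = 0$, i.e. exactly the linear system $A\,[\alpha;\,b] = [y;\,0]$ defined above.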