# Algorithms

The first argument of an `Algorithm`'s constructor is an `SModel`. This ensures the algorithm's storage buffers are allocated with the correct size.
## ProxGrad

`SparseRegression.ProxGrad` — Type

    ProxGrad(model, step = 1.0)

Proximal gradient method with step size `step`. Works for any loss and any penalty with a `prox` method.
### Example

```julia
x, y, β = SparseRegression.fakedata(L2DistLoss(), 1000, 10)
s = SModel(x, y, L2DistLoss())
strat = strategy(MaxIter(50), ProxGrad(s))
learn!(s, strat)
```
## Fista

`SparseRegression.Fista` — Type

    Fista(model, step = 1.0)

Accelerated proximal gradient method (FISTA). Works for any loss and any penalty with a `prox` method.
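`Fista` follows the same usage pattern as the other iterative algorithms. A sketch, mirroring the data setup used in the `ProxGrad` example:

```julia
x, y, β = SparseRegression.fakedata(L2DistLoss(), 1000, 10)
s = SModel(x, y, L2DistLoss())
strat = strategy(MaxIter(50), Fista(s))
learn!(s, strat)
```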
## AdaptiveProxGrad

`SparseRegression.AdaptiveProxGrad` — Type

    AdaptiveProxGrad(s, divisor = 1.5, init = 1.0)

Proximal gradient method with adaptive step sizes. `AdaptiveProxGrad` uses element-wise learning rates: every time the sign of a coefficient switches, the step size for that coefficient is divided by `divisor`.
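Usage follows the same pattern as the other iterative algorithms. A sketch, reusing the data setup from the examples above (note that `AdaptiveProxGrad` takes the model rather than a fixed step size):

```julia
x, y, β = SparseRegression.fakedata(L2DistLoss(), 1000, 10)
s = SModel(x, y, L2DistLoss())
strat = strategy(MaxIter(50), AdaptiveProxGrad(s))
learn!(s, strat)
```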
## GradientDescent

`SparseRegression.GradientDescent` — Type

    GradientDescent(model, step = 1.0)

Gradient descent with step size `step`. Works for any loss and any penalty.
### Example

```julia
x, y, β = SparseRegression.fakedata(L2DistLoss(), 1000, 10)
s = SModel(x, y, L2DistLoss())
strat = strategy(MaxIter(50), GradientDescent(s))
learn!(s, strat)
```
## Sweep

`SparseRegression.Sweep` — Type

    Sweep(model)

Linear/ridge regression via the sweep operator. Works for (scaled) `L2DistLoss` with `NoPenalty` or `L2Penalty`. The `Sweep` algorithm has a closed-form solution and is complete after one iteration, so it doesn't need additional learning strategies such as `MaxIter`, `Converged`, etc.
### Example

```julia
x, y, β = SparseRegression.fakedata(L2DistLoss(), 1000, 10)
s = SModel(x, y, L2DistLoss())
learn!(s, Sweep(s))
```
## LinRegCholesky

`SparseRegression.LinRegCholesky` — Type

    LinRegCholesky(model)

Linear/ridge regression via Cholesky decomposition. Works for (scaled) `L2DistLoss` with `NoPenalty` or `L2Penalty`. The `LinRegCholesky` algorithm has a closed-form solution and is complete after one iteration, so it doesn't need additional learning strategies such as `MaxIter`, `Converged`, etc.
### Example

```julia
x, y, β = SparseRegression.fakedata(L2DistLoss(), 1000, 10)
s = SModel(x, y, L2DistLoss())
learn!(s, LinRegCholesky(s))
```
## LineSearch

`SparseRegression.LineSearch` — Type

    LineSearch(algorithm)

Use a line search in the `update!` of `algorithm`. Currently, `ProxGrad`, `Fista`, and `GradientDescent` are supported.
### Example

```julia
x, y, β = SparseRegression.fakedata(L2DistLoss(), 1000, 10)
s = SModel(x, y, L2DistLoss())
strat = strategy(MaxIter(50), LineSearch(ProxGrad(s)))
learn!(s, strat)
```