Class GradientDescent (module parsimony.algorithms.gradient)

object --+
         |
         bases.BaseAlgorithm --+
                               |
                               bases.ExplicitAlgorithm --+
                                                         |
object --+                                               |
         |                                               |
         bases.IterativeAlgorithm -----------------------+
                                                         |
object --+                                               |
         |                                               |
         bases.InformationAlgorithm ---------------------+
                                                         |
                                                GradientDescent
The gradient descent algorithm.

Parameters
----------
eps : Positive float. Tolerance for the stopping criterion.

info : List or tuple of utils.consts.Info. What, if any, extra run
    information should be stored. Default is an empty list, which means
    that no run information is computed nor returned.

max_iter : Non-negative integer. Maximum allowed number of iterations.
    Default is 20000.

min_iter : Non-negative integer less than or equal to max_iter. Minimum
    number of iterations that must be performed. Default is 1.

Examples
--------
>>> from parsimony.algorithms.gradient import GradientDescent
>>> from parsimony.functions.losses import RidgeRegression
>>> import numpy as np
>>> np.random.seed(42)
>>> X = np.random.rand(100, 50)
>>> y = np.random.rand(100, 1)
>>> gd = GradientDescent(max_iter=10000)
>>> function = RidgeRegression(X, y, k=0.0, mean=False)
>>> beta1 = gd.run(function, np.random.rand(50, 1))
>>> beta2 = np.dot(np.linalg.pinv(X), y)
>>> round(np.linalg.norm(beta1 - beta2), 13)
0.0003121557633
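The roles of eps, min_iter and max_iter can be illustrated with a plain NumPy version of the gradient step beta <- beta - step * grad(beta). This is a minimal sketch, not the class's actual internals: the fixed 1/L step size and the least-squares gradient below are illustrative assumptions.

```python
import numpy as np

def gradient_descent(grad, beta, step, eps=1e-5, min_iter=1, max_iter=20000):
    """Iterate beta <- beta - step * grad(beta) until the update is
    smaller than eps, but never before min_iter nor past max_iter."""
    for k in range(1, max_iter + 1):
        beta_new = beta - step * grad(beta)
        if k >= min_iter and np.linalg.norm(beta_new - beta) < eps:
            return beta_new
        beta = beta_new
    return beta

# Least-squares toy problem: f(beta) = 0.5 * ||X.beta - y||^2.
rng = np.random.RandomState(42)
X = rng.rand(20, 5)
y = rng.rand(20, 1)
grad = lambda b: X.T.dot(X.dot(b) - y)

# Step size 1/L, where L = ||X||_2^2 is the Lipschitz constant of grad.
step = 1.0 / np.linalg.norm(X, 2) ** 2
beta = gradient_descent(grad, rng.rand(5, 1), step)
```

As in the doctest above, the result can be checked against the pseudo-inverse solution np.linalg.pinv(X).dot(y).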
Instance Methods
----------------
__init__(...)
run(function, beta)

Class Variables
---------------
INTERFACES
INFO_PROVIDED
__abstractmethods__
Method Details
--------------
__init__(...)
    x.__init__(...) initializes x; see help(type(x)) for signature.
run(function, beta)
    Find the minimiser of the given function, starting at beta.

    Parameters
    ----------
    function : Function. The function to minimise.

    beta : Numpy array. The start vector.
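As a sketch of this call contract, a function object exposing f and grad (method names assumed from parsimony's losses module) can be minimised by a loop of the following shape. This is an illustration of the interface, not the class's actual implementation; the Quadratic class and the fixed step size are hypothetical.

```python
import numpy as np

class Quadratic:
    """Toy loss f(beta) = 0.5 * ||beta - centre||^2, mimicking the
    f/grad interface assumed for parsimony loss functions."""
    def __init__(self, centre):
        self.centre = centre

    def f(self, beta):
        return 0.5 * float(np.sum((beta - self.centre) ** 2))

    def grad(self, beta):
        return beta - self.centre

def run(function, beta, step=0.5, eps=1e-8, max_iter=20000):
    # Minimise `function` starting at `beta` with plain gradient steps.
    for _ in range(max_iter):
        beta_new = beta - step * function.grad(beta)
        if np.linalg.norm(beta_new - beta) < eps:
            return beta_new
        beta = beta_new
    return beta

centre = np.full((3, 1), 2.0)
beta = run(Quadratic(centre), np.zeros((3, 1)))
```

The minimiser of this toy loss is the centre itself, so the returned beta converges to it.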
Generated by Epydoc 3.0.1 on Mon Apr 6 23:52:10 2015. http://epydoc.sourceforge.net