Package parsimony :: Package algorithms :: Module gradient :: Class GradientDescent

Class GradientDescent


            object --+        
                     |        
   bases.BaseAlgorithm --+    
                         |    
   bases.ExplicitAlgorithm --+
                             |
                object --+   |
                         |   |
  bases.IterativeAlgorithm --+
                             |
                object --+   |
                         |   |
bases.InformationAlgorithm --+
                             |
                            GradientDescent

The gradient descent algorithm.

Parameters
----------
eps : Positive float. Tolerance for the stopping criterion.

info : List or tuple of utils.consts.Info. What, if any, extra run
        information should be stored. Default is an empty list, which means
        that no run information is computed or returned.

max_iter : Non-negative integer. Maximum allowed number of iterations.
        Default is 20000.

min_iter : Non-negative integer less than or equal to max_iter. Minimum
        number of iterations that must be performed. Default is 1.

Examples
--------
>>> from parsimony.algorithms.gradient import GradientDescent
>>> from parsimony.functions.losses import RidgeRegression
>>> import numpy as np
>>> np.random.seed(42)
>>> X = np.random.rand(100, 50)
>>> y = np.random.rand(100, 1)
>>> gd = GradientDescent(max_iter=10000)
>>> function = RidgeRegression(X, y, k=0.0, mean=False)
>>> beta1 = gd.run(function, np.random.rand(50, 1))
>>> beta2 = np.dot(np.linalg.pinv(X), y)
>>> round(np.linalg.norm(beta1 - beta2), 13)
0.0003121557633

Instance Methods
 
__init__(self, eps=5e-08, info=[], max_iter=20000, min_iter=1)
x.__init__(...) initializes x; see help(type(x)) for signature
 
run(self, function, *args, **kwargs)
Find the minimiser of the given function, starting at beta.

Inherited from bases.BaseAlgorithm: get_params, set_params

Inherited from bases.IterativeAlgorithm: iter_reset

Inherited from bases.InformationAlgorithm: check_info_compatibility, info_copy, info_get, info_provided, info_requested, info_reset, info_set

Inherited from object: __delattr__, __format__, __getattribute__, __hash__, __new__, __reduce__, __reduce_ex__, __repr__, __setattr__, __sizeof__, __str__, __subclasshook__

Static Methods

Inherited from bases.BaseAlgorithm: check_compatibility

Class Variables
  INTERFACES = [<class 'parsimony.functions.properties.Function'...
  INFO_PROVIDED = ['ok', 'num_iter', 'time', 'fvalue', 'converged']
  __abstractmethods__ = frozenset([])

Inherited from bases.ExplicitAlgorithm: __metaclass__

Properties

Inherited from object: __class__

Method Details

__init__(self, eps=5e-08, info=[], max_iter=20000, min_iter=1)
(Constructor)


x.__init__(...) initializes x; see help(type(x)) for signature

Overrides: object.__init__
(inherited documentation)

run(self, function, *args, **kwargs)

Find the minimiser of the given function, starting at beta.

Parameters
----------
function : Function. The function to minimise.

beta : Numpy array. The start vector, passed as the first positional
        argument after function.

Decorators:
  • @bases.force_reset
  • @bases.check_compatibility
Overrides: bases.ExplicitAlgorithm.run
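
For the ridge-regression doctest above with k=0.0 (i.e. ordinary least squares), the loop that run performs can be sketched in plain NumPy. This is an illustrative re-implementation under the assumption of a fixed 1/L step size, where L is the squared spectral norm of X; the real algorithm obtains its step from the function object's StepSize interface:

```python
import numpy as np

# Same data as the doctest in the class documentation.
np.random.seed(42)
X = np.random.rand(100, 50)
y = np.random.rand(100, 1)

grad = lambda b: X.T.dot(X.dot(b) - y)  # gradient of 0.5 * ||X b - y||^2
L = np.linalg.norm(X, 2) ** 2           # Lipschitz constant of the gradient
step = 1.0 / L                          # assumed fixed step size for this sketch

beta = np.random.rand(50, 1)            # start vector
for _ in range(20000):
    beta = beta - step * grad(beta)

# The iterates should approach the pseudo-inverse solution.
beta_pinv = np.dot(np.linalg.pinv(X), y)
```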

Class Variable Details

INTERFACES

Value:
[<class 'parsimony.functions.properties.Function'>,
 <class 'parsimony.functions.properties.Gradient'>,
 <class 'parsimony.functions.properties.StepSize'>]
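
The INTERFACES list says that any function object passed to run must behave as a Function (it can compute its value), a Gradient (it can compute its gradient) and a StepSize (it can supply a step size). A hypothetical minimal object in that spirit, driven here by a plain gradient step loop standing in for GradientDescent.run (the method names mirror the interface names; the exact abstract signatures in parsimony.functions.properties may differ):

```python
import numpy as np

# Hypothetical minimal function object sketching the three interfaces;
# the exact abstract method signatures in parsimony.functions.properties
# may differ.
class Quadratic(object):
    """f(beta) = 0.5 * ||beta - target||^2."""

    def __init__(self, target):
        self.target = np.asarray(target, dtype=float)

    def f(self, beta):          # Function: the value to minimise.
        return 0.5 * np.sum((beta - self.target) ** 2)

    def grad(self, beta):       # Gradient: here simply beta - target.
        return beta - self.target

    def step(self, beta):       # StepSize: the gradient is 1-Lipschitz,
        return 1.0              # so a unit step is safe.

# A plain gradient step loop standing in for GradientDescent.run.
func = Quadratic([1.0, -2.0])
beta = np.zeros(2)
for _ in range(100):
    beta = beta - func.step(beta) * func.grad(beta)
```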