Package parsimony :: Package algorithms :: Module primaldual :: Class ExcessiveGapMethod

Class ExcessiveGapMethod


            object --+        
                     |        
   bases.BaseAlgorithm --+    
                         |    
   bases.ExplicitAlgorithm --+
                             |
                object --+   |
                         |   |
  bases.IterativeAlgorithm --+
                             |
                object --+   |
                         |   |
bases.InformationAlgorithm --+
                             |
                            ExcessiveGapMethod

Nesterov's excessive gap method for strongly convex functions.

Parameters
----------
output : Boolean. Whether or not to return extra output information. If
        output is True, running the algorithm will return a tuple with two
        elements. The first element is the found regression vector, and the
        second is the extra output information.

eps : Positive float. Tolerance for the stopping criterion.

info : List or tuple of utils.consts.Info. What, if any, extra run
        information should be stored. Default is an empty list, which means
        that no run information is computed nor returned.

max_iter : Non-negative integer. Maximum allowed number of iterations.

min_iter : Non-negative integer less than or equal to max_iter. Minimum
        number of iterations that must be performed. Default is 1.
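The eps, max_iter and min_iter parameters interact in the usual way for iterative solvers: the algorithm runs at least min_iter iterations, at most max_iter, and stops as soon as the convergence measure drops below eps. A minimal self-contained sketch of that stopping logic (an illustration only, not the library's actual implementation; the step and gap here are hypothetical):

```python
def iterate(step, x0, eps=5e-8, max_iter=10000, min_iter=1):
    """Generic stopping logic: perform at least min_iter iterations,
    at most max_iter, and stop once the reported gap drops below eps."""
    x = x0
    for i in range(1, max_iter + 1):
        x, gap = step(x)
        if i >= min_iter and gap < eps:
            return x, i, True   # converged within tolerance
    return x, max_iter, False   # iteration budget exhausted

# Toy step that halves the gap each iteration (gap equals x here).
x, n, converged = iterate(lambda x: (x / 2.0, x / 2.0), 1.0)
```

With the defaults above the toy loop converges after 25 iterations, since 2^-25 is the first power of two below 5e-08.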

Instance Methods
 
__init__(self, eps=5e-08, info=[], max_iter=10000, min_iter=1, simulation=False)
x.__init__(...) initializes x; see help(type(x)) for signature
 
run(self, function, *args, **kwargs)
The excessive gap method for strongly convex functions.

Inherited from bases.BaseAlgorithm: get_params, set_params

Inherited from bases.IterativeAlgorithm: iter_reset

Inherited from bases.InformationAlgorithm: check_info_compatibility, info_copy, info_get, info_provided, info_requested, info_reset, info_set

Inherited from object: __delattr__, __format__, __getattribute__, __hash__, __new__, __reduce__, __reduce_ex__, __repr__, __setattr__, __sizeof__, __str__, __subclasshook__

Static Methods

Inherited from bases.BaseAlgorithm: check_compatibility

Class Variables
  INTERFACES = [<class 'parsimony.functions.properties.NesterovF...
  INFO_PROVIDED = ['ok', 'converged', 'num_iter', 'time', 'fvalu...
  __abstractmethods__ = frozenset([])
  _abc_negative_cache_version = 14

Inherited from bases.ExplicitAlgorithm: __metaclass__

Properties

Inherited from object: __class__

Method Details

__init__(self, eps=5e-08, info=[], max_iter=10000, min_iter=1, simulation=False)
(Constructor)


x.__init__(...) initializes x; see help(type(x)) for signature

Overrides: object.__init__
(inherited documentation)

run(self, function, *args, **kwargs)

The excessive gap method for strongly convex functions.

Parameters
----------
function : The function to minimise. It consists of two parts: function.g
        is the strongly convex part and function.h is the smoothed part
        of the function.

beta : Numpy array. A start vector. This is normally not given but left
        as None, since the start vector is computed by the algorithm.

Decorators:
  • @bases.force_reset
  • @bases.check_compatibility
Overrides: bases.ExplicitAlgorithm.run
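The calling convention described above can be mocked with a composite function object carrying the g and h parts. The following sketch is purely illustrative: the function, its parts, and run_mock are all hypothetical stand-ins, and the solver inside is plain bisection on the derivative, not the excessive gap method itself.

```python
import types

# Hypothetical composite function: g is the strongly convex part,
# h the smoothed part, as described above. Both are simple quadratics.
function = types.SimpleNamespace(
    g=lambda b: (b - 3.0) ** 2,   # strongly convex part
    h=lambda b: 0.1 * b * b,      # smoothed part
)

def run_mock(function, beta=None, eps=5e-8, max_iter=10000):
    """Mock of the run() calling convention only: the start vector beta
    may be None, in which case the algorithm chooses one. Minimises
    g + h by bisection on the derivative (NOT the excessive gap method)."""
    lo, hi = -100.0, 100.0            # bracket chosen by the "algorithm"
    deriv = lambda b: 2.0 * (b - 3.0) + 0.2 * b
    for _ in range(max_iter):
        if hi - lo < eps:
            break
        mid = 0.5 * (lo + hi)
        if deriv(mid) > 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

beta = run_mock(function)  # minimiser of (b - 3)^2 + 0.1 b^2 is 30/11
```

The returned vector plays the role of the regression vector; with output requested, the real method additionally returns the collected run information.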

Class Variable Details

INTERFACES

Value:
[<class 'parsimony.functions.properties.NesterovFunction'>,
 <class 'parsimony.functions.properties.GradientMap'>,
 <class 'parsimony.functions.properties.DualFunction'>,
 <class 'parsimony.functions.properties.StronglyConvex'>]
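The INTERFACES list enumerates the function properties a compatible objective must implement. A self-contained sketch of how such a compatibility check can work with abstract base classes (the interface classes and check_compatibility below are simplified stand-ins, not the library's definitions):

```python
from abc import ABC, abstractmethod

# Hypothetical stand-ins for two of the interface classes listed above.
class NesterovFunction(ABC):
    @abstractmethod
    def mu(self): ...

class StronglyConvex(ABC):
    @abstractmethod
    def parameter(self): ...

INTERFACES = [NesterovFunction, StronglyConvex]

class MyFunction(NesterovFunction, StronglyConvex):
    """A function object implementing every required interface."""
    def mu(self):
        return 5e-4
    def parameter(self):
        return 1.0

def check_compatibility(function, interfaces):
    """Return True iff the function implements every required interface."""
    return all(isinstance(function, i) for i in interfaces)

ok = check_compatibility(MyFunction(), INTERFACES)
```

A function object missing any of the interfaces (e.g. a plain object()) fails the check, which is how an algorithm can reject incompatible objectives before running.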

INFO_PROVIDED

Value:
['ok',
 'converged',
 'num_iter',
 'time',
 'fvalue',
 'mu',
 'bound',
 'gap',
...