Package parsimony :: Package algorithms :: Module proximal :: Class StaticCONESTA

Class StaticCONESTA


            object --+        
                     |        
   bases.BaseAlgorithm --+    
                         |    
   bases.ExplicitAlgorithm --+
                             |
                object --+   |
                         |   |
  bases.IterativeAlgorithm --+
                             |
                object --+   |
                         |   |
bases.InformationAlgorithm --+
                             |
                            StaticCONESTA

COntinuation with NEsterov smoothing in a Soft-Thresholding Algorithm,
or CONESTA for short, with a statically decreasing sequence of eps and mu.

Parameters
----------
mu_min : Non-negative float. A "very small" mu to use as a lower bound for
        mu.

tau : Float, 0 < tau < 1. The rate at which eps decreases. Default
        is 0.5.

exponent : Float, in [1.001, 2.0]. The assumed convergence rate of
        ||beta* - beta_k||_2 for k=1,2,... is O(1 / k^exponent). Default
        is 1.5.

eps : Positive float. Tolerance for the stopping criterion.

info : List or tuple of utils.Info. What, if any, extra run information
        should be stored. Default is an empty list, which means that no
        run information is computed nor returned.

max_iter : Non-negative integer. Maximum allowed number of iterations.

min_iter : Non-negative integer less than or equal to max_iter. Minimum
        number of iterations that must be performed. Default is 1.
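
As a rough illustration of the tau parameter, the statically decreasing eps sequence follows a geometric schedule: each continuation tightens the tolerance by a factor of tau until the lower bound is reached. The sketch below uses hypothetical names and is not parsimony's actual code:

```python
def static_eps_schedule(eps0, tau=0.5, eps_min=5e-08, max_continuations=1000):
    """Geometric tolerance schedule: eps_k = eps0 * tau**k.

    Each continuation multiplies the previous tolerance by tau
    (0 < tau < 1) until it falls at or below eps_min.
    """
    schedule = [eps0]
    while schedule[-1] > eps_min and len(schedule) < max_continuations:
        schedule.append(schedule[-1] * tau)
    return schedule

# With the defaults (tau=0.5), the tolerance halves at every continuation:
sched = static_eps_schedule(1.0, tau=0.5, eps_min=5e-08)
```

A smaller tau tightens eps faster (fewer continuations, each solved more coarsely); a tau close to 1 gives many gentle continuations.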

Example
-------
>>> from parsimony.algorithms.proximal import StaticCONESTA
>>> from parsimony.functions.nesterov import l1tv
>>> from parsimony.functions import LinearRegressionL1L2TV
>>> import scipy.sparse as sparse
>>> import numpy as np
>>>
>>> np.random.seed(42)
>>> X = np.random.rand(100, 50)
>>> y = np.random.rand(100, 1)
>>> A = sparse.csr_matrix((50, 50))  # Unused here
>>> function = LinearRegressionL1L2TV(X, y, 0.0, 0.0, 0.0,
...                                   A=[A], mu=0.0)
>>> static_conesta = StaticCONESTA(max_iter=10000)
>>> beta1 = static_conesta.run(function, np.random.rand(50, 1))
>>> beta2 = np.dot(np.linalg.pinv(X), y)
>>> round(np.linalg.norm(beta1 - beta2), 13)
3.0183961e-06
>>>
>>> np.random.seed(42)
>>> X = np.random.rand(100, 50)
>>> y = np.random.rand(100, 1)
>>> A = sparse.csr_matrix((50, 50))
>>> function = LinearRegressionL1L2TV(X, y, 0.1, 0.0, 0.0,
...                                   A=[A], mu=0.0)
>>> static_conesta = StaticCONESTA(max_iter=10000)
>>> beta1 = static_conesta.run(function, np.random.rand(50, 1))
>>> beta2 = np.dot(np.linalg.pinv(X), y)
>>> round(np.linalg.norm(beta1 - beta2), 13)
0.8272329573827
>>> np.linalg.norm(beta2.ravel(), 0)
50
>>> np.linalg.norm(beta1.ravel(), 0)
7
>>>
>>> np.random.seed(42)
>>> X = np.random.rand(100, 50)
>>> y = np.random.rand(100, 1)
>>> A = l1tv.linear_operator_from_shape((1, 1, 50), 50)
>>> function = LinearRegressionL1L2TV(X, y, 0.1, 0.1, 0.1,
...                                   A=A, mu=0.0)
>>> static_conesta = StaticCONESTA(max_iter=10000)
>>> beta1 = static_conesta.run(function, np.zeros((50, 1)))
>>> beta2 = np.dot(np.linalg.pinv(X), y)
>>> round(np.linalg.norm(beta1 - beta2), 13)
0.9662907379987
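
The sparsity visible above (7 nonzero coefficients out of 50 once the L1 penalty is turned on) comes from the soft-thresholding operator in the algorithm's name. A plain-NumPy illustration of that proximal operator (not parsimony's implementation):

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t * ||.||_1: shrink every entry toward zero
    by t, mapping entries with |x_i| <= t exactly to zero (hence sparsity)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

v = np.array([3.0, -0.5, 0.2, -2.0])
soft_threshold(v, 1.0)  # -> [2.0, 0.0, 0.0, -1.0]
```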

Instance Methods
 
__init__(self, mu_min=5e-08, tau=0.5, exponent=1.5, info=[], eps=5e-08, max_iter=10000, min_iter=1, simulation=False)
x.__init__(...) initializes x; see help(type(x)) for signature
 
_harmonic_number_approx(self)
 
_approximate_eps(self, function, beta0)
 
run(self, function, *args, **kwargs)
This function obtains a minimiser of a given function.

Inherited from bases.BaseAlgorithm: get_params, set_params

Inherited from bases.IterativeAlgorithm: iter_reset

Inherited from bases.InformationAlgorithm: check_info_compatibility, info_copy, info_get, info_provided, info_requested, info_reset, info_set

Inherited from object: __delattr__, __format__, __getattribute__, __hash__, __new__, __reduce__, __reduce_ex__, __repr__, __setattr__, __sizeof__, __str__, __subclasshook__

Static Methods

Inherited from bases.BaseAlgorithm: check_compatibility

Class Variables
  INTERFACES = [<class 'parsimony.functions.properties.NesterovF...
  INFO_PROVIDED = ['ok', 'converged', 'num_iter', 'continuations...
  __abstractmethods__ = frozenset([])
  _abc_negative_cache_version = 14

Inherited from bases.ExplicitAlgorithm: __metaclass__

Properties

Inherited from object: __class__

Method Details

__init__(self, mu_min=5e-08, tau=0.5, exponent=1.5, info=[], eps=5e-08, max_iter=10000, min_iter=1, simulation=False)
(Constructor)


x.__init__(...) initializes x; see help(type(x)) for signature

Overrides: object.__init__
(inherited documentation)

run(self, function, *args, **kwargs)

This function obtains a minimiser of a given function.

Parameters
----------
function : The function to minimise.

x : A starting point.

Decorators:
  • @bases.force_reset
  • @bases.check_compatibility
Overrides: bases.ExplicitAlgorithm.run
(inherited documentation)
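
The helper `_harmonic_number_approx` listed above suggests that the total error budget is split across continuations using a generalized harmonic number H_n^(s) = sum_{k=1}^n 1/k**s, consistent with the assumed O(1 / k^exponent) convergence rate. The following is a sketch of one standard way to approximate such a sum via the integral test; it is an assumption about the approach, not the library's code:

```python
def harmonic_number(s, n):
    """Exact partial generalized harmonic number: H_n^(s) = sum_{k=1}^n 1/k**s."""
    return sum(1.0 / k ** s for k in range(1, n + 1))

def harmonic_number_upper(s, n):
    """Integral-test upper bound for H_n^(s) when s > 1:

        H_n^(s) <= 1 + (1 - n**(1 - s)) / (s - 1),

    since sum_{k=2}^n 1/k**s <= integral from 1 to n of x**(-s) dx.
    """
    return 1.0 + (1.0 - n ** (1.0 - s)) / (s - 1.0)
```

With s = 1.5 (the default exponent), the closed-form bound stays within about 0.4 of the exact sum at n = 1000, while avoiding the O(n) summation.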

Class Variable Details

INTERFACES

Value:
[<class 'parsimony.functions.properties.NesterovFunction'>,
 <class 'parsimony.functions.properties.StepSize'>,
 <class 'parsimony.functions.properties.ProximalOperator'>,
 <class 'parsimony.functions.properties.Continuation'>,
 <class 'parsimony.functions.properties.DualFunction'>]

INFO_PROVIDED

Value:
['ok',
 'converged',
 'num_iter',
 'continuations',
 'time',
 'fvalue',
 'func_val',
 'mu']