Package parsimony :: Module estimators :: Class LinearRegressionL1L2TV

Class LinearRegressionL1L2TV


     object --+        
              |        
  BaseEstimator --+    
                  |    
RegressionEstimator --+
                      |
                     LinearRegressionL1L2TV

Linear regression with L1, L2 and TV penalties:

    f(beta, X, y) = (1 / (2 * n)) * ||X beta - y||²_2
                    + l1 * ||beta||_1
                    + (l2 / 2) * ||beta||²_2
                    + tv * TV(beta)
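As a concrete reading of the objective, the sketch below evaluates f(beta, X, y) with NumPy for a one-dimensional coefficient vector, taking TV(beta) as the 1D total variation (sum of absolute differences between consecutive coefficients). This is an illustration only, not the library's implementation: the estimator itself encodes TV through the linear operator A, which also covers 2D and 3D shapes.

```python
import numpy as np

def objective(beta, X, y, l1, l2, tv):
    """Evaluate the penalized least-squares objective shown above,
    with TV(beta) taken as the 1D total variation."""
    n = X.shape[0]
    loss = np.sum((X @ beta - y) ** 2) / (2.0 * n)       # (1/(2n)) * ||X beta - y||^2_2
    pen_l1 = l1 * np.sum(np.abs(beta))                   # l1 * ||beta||_1
    pen_l2 = (l2 / 2.0) * np.sum(beta ** 2)              # (l2/2) * ||beta||^2_2
    pen_tv = tv * np.sum(np.abs(np.diff(beta, axis=0)))  # tv * TV(beta), 1D case
    return float(loss + pen_l1 + pen_l2 + pen_tv)

# With beta = 0 only the data term remains: (1 / (2 * 4)) * ||y||^2 = 0.5.
X = np.ones((4, 3))
y = np.ones((4, 1))
print(objective(np.zeros((3, 1)), X, y, l1=0.1, l2=0.9, tv=1.0))  # 0.5
```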

Parameters
----------
l1 : Non-negative float. The L1 regularization parameter.

l2 : Non-negative float. The L2 regularization parameter.

tv : Non-negative float. The total variation regularization parameter.

A : Numpy or (usually) scipy.sparse array. The linear operator for the
        smoothed total variation Nesterov function. A must be given.

mu : Non-negative float. The regularization constant for the smoothing.

algorithm : ExplicitAlgorithm. The algorithm that should be applied.
        Should be one of:
            1. CONESTA(...)
            2. StaticCONESTA(...)
            3. FISTA(...)
            4. ISTA(...)
            5. ADMM(...)
            6. NaiveCONESTA(...)

        Default is CONESTA(...).

algorithm_params : A dict. The dictionary algorithm_params contains
        parameters that should be set in the algorithm. Passing
        algorithm=CONESTA(**params) is equivalent to passing
        algorithm=CONESTA() and algorithm_params=params. Default is an
        empty dictionary.
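The equivalence described above is plain keyword-argument unpacking. The toy class below is a hypothetical stand-in (not parsimony's CONESTA) that illustrates why passing the parameters at construction or afterwards via a dict ends in the same state.

```python
# Hypothetical stand-in for an algorithm class; parsimony's algorithms
# accept keyword parameters such as max_iter in the same way.
class ToyAlgorithm:
    def __init__(self, max_iter=100, eps=1e-5):
        self.max_iter = max_iter
        self.eps = eps

    def set_params(self, **params):
        # What the estimator effectively does with algorithm_params.
        for name, value in params.items():
            setattr(self, name, value)

params = {"max_iter": 1000}
a = ToyAlgorithm(**params)   # like algorithm=CONESTA(**params)
b = ToyAlgorithm()
b.set_params(**params)       # like algorithm=CONESTA(), algorithm_params=params
print(a.max_iter == b.max_iter == 1000)  # True
```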

penalty_start : Non-negative integer. The number of columns, variables
        etc., to be exempt from penalization. Equivalently, the first
        index to be penalized. Default is 0, i.e. all columns are penalized.

mean : Boolean. Whether to compute the squared loss or the mean squared
        loss. Default is True, the mean squared loss.

rho : Positive float. Regularization constant used only in ADMM. Default
        is 1.0.
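A minimal sketch of what penalty_start means, assuming the convention described above (penalties applied only to beta[penalty_start:]); this is an illustration, not the library's code.

```python
import numpy as np

def l1_penalty(beta, l1, penalty_start=0):
    """L1 penalty restricted to the penalized coefficients: the first
    penalty_start entries (e.g. an intercept or unpenalized covariates)
    do not contribute."""
    return float(l1 * np.sum(np.abs(beta[penalty_start:])))

beta = np.array([[5.0], [1.0], [-2.0], [3.0]])
print(l1_penalty(beta, l1=0.1))                   # all four entries: 1.1
print(l1_penalty(beta, l1=0.1, penalty_start=1))  # first entry exempt: 0.6
```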

Examples
--------
>>> import numpy as np
>>> import parsimony.estimators as estimators
>>> import parsimony.algorithms.proximal as proximal
>>> import parsimony.functions.nesterov.tv as total_variation
>>> shape = (1, 4, 4)
>>> n = 10
>>> p = shape[0] * shape[1] * shape[2]
>>>
>>> np.random.seed(42)
>>> X = np.random.rand(n, p)
>>> y = np.random.rand(n, 1)
>>> l1 = 0.1  # L1 coefficient
>>> l2 = 0.9  # Ridge coefficient
>>> tv = 1.0  # TV coefficient
>>> A, n_compacts = total_variation.linear_operator_from_shape(shape)
>>> lr = estimators.LinearRegressionL1L2TV(l1, l2, tv, A,
...                      algorithm=proximal.StaticCONESTA(max_iter=1000),
...                      mean=False)
>>> res = lr.fit(X, y)
>>> round(lr.score(X, y), 13)
0.0683842576534
>>>
>>> lr = estimators.LinearRegressionL1L2TV(l1, l2, tv, A,
...                     algorithm=proximal.CONESTA(max_iter=1000),
...                     mean=False)
>>> res = lr.fit(X, y)
>>> round(lr.score(X, y), 13)
0.0683583406798
>>>
>>> lr = estimators.LinearRegressionL1L2TV(l1, l2, tv, A,
...                                algorithm=proximal.FISTA(max_iter=1000),
...                                mean=False)
>>> lr = lr.fit(X, y)
>>> round(lr.score(X, y), 13)
1.5817577127184
>>>
>>> lr = estimators.LinearRegressionL1L2TV(l1, l2, tv, A,
...                                 algorithm=proximal.ISTA(max_iter=1000),
...                                 mean=False)
>>> lr = lr.fit(X, y)
>>> round(lr.score(X, y), 14)
2.07583068899674
>>>
>>> import parsimony.functions.nesterov.l1tv as l1tv
>>> np.random.seed(1337)
>>> A = l1tv.linear_operator_from_shape(shape, p, penalty_start=0)
>>> lr = estimators.LinearRegressionL1L2TV(l1, l2, tv, A,
...                                 algorithm=proximal.ADMM(max_iter=1000),
...                                 mean=False)
>>> lr = lr.fit(X, y)
>>> round(lr.score(X, y), 13)
0.0623552412543

Instance Methods
 
__init__(self, l1, l2, tv, A=None, mu=5e-08, algorithm=None, algorithm_params={}, penalty_start=0, mean=True, rho=1.0)
x.__init__(...) initializes x; see help(type(x)) for signature
 
get_params(self)
Return a dictionary containing all the estimator's parameters.
 
fit(self, X, y, beta=None)
Fit the estimator to the data.
 
score(self, X, y)
Return the mean squared error of the estimator.

Inherited from RegressionEstimator: predict

Inherited from BaseEstimator: get_info, parameters, set_params

Inherited from object: __delattr__, __format__, __getattribute__, __hash__, __new__, __reduce__, __reduce_ex__, __repr__, __setattr__, __sizeof__, __str__, __subclasshook__

Class Variables
  __abstractmethods__ = frozenset([])

Inherited from RegressionEstimator: __metaclass__

Properties

Inherited from object: __class__

Method Details

__init__(self, l1, l2, tv, A=None, mu=5e-08, algorithm=None, algorithm_params={}, penalty_start=0, mean=True, rho=1.0)
(Constructor)


x.__init__(...) initializes x; see help(type(x)) for signature

Overrides: object.__init__
(inherited documentation)

get_params(self)


Return a dictionary containing all the estimator's parameters.

Overrides: BaseEstimator.get_params

fit(self, X, y, beta=None)


Fit the estimator to the data.

Overrides: BaseEstimator.fit

score(self, X, y)


Return the mean squared error of the estimator.

Overrides: BaseEstimator.score
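For reference, score(X, y) reports the mean squared error of the fitted model's predictions. The sketch below spells out that definition, with y_pred standing in for the model's predictions (X @ beta for the learned beta); it is an illustration, not the library's implementation.

```python
import numpy as np

def mean_squared_error(y_true, y_pred):
    """Mean squared error: the average of the squared residuals."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return float(np.mean((y_true - y_pred) ** 2))

print(mean_squared_error([1.0, 2.0, 3.0], [1.0, 2.0, 5.0]))  # (0 + 0 + 4) / 3
```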