
Class RidgeRegression (parsimony.functions.losses)


                        object --+        
                                 |        
               properties.Function --+    
                                     |    
          properties.CompositeFunction --+
                                         |
                            object --+   |
                                     |   |
                   properties.Gradient --+
                                         |
                            object --+   |
                                     |   |
properties.LipschitzContinuousGradient --+
                                         |
                            object --+   |
                                     |   |
             properties.StronglyConvex --+
                                         |
                            object --+   |
                                     |   |
                   properties.StepSize --+
                                         |
                                        RidgeRegression

The Ridge Regression function, i.e. a representation of

    f(b) = (0.5 / n) * ||Xb - y||²_2 + k * 0.5 * ||b||²_2,

where ||.||_2 is the L2 norm, n is the number of samples and k is the
non-negative ridge parameter. With mean=False, the first term is the
squared loss, 0.5 * ||Xb - y||²_2, instead of the mean squared loss.
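
For illustration, a minimal doctest-style sketch (not part of the
original docstring) that checks the function value against this formula,
assuming the defaults mean=True and penalty_start=0:

>>> import numpy as np
>>> from parsimony.functions.losses import RidgeRegression
>>>
>>> np.random.seed(42)
>>> X = np.random.rand(10, 5)
>>> y = np.random.rand(10, 1)
>>> beta = np.random.rand(5, 1)
>>> n = X.shape[0]
>>> k = 0.5
>>> rr = RidgeRegression(X=X, y=y, k=k)
>>> # Evaluate the formula above directly with numpy.
>>> f_manual = ((0.5 / n) * np.linalg.norm(X.dot(beta) - y) ** 2
...             + (k / 2.0) * np.linalg.norm(beta) ** 2)
>>> abs(rr.f(beta) - f_manual) < 5e-8
True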

Instance Methods
 
__init__(self, X, y, k, penalty_start=0, mean=True)
Construct a RidgeRegression function from the data X and y and the ridge parameter k.
 
reset(self)
Free any cached computations from previous use of this Function.
 
f(self, beta)
Function value.
 
grad(self, beta)
Gradient of the function at beta.
 
L(self)
Lipschitz constant of the gradient.
 
lambda_min(*args, **kwargs)
Smallest eigenvalue of the corresponding covariance matrix.
 
parameter(self)
Returns the strongly convex parameter for the function.
 
step(self, beta, index=0)
The step size to use in descent methods.

Inherited from properties.Function: get_params, set_params

Inherited from properties.Gradient: approx_grad

Inherited from properties.LipschitzContinuousGradient: approx_L

Inherited from object: __delattr__, __format__, __getattribute__, __hash__, __new__, __reduce__, __reduce_ex__, __repr__, __setattr__, __sizeof__, __str__, __subclasshook__

Class Variables
  __abstractmethods__ = frozenset([])

Inherited from properties.CompositeFunction: __metaclass__

Properties

Inherited from object: __class__

Method Details

__init__(self, X, y, k, penalty_start=0, mean=True)
(Constructor)


Parameters
----------
X : Numpy array (n-by-p). The regressor matrix.

y : Numpy array (n-by-1). The regressand vector.

k : Non-negative float. The ridge parameter.

penalty_start : Non-negative integer. The number of columns, variables
        etc., to exempt from penalisation. Equivalently, the first
        index to be penalised. Default is 0, all columns are included.
        The sketch below illustrates this.

mean : Boolean. Whether to compute the squared loss or the mean
        squared loss. Default is True, the mean squared loss.

Overrides: object.__init__
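
As an illustration, a hypothetical sketch (not from the original
docstring) of the effect of penalty_start, under the assumption that the
penalty term applies only to beta[penalty_start:] while the data term
uses all of beta:

>>> import numpy as np
>>> from parsimony.functions.losses import RidgeRegression
>>>
>>> np.random.seed(42)
>>> X = np.random.rand(10, 5)
>>> y = np.random.rand(10, 1)
>>> beta = np.random.rand(5, 1)
>>> n = X.shape[0]
>>> k = 0.5
>>> # Exempt the first column (e.g. an intercept) from the penalty.
>>> rr = RidgeRegression(X=X, y=y, k=k, penalty_start=1)
>>> f_manual = ((0.5 / n) * np.linalg.norm(X.dot(beta) - y) ** 2
...             + (k / 2.0) * np.linalg.norm(beta[1:]) ** 2)
>>> abs(rr.f(beta) - f_manual) < 5e-8
True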

reset(self)


Free any cached computations from previous use of this Function.

From the interface "Function".

Overrides: properties.Function.reset

f(self, beta)

Function value.

From the interface "Function".

Parameters
----------
beta : Numpy array. Regression coefficient vector. The point at which
        to evaluate the function.

Overrides: properties.Function.f

grad(self, beta)

Gradient of the function at beta.

From the interface "Gradient".

Parameters
----------
beta : Numpy array. The point at which to evaluate the gradient.

Examples
--------
>>> import numpy as np
>>> from parsimony.functions.losses import RidgeRegression
>>>
>>> np.random.seed(42)
>>> X = np.random.rand(100, 150)
>>> y = np.random.rand(100, 1)
>>> rr = RidgeRegression(X=X, y=y, k=3.14159265)
>>> beta = np.random.rand(150, 1)
>>> round(np.linalg.norm(rr.grad(beta)
...       - rr.approx_grad(beta, eps=1e-4)), 9)
1.3e-08

Overrides: properties.Gradient.grad

L(self)


Lipschitz constant of the gradient.

From the interface "LipschitzContinuousGradient".

Overrides: properties.LipschitzContinuousGradient.L
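
With mean=True and penalty_start=0, the gradient is
(1 / n) * X'(Xb - y) + k * b, so its Lipschitz constant is
lambda_max(X'X) / n + k. A doctest-style sketch (not part of the
original docstring), assuming L() returns exactly this quantity:

>>> import numpy as np
>>> from parsimony.functions.losses import RidgeRegression
>>>
>>> np.random.seed(42)
>>> X = np.random.rand(10, 5)
>>> y = np.random.rand(10, 1)
>>> k = 0.5
>>> rr = RidgeRegression(X=X, y=y, k=k)
>>> # lambda_max(X'X) is the largest singular value of X, squared.
>>> lambda_max = np.max(np.linalg.svd(X, compute_uv=False)) ** 2
>>> abs(rr.L() - (lambda_max / X.shape[0] + k)) < 5e-8
True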

lambda_min(*args, **kwargs)


Smallest eigenvalue of the corresponding covariance matrix.

From the interface "Eigenvalues".

Decorators:
  • @utils.deprecated("StronglyConvex.parameter")

parameter(self)


Returns the strongly convex parameter for the function.

From the interface "StronglyConvex".

Overrides: properties.StronglyConvex.parameter
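
With mean=True and penalty_start=0, the function is strongly convex
with parameter lambda_min(X'X) / n + k. A doctest-style sketch (not part
of the original docstring), assuming parameter() returns exactly this
quantity:

>>> import numpy as np
>>> from parsimony.functions.losses import RidgeRegression
>>>
>>> np.random.seed(42)
>>> X = np.random.rand(10, 5)
>>> y = np.random.rand(10, 1)
>>> k = 0.5
>>> rr = RidgeRegression(X=X, y=y, k=k)
>>> # lambda_min(X'X) is the smallest singular value of X, squared.
>>> lambda_min = np.min(np.linalg.svd(X, compute_uv=False)) ** 2
>>> abs(rr.parameter() - (lambda_min / X.shape[0] + k)) < 5e-8
True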

step(self, beta, index=0)

The step size to use in descent methods.

Parameters
----------
beta : Numpy array. The point at which to determine the step size.

Overrides: properties.StepSize.step
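
Since the gradient is Lipschitz continuous with constant L, the step
size 1 / L guarantees descent in gradient methods. A doctest-style
sketch (not part of the original docstring), assuming step() returns
this 1 / L step:

>>> import numpy as np
>>> from parsimony.functions.losses import RidgeRegression
>>>
>>> np.random.seed(42)
>>> X = np.random.rand(10, 5)
>>> y = np.random.rand(10, 1)
>>> beta = np.random.rand(5, 1)
>>> rr = RidgeRegression(X=X, y=y, k=0.5)
>>> abs(rr.step(beta) - 1.0 / rr.L()) < 5e-12
True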