
Class LatentVariableVariance

Loss function used to maximise the variance of a latent variable t = Xw; its value is f(w) = -w'X'Xw / (n - 1) with the default unbiased=True (see the examples under f below).

                            object --+    
                                     |    
                   properties.Function --+
                                         |
                            object --+   |
                                     |   |
                   properties.Gradient --+
                                         |
                            object --+   |
                                     |   |
                   properties.StepSize --+
                                         |
                            object --+   |
                                     |   |
properties.LipschitzContinuousGradient --+
                                         |
                                        LatentVariableVariance

Nested Classes

Inherited from properties.Function: __metaclass__

Instance Methods
 
__init__(self, X, unbiased=True)
x.__init__(...) initializes x; see help(type(x)) for signature

reset(self)
Free any cached computations from previous use of this Function.

f(self, w)
Function value.

grad(self, w)
Gradient of the function.

L(self)
Lipschitz constant of the gradient.

step(self, w, index=0)
The step size to use in descent methods.

Inherited from properties.Function: get_params, set_params

Inherited from properties.Gradient: approx_grad

Inherited from properties.LipschitzContinuousGradient: approx_L

Inherited from object: __delattr__, __format__, __getattribute__, __hash__, __new__, __reduce__, __reduce_ex__, __repr__, __setattr__, __sizeof__, __str__, __subclasshook__

Class Variables
  __abstractmethods__ = frozenset([])
Properties

Inherited from object: __class__

Method Details

__init__(self, X, unbiased=True)
(Constructor)

x.__init__(...) initializes x; see help(type(x)) for signature

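Examples
--------
A minimal construction sketch (editorial example, not from the source
docstring): X is the data matrix as a numpy array, and with the default
unbiased=True the function value matches -w'X'Xw / (n - 1), as in the
example under f below.

>>> import numpy as np
>>> from parsimony.functions.losses import LatentVariableVariance
>>>
>>> np.random.seed(42)
>>> X = np.random.rand(10, 5)   # n = 10 samples, 5 variables.
>>> w = np.random.rand(5, 1)
>>> var = LatentVariableVariance(X, unbiased=True)
>>> # Editorial check: value agrees with -w'X'Xw / (n - 1), here n - 1 = 9.
>>> expected = -np.dot(w.T, np.dot(X.T, np.dot(X, w)))[0, 0] / 9.0
>>> abs(var.f(w) - expected) < 5e-12
True
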
Overrides: object.__init__
(inherited documentation)

reset(self)

Free any cached computations from previous use of this Function.

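Examples
--------
A usage sketch (editorial example, not from the source docstring): reset()
discards any internally cached quantities so that the instance can be reused
cleanly, e.g. before evaluating the function at new points.

>>> import numpy as np
>>> from parsimony.functions.losses import LatentVariableVariance
>>>
>>> np.random.seed(42)
>>> X = np.random.rand(50, 150)
>>> w = np.random.rand(150, 1)
>>> var = LatentVariableVariance(X)
>>> _ = var.f(w)    # May populate internal caches.
>>> var.reset()     # Free any cached computations.
>>> var.f(w) < 0.0  # The instance is still usable after reset().
True
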
Overrides: properties.Function.reset
(inherited documentation)

f(self, w)

Function value.

From the interface "Function".

Examples
--------
>>> import numpy as np
>>> from parsimony.functions.losses import LatentVariableVariance
>>>
>>> np.random.seed(1337)
>>> X = np.random.rand(50, 150)
>>> w = np.random.rand(150, 1)
>>> var = LatentVariableVariance(X)
>>> round(var.f(w), 12)
-1295.854475188615
>>> round(-np.dot(w.T, np.dot(X.T, np.dot(X, w)))[0, 0] / 49.0, 12)
-1295.854475188615

Overrides: properties.Function.f

grad(self, w)

Gradient of the function.

From the interface "Gradient".

Parameters
----------
w : Numpy array. The point at which to evaluate the gradient.

Examples
--------
>>> import numpy as np
>>> from parsimony.functions.losses import LatentVariableVariance
>>>
>>> np.random.seed(42)
>>> X = np.random.rand(50, 150)
>>> var = LatentVariableVariance(X)
>>> w = np.random.rand(150, 1)
>>> np.linalg.norm(var.grad(w) - var.approx_grad(w, eps=1e-4)) < 5e-8
True

Overrides: properties.Gradient.grad

L(self)

Lipschitz constant of the gradient.

From the interface "LipschitzContinuousGradient".

Examples
--------
>>> import numpy as np
>>> from parsimony.functions.losses import LatentVariableVariance
>>>
>>> np.random.seed(1337)
>>> X = np.random.rand(50, 150)
>>> w = np.random.rand(150, 1)
>>> var = LatentVariableVariance(X)
>>> round(var.L(), 10)
47025.0809786841
>>> _, S, _ = np.linalg.svd(np.dot(X.T, X))
>>> round(np.max(S) * 49 / 2.0, 10)
47025.0809786841

Overrides: properties.LipschitzContinuousGradient.L

step(self, w, index=0)

The step size to use in descent methods.

Parameters
----------
w : Numpy array. The point at which to determine the step size.

Examples
--------
>>> import numpy as np
>>> from parsimony.functions.losses import LatentVariableVariance
>>>
>>> np.random.seed(42)
>>> X = np.random.rand(50, 150)
>>> w = np.random.rand(150, 1)
>>> var = LatentVariableVariance(X)
>>> var.step(w)
2.1979627581251385e-05
>>> _, S, _ = np.linalg.svd(np.dot(X.T, X))
>>> round(1.0 / (np.max(S) * 49 / 2.0), 15)
2.1979627581e-05

Overrides: properties.StepSize.step
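
A minimal descent sketch (editorial, not from the source docstring),
illustrating how step() and grad() can be combined in a plain gradient-descent
loop on this function; since f is the negated variance, decreasing f increases
the variance of the latent variable Xw.

>>> import numpy as np
>>> from parsimony.functions.losses import LatentVariableVariance
>>>
>>> np.random.seed(42)
>>> X = np.random.rand(50, 150)
>>> w = np.random.rand(150, 1)
>>> var = LatentVariableVariance(X)
>>> f_start = var.f(w)
>>> for _ in range(10):
...     w = w - var.step(w) * var.grad(w)  # Descent update with the suggested step size.
>>> var.f(w) < f_start  # The function value decreased.
True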