Class ElasticNetLogisticRegression
----------------------------------

object --+
         |
         BaseEstimator --+
                         |
                         LogisticRegressionEstimator --+
                                                       |
                                                       ElasticNetLogisticRegression
Logistic regression (re-weighted log-likelihood, a.k.a. cross-entropy) with
L1 and L2 penalties:

    f(beta) = -loglik / n_samples
              + alpha * l * ||beta||_1
              + alpha * ((1.0 - l) / 2) * ||beta||²_2,

where

    loglik = Sum wi * (yi * log(pi) + (1 - yi) * log(1 - pi)),
    pi = p(y = 1 | xi, beta) = 1 / (1 + exp(-xi' * beta)),
    wi = weight of sample i,

||.||_1 is the L1-norm and ||.||²_2 is the squared L2-norm.

Parameters
----------
l : Non-negative float. The elastic net mixing parameter: l weights the L1
        penalty and (1.0 - l) / 2 weights the squared L2 penalty in the
        function above.

algorithm : ExplicitAlgorithm. The algorithm that should be applied.
        Should be one of:
            1. GradientDescent(...)

        Default is GradientDescent(...).

algorithm_params : A dict. The dictionary algorithm_params contains
        parameters that should be set in the algorithm. Passing
        algorithm=MyAlgorithm(**params) is equivalent to passing
        algorithm=MyAlgorithm() and algorithm_params=params. Default is
        an empty dictionary.

class_weight : Dict, 'auto' or None. If 'auto', class weights will be
        given inversely proportional to the frequency of the class in the
        data. If a dictionary is given, keys are classes and values are
        corresponding class weights. If None is given, the class weights
        will be uniform.

penalty_start : Non-negative integer. The number of columns, variables
        etc., to be exempt from penalisation. Equivalently, the first
        index to be penalised. Default is 0, all columns are included.

mean : Boolean. Whether to compute the mean loss or not. Default is True,
        the mean loss.

Examples
--------
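A minimal usage sketch, assuming the parsimony package layout
(parsimony.estimators and parsimony.algorithms.gradient), a max_iter entry in
algorithm_params and a predict method on the fitted estimator; the data are
synthetic:

>>> import numpy as np
>>> import parsimony.estimators as estimators          # assumed module path
>>> import parsimony.algorithms.gradient as gradient   # assumed module path
>>> np.random.seed(42)
>>> n, p = 100, 16
>>> X = np.random.randn(n, p)
>>> beta_star = np.random.randn(p, 1)
>>> y = (1.0 / (1.0 + np.exp(-np.dot(X, beta_star))) > 0.5).astype(float)
>>> enlr = estimators.ElasticNetLogisticRegression(
...     l=0.1,                                 # elastic net mixing parameter
...     algorithm=gradient.GradientDescent(),  # the default algorithm
...     algorithm_params=dict(max_iter=1000),  # max_iter is an assumed key
...     penalty_start=0,
...     mean=True)
>>> _ = enlr.fit(X, y)                         # fit the estimator to the data
>>> error = np.mean(enlr.predict(X) != y)      # predict is an assumption here

For l = 0.1, the L1 norm is weighted by 0.1 * alpha and the squared L2 norm by
0.45 * alpha in the function above.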
Class Variables
---------------
__abstractmethods__ =
Method Details
--------------
x.__init__(...) initializes x; see help(type(x)) for signature

Return a dictionary containing all the estimator's parameters.

Fit the estimator to the data.
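Fitting the estimator minimises the penalised cross-entropy f(beta) given at
the top of this page. A small NumPy sketch of that objective, assuming uniform
sample weights wi = 1, the mean loss (division by n_samples) and
penalty_start = 0; alpha and l are taken as plain keyword arguments here:

>>> import numpy as np
>>> def elastic_net_logistic_loss(X, y, beta, alpha=1.0, l=0.1, w=None):
...     # f(beta) = -loglik / n_samples + alpha * l * ||beta||_1
...     #           + alpha * ((1.0 - l) / 2) * ||beta||^2_2
...     n = X.shape[0]
...     w = np.ones((n, 1)) if w is None else w         # sample weights wi
...     p = 1.0 / (1.0 + np.exp(-np.dot(X, beta)))      # pi = p(y=1 | xi, beta)
...     loglik = np.sum(w * (y * np.log(p) + (1.0 - y) * np.log(1.0 - p)))
...     l1 = np.sum(np.abs(beta))                       # ||beta||_1
...     l2_sq = np.sum(beta ** 2)                       # ||beta||^2_2
...     return -loglik / n + alpha * l * l1 + alpha * ((1.0 - l) / 2.0) * l2_sq

The alpha = 1.0 default in this sketch is only illustrative.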
Generated by Epydoc 3.0.1 on Mon Apr 6 23:52:10 2015, http://epydoc.sourceforge.net