mcmodels.regressors.nonnegative_ridge_regression

mcmodels.regressors.nonnegative_ridge_regression(X, y, alpha, sample_weight=None, solver='SLSQP', **solver_kwargs)

Solve the ridge regression problem subject to a nonnegativity constraint on the coefficients.
Solves

\[\underset{x}{\text{argmin}} \; \| Ax - y \|_2^2 + \alpha \| x \|_2^2 \quad \text{s.t.} \quad x \geq 0\]

where \(A\) is the training data \(X\). We can write this as the quadratic programming (QP) problem:

\[\underset{x}{\text{argmin}} \; x^T Q x + c^T x \quad \text{s.t.} \quad x \geq 0\]

where

\[Q = A^T A + \alpha I \quad \text{and} \quad c = -2 A^T y\]

A sketch of this construction appears after the Returns section below.

Parameters:
- X : array, shape = (n_samples, n_features)
Training data.
- y : array, shape = (n_samples,) or (n_samples, n_targets)
Target values.
- alpha : float or array with shape = (n_features,)
Regularization strength; must be a positive float or an array of positive floats. Improves the conditioning of the problem and reduces the variance of the estimates. Larger values specify stronger regularization.
- sample_weight : float or array-like, shape (n_samples,), optional (default = None)
Individual weights for each sample.
- solver : string, optional (default = 'SLSQP')
Solver with which to solve the QP. Must be one of the scipy.optimize.minimize solvers that support bounds ('L-BFGS-B', 'TNC', or 'SLSQP').
- **solver_kwargs
See scipy.optimize.minimize for valid keyword arguments.
Returns:
- coef : array, shape = (n_features,) or (n_features, n_targets)
Weight vector(s).
- res : float
The residual, \(\| Qx - c \|_2\).
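The QP formulation above maps directly onto scipy.optimize.minimize. The following is a minimal sketch of that construction, not the library's actual implementation: the toy data and the qp_objective helper are illustrative assumptions, and SLSQP is used simply because it is the documented default.

import numpy as np
from scipy.optimize import minimize

# Toy data for illustration only; A stands in for the training matrix X.
rng = np.random.default_rng(0)
A = rng.random((30, 4))
y = A @ np.array([1.0, 0.0, 0.5, 2.0]) + 0.01 * rng.standard_normal(30)
alpha = 1.0

# Build the QP from the formulas above.
Q = A.T @ A + alpha * np.eye(A.shape[1])  # Q = A^T A + alpha I
c = -2.0 * A.T @ y                        # c = -2 A^T y

def qp_objective(x, Q, c):
    """Return f(x) = x^T Q x + c^T x and its gradient 2 Q x + c."""
    return x @ Q @ x + c @ x, 2.0 * Q @ x + c

# Any bounds-supporting solver works; the bounds enforce x >= 0.
result = minimize(qp_objective, x0=np.zeros(A.shape[1]), args=(Q, c),
                  jac=True, method='SLSQP',
                  bounds=[(0.0, None)] * A.shape[1])
coef = result.x  # nonnegative ridge coefficients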
Notes
- This is an experimental function.
- If one wishes to perform Lasso or Elastic-Net regression instead, see sklearn.linear_model.lasso_path or sklearn.linear_model.enet_path, and pass the parameters fit_intercept=False, positive=True.
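A brief usage sketch, assuming mcmodels is installed; the toy data are made up, and the options keyword is shown only because the **solver_kwargs parameter documents that extra keyword arguments are forwarded to scipy.optimize.minimize.

import numpy as np
from mcmodels.regressors import nonnegative_ridge_regression

# Toy data for illustration only.
rng = np.random.default_rng(0)
X = rng.random((50, 3))
y = X @ np.array([2.0, 0.0, 1.0])

# Fit with the default SLSQP solver; options is assumed to be
# forwarded to scipy.optimize.minimize via **solver_kwargs.
coef, res = nonnegative_ridge_regression(X, y, alpha=1.0,
                                         options={'maxiter': 200})
print(coef)  # nonnegative weights, roughly recovering [2, 0, 1]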