Package 'lmridge'

Title: Linear Ridge Regression with Ridge Penalty and Ridge Statistics
Description: Linear ridge regression coefficient estimation and testing with different ridge-related measures such as MSE, R-squared, etc. REFERENCES i. Hoerl and Kennard (1970) <doi:10.1080/00401706.1970.10488634>, ii. Halawa and El-Bassiouni (2000) <doi:10.1080/00949650008812006>, iii. Imdadullah, Aslam, and Saima (2017), iv. Marquardt (1970) <doi:10.2307/1267205>.
Authors: Imdad Ullah Muhammad [aut, cre], Aslam Muhammad [aut, ctb]
Maintainer: Imdad Ullah Muhammad <[email protected]>
License: GPL (>= 2.0)
Version: 1.2.2
Built: 2025-02-18 05:25:53 UTC
Source: https://github.com/cran/lmridge

Help Index


Linear Ridge Regression

Description

R package for fitting linear ridge regression models.

Details

This package contains functions for fitting linear ridge regression models, including functions for computation of different ridge-related statistics (such as MSE, Var-Cov matrix, effective degrees of freedom and condition numbers), estimation of the biasing parameter as proposed by different researchers, testing of ridge coefficients, model selection criteria, residuals, predicted values and fitted values. The package also includes functions for plotting ridge coefficients and different ridge statistics for selection of an optimal value of the biasing parameter K.

For a complete list of functions, use library(help="lmridge").

Author(s)

Muhammad Imdad Ullah, Muhammad Aslam


Bias Variance and MSE Trade-off Plot

Description

Trade-off between bias, variance and MSE of the linear ridge regression against scalar or vector values of the biasing parameter K (see Kalivas and Palmer, 2014 <doi:10.1002/cem.2555>).

Usage

bias.plot(x, abline = TRUE, ...)

Arguments

x

An object of class "lmridge".

abline

Logical; if TRUE, horizontal and vertical lines show the minimum value of the ridge MSE at a certain value of the biasing parameter K.

...

Not presently used in this implementation.

Details

The effect of multicollinearity on the coefficient estimates can be identified using different graphical displays. One of them is the plot of bias, variance and MSE. A small addition of bias can lead to a substantial decrease in variance, and hence in MSE. Therefore, a trade-off is made between bias and variance to obtain an acceptable MSE. The bias.plot can be helpful for selecting an optimal value of the biasing parameter K.
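The quantities this plot displays follow the standard ridge decomposition MSE(k) = variance(k) + squared bias(k). Below is a base-R sketch of that computation on simulated data; the design matrix, true coefficients and error variance are illustrative assumptions, not lmridge internals.

```r
## bias-variance-MSE trade-off of the ridge estimator over a grid of k
## (illustrative sketch; sigma2 and beta are assumed known here)
set.seed(1)
n <- 30; p <- 4; sigma2 <- 1
X <- scale(matrix(rnorm(n * p), n, p))      # centered and scaled regressors
beta <- c(2, -1, 0.5, 0)                    # assumed true coefficients
eg <- eigen(crossprod(X))
lam <- eg$values                            # eigenvalues of X'X
alpha <- drop(crossprod(eg$vectors, beta))  # beta in canonical form

k <- seq(0, 0.3, 0.002)
variance <- sapply(k, function(kk) sigma2 * sum(lam / (lam + kk)^2))
bias2    <- sapply(k, function(kk) kk^2 * sum(alpha^2 / (lam + kk)^2))
mse      <- variance + bias2
## at k = 0 the ridge estimator is unbiased; variance shrinks as k grows
```

Plotting `variance`, `bias2` and `mse` against `k` reproduces the trade-off picture that bias.plot draws from a fitted "lmridge" object.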

Value

Nothing returned

Author(s)

Muhammad Imdad Ullah, Muhammad Aslam

References

Imdad, M. U. Addressing Linear Regression Models with Correlated Regressors: Some Package Development in R (Doctoral Thesis, Department of Statistics, Bahauddin Zakariya University, Multan, Pakistan), 2017.

Kalivas, J. H., and Palmer, J. (2014). Characterizing multivariate calibration tradeoffs (bias, variance, selectivity, and sensitivity) to select model tuning parameters. Journal of Chemometrics, 28(5), 347–357. doi:10.1002/cem.2555.

See Also

The ridge model fitting lmridge, ridge CV and GCV plots cv.plot, ridge AIC and BIC plots info.plot, m-scale and isrm plots isrm.plot, ridge and VIF trace plot.lmridge, miscellaneous ridge plots rplots.plot

Examples

mod <- lmridge(y~., as.data.frame(Hald), K = seq(0, 0.3, 0.002))
## to indicate the vertical line (biasing parameter k) and
## horizontal line (minimum ridge MSE value corresponding to the vertical line)
bias.plot(mod)

## without horizontal and vertical lines, set abline = FALSE
bias.plot(mod, abline=FALSE)

Ridge CV and GCV Plot

Description

Plot of ridge CV and GCV against scalar or vector values of the biasing parameter K (see Golub et al., 1979 <doi:10.1080/00401706.1979.10489751>).

Usage

cv.plot(x, abline = TRUE, ...)

Arguments

x

An object of class "lmridge".

abline

Logical; if TRUE, horizontal and vertical lines show the minimum values of the ridge GCV and CV at certain values of the biasing parameter K.

...

Not presently used in this implementation.

Details

The cv.plot function can be used to plot the values of ridge CV and GCV against scalar or vector values of the biasing parameter K. It can be helpful for selecting an optimal value of the ridge biasing parameter K. With the default arguments, horizontal and vertical lines indicate the minimum GCV and CV at certain values of the biasing parameter K.
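The GCV criterion plotted here has the form GCV(k) = RSS_k / [n - (1 + trace(H_k))]^2. The following base-R sketch computes it on simulated data; everything below is illustrative, not lmridge internals.

```r
## GCV over a grid of biasing parameters k (illustrative sketch)
set.seed(1)
n <- 30; p <- 4
X <- scale(matrix(rnorm(n * p), n, p))
y <- drop(X %*% c(2, -1, 0.5, 0) + rnorm(n))
y <- y - mean(y)                                       # centered response

gcv <- sapply(seq(0, 0.2, 0.002), function(k) {
  H <- X %*% solve(crossprod(X) + k * diag(p), t(X))   # ridge hat matrix
  rss <- sum((y - H %*% y)^2)                          # residual sum of squares
  rss / (n - (1 + sum(diag(H))))^2
})
```

The k at which `gcv` is smallest is the candidate that cv.plot highlights with its vertical line.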

Value

Nothing returned

Author(s)

Muhammad Imdad Ullah, Muhammad Aslam

References

Delaney, N. J. and Chatterjee, S. (1986). Use of the Bootstrap and Cross-Validation in Ridge Regression. Journal of Business & Economic Statistics. 4(2), 255–262.

Golub, G., Heath, M. and Wahba, G. (1979). Generalized Cross-Validation as a Method for Choosing a Good Ridge Parameter. Technometrics. 21(2), 215–223. doi:10.2307/1268518.

Imdad, M. U. Addressing Linear Regression Models with Correlated Regressors: Some Package Development in R (Doctoral Thesis, Department of Statistics, Bahauddin Zakariya University, Multan, Pakistan), 2017.

See Also

The ridge model fitting lmridge, bias variance trade-off plot bias.plot, ridge AIC and BIC plots info.plot, m-scale and isrm plots isrm.plot, ridge and VIF trace plot.lmridge, miscellaneous ridge plots rplots.plot

Examples

mod <- lmridge(y~., as.data.frame(Hald), K = seq(0, 0.2, 0.002))
## to indicate the vertical line (biasing parameter k) and
## horizontal lines (minimum respective CV and GCV values corresponding to the vertical line)
cv.plot(mod)

## without horizontal and vertical lines, set abline = FALSE
cv.plot(mod, abline = FALSE)

Portland Cement benchmark of Hald (1952)

Description

Heat evolved during setting of 13 cement mixtures of four basic ingredients. Each ingredient percentage appears to be rounded down to a full integer. The sum of the four mixture percentages varies from a maximum of 99% to a minimum of 95%. If all four regressor X-variables always summed to 100%, the centered X-matrix would then be of rank only 3. Thus, the regression of heat on four X-percentages is ill-conditioned, with an approximate rank deficiency of MCAL = 1.

Usage

data(Hald)

Format

A data frame with 13 observations on the following 5 variables.

X1

p3ca: Integer percentage of 3CaO.Al2O3 in the mixture.

X2

p3cs: Integer percentage of 3CaO.SiO2 in the mixture.

X3

p4caf: Integer percentage of 4CaO.Al2O3.Fe2O3 in the mixture.

X4

p2cs: Integer percentage of 2CaO.SiO2 in the mixture.

y

heat: Heat (cals/gm) evolved in setting, recorded to nearest tenth.

Details

The (lmridge) Hald data are identical to the (MASS) cement data except for variable names.

Source

Woods, H., Steinour, H.H. and Starke, H.R. (1932). Effect of Composition of Portland Cement on Heat Evolved During Hardening. Industrial Engineering and Chemistry 24: 1207–1214.

References

Hald, A. (1952). Statistical Theory with Engineering Applications (page 647). New York: Wiley.


Ridge Regression: Hat Matrix

Description

The hatr function computes the hat matrix (see Hastie and Tibshirani, 1990).

Usage

hatr(x, ...)
## S3 method for class 'lmridge'
hatr(x, ...)

Arguments

x

An object of class "lmridge".

...

Not presently used in this implementation.

Details

The hat matrix is computed for the scalar or vector values of the biasing parameter provided as argument to lmridge. It is used to compute the ridge degrees of freedom for a given K, the error degrees of freedom, etc. The hat matrix can be computed using the formula X(X'X+kI)^{-1}X'; its trace equals \sum_j \frac{\lambda_j}{\lambda_j+k}.
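As a check of the trace identity above, the following base-R sketch computes H_k = X(X'X + kI)^{-1}X' on illustrative simulated data and compares its trace with the eigenvalue sum; this is a sketch of the formula, not lmridge internals.

```r
## ridge hat matrix and its trace (= ridge degrees of freedom)
set.seed(1)
X <- scale(matrix(rnorm(30 * 4), 30, 4))
k <- 0.1
H <- X %*% solve(crossprod(X) + k * diag(4), t(X))       # X(X'X + kI)^{-1}X'
lam <- eigen(crossprod(X), only.values = TRUE)$values    # eigenvalues of X'X
df_trace <- sum(diag(H))                                 # trace of H
df_eigen <- sum(lam / (lam + k))                         # sum of lambda_j/(lambda_j + k)
```

The two quantities agree, which is why the eigenvalue form is a cheap way to get the ridge df without forming the full n-by-n matrix.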

Value

Returns a list of matrices, one for each biasing parameter K:

hatr

A list of hat matrices, one for each biasing parameter K.


Author(s)

Muhammad Imdad Ullah, Muhammad Aslam

References

Cule, E. and De Iorio, M. (2012). A Semi-Automatic Method to Guide the Choice of Ridge Parameter in Ridge Regression. arXiv:1205.0686v1 [stat.AP].

Hastie, T. and Tibshirani, R. (1990). Generalized Additive Models. Chapman and Hall.

Imdad, M. U. Addressing Linear Regression Models with Correlated Regressors: Some Package Development in R (Doctoral Thesis, Department of Statistics, Bahauddin Zakariya University, Multan, Pakistan), 2017.

See Also

The ridge model fitting lmridge, ridge Var-Cov matrix vcov.lmridge

Examples

mod <- lmridge(y~., as.data.frame(Hald), K = c(0, 0.1, 0.2, 0.3))
## Hat matrix for each biasing parameter
hatr(mod)

## Hat matrix for the second biasing parameter, i.e., K = 0.1
hatr(mod)[[2]]

Model Selection Criteria Plots

Description

Plot of ridge AIC and BIC model selection criteria against ridge degrees of freedom (see Akaike, 1974 <doi:10.1109/TAC.1974.1100705>; Imdad, 2017 and Schwarz, 1978 <doi:10.1214/aos/1176344136>).

Usage

info.plot(x, abline = TRUE, ...)

Arguments

x

An object of class "lmridge".

abline

Logical; if TRUE, a vertical line shows the minimum value of the ridge MSE at a certain value of the ridge degrees of freedom.

...

Not presently used in this implementation.

Details

Plot of ridge AIC and BIC against the ridge degrees of freedom \sum_{j=1}^p \frac{\lambda_j}{\lambda_j+k}. A vertical line marks the minimum ridge MSE at a certain value of the ridge df.

Value

Nothing returned

Author(s)

Muhammad Imdad Ullah, Muhammad Aslam

References

Akaike, H. (1974). A new look at the Statistical Model Identification. IEEE Transaction on Automatic Control, 9(6), 716–723. doi:10.1109/TAC.1974.1100705.

Imdad, M. U. Addressing Linear Regression Models with Correlated Regressors: Some Package Development in R (Doctoral Thesis, Department of Statistics, Bahauddin Zakariya University, Multan, Pakistan), 2017.

Schwarz, G. (1978). Estimating the Dimension of a Model. Annals of Statistics, 6(2), 461–464. doi:10.1214/aos/1176344136.

See Also

The ridge model fitting lmridge, ridge CV and GCV plots cv.plot, variance bias trade-off plot bias.plot, m-scale and isrm plots isrm.plot, ridge and VIF trace plot.lmridge, miscellaneous ridge plots rplots.plot

Examples

mod <- lmridge(y~., as.data.frame(Hald), K = seq(0, 0.15, 0.002))
## to indicate the vertical line (ridge df)
info.plot(mod)

## without vertical line, set abline = FALSE
info.plot(mod, abline = FALSE)

Model Selection Criteria for Ridge Regression

Description

The infocr.lmridge function computes the model selection criteria AIC and BIC; see Akaike, 1974 <doi:10.1109/TAC.1974.1100705>; Imdad, 2017; and Schwarz, 1978 <doi:10.1214/aos/1176344136>.

Usage

infocr(object, ...)
## S3 method for class 'lmridge'
infocr(object, ...)

Arguments

object

An object of class "lmridge".

...

Not presently used in this implementation.

Details

Model selection criteria are a common way of choosing among models while balancing the competing goals of fit and parsimony. The criteria AIC and BIC are computed by quantifying the df of the ridge regression model, using the formula df=trace[X(X'X+kI)^{-1}X']. They can be helpful for selecting an optimal value of the biasing parameter K.
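As a hedged base-R sketch of how such criteria can be computed from the ridge degrees of freedom: the AIC and BIC forms below, n log(RSS/n) + 2 df and n log(RSS/n) + df log(n), are common textbook definitions and may differ from the exact expressions infocr uses internally. Data are simulated for illustration.

```r
## AIC/BIC-style criteria from the ridge df (illustrative sketch)
set.seed(1)
n <- 30; p <- 4
X <- scale(matrix(rnorm(n * p), n, p))
y <- drop(X %*% c(2, -1, 0.5, 0) + rnorm(n)); y <- y - mean(y)

crit <- t(sapply(c(0.01, 0.05, 0.1), function(k) {
  H <- X %*% solve(crossprod(X) + k * diag(p), t(X))   # ridge hat matrix
  df  <- sum(diag(H))                                  # effective df = trace(H)
  rss <- sum((y - H %*% y)^2)
  c(AIC = n * log(rss / n) + 2 * df,
    BIC = n * log(rss / n) + df * log(n))
}))
## one row per k, columns AIC and BIC, mirroring the matrix infocr returns
```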

Value

It returns a matrix of the information criteria AIC and BIC for each biasing parameter K. Columns of the matrix indicate the model selection criteria AIC and BIC, respectively, while rows indicate the values of the biasing parameter K for which the criteria are computed.

Author(s)

Muhammad Imdad Ullah, Muhammad Aslam

References

Akaike, H. (1974). A new look at the Statistical Model Identification. IEEE Transaction on Automatic Control, 9(6), 716-723. doi:10.1109/TAC.1974.1100705.

Imdad, M. U. Addressing Linear Regression Models with Correlated Regressors: Some Package Development in R (Doctoral Thesis, Department of Statistics, Bahauddin Zakariya University, Multan, Pakistan), 2017.

Schwarz, G. (1978). Estimating the Dimension of a Model. Annals of Statistics, 6(2), 461–464. doi:10.1214/aos/1176344136.

See Also

the ridge model fitting lmridge, ridge AIC and BIC plot info.plot

Examples

mod <- lmridge(y~., as.data.frame(Hald), K = seq(0, .2, 0.001))
infocr(mod)

## Vector of AIC values
infocr(mod)[,1]

## vector of BIC values
infocr(mod)[,2]

ISRM and m-scale Plot

Description

Plot of m-scale and ISRM against scalar or vector values of the biasing parameter K (Vinod, 1976 <doi:10.1080/01621459.1976.10480955>).

Usage

isrm.plot(x, ...)

Arguments

x

An object of class "lmridge".

...

Not presently used in this implementation.

Details

The isrm.plot function can be used to plot the values of m-scale and ISRM against the scalar or vector values of the biasing parameter K passed as argument to lmridge. It can be helpful for the optimal selection of the biasing parameter K.

Value

Nothing returned

Author(s)

Muhammad Imdad Ullah, Muhammad Aslam

References

Imdad, M. U. Addressing Linear Regression Models with Correlated Regressors: Some Package Development in R (Doctoral Thesis, Department of Statistics, Bahauddin Zakariya University, Multan, Pakistan), 2017.

Vinod, H. (1976). Application of New Ridge Regression Methods to a Study of Bell System Scale Economies. Journal of the American Statistical Association, 71, 835–841. doi:10.2307/2286847.

See Also

The ridge model fitting lmridge, ridge CV and GCV plots cv.plot, ridge AIC and BIC plots info.plot, variance bias trade-off plot bias.plot, ridge and VIF trace plot.lmridge, miscellaneous ridge plots rplots.plot

Examples

mod <- lmridge(y~., as.data.frame(Hald), K = seq(0, 0.2, 0.002))

isrm.plot(mod)
isrm.plot(mod, abline=FALSE)

Computation of Ridge Biasing Parameter K

Description

The kest function computes the different biasing parameters available in the literature, proposed by various researchers.

Usage

kest(object, ...)
## S3 method for class 'lmridge'
kest(object, ...)
## S3 method for class 'klmridge'
print(x, ...)

Arguments

object

An object of class "lmridge" for the kest function.

x

An object of class "klmridge" for the print.klmridge method.

...

Not presently used in this implementation.

Details

The kest function computes different biasing parameters for ordinary linear ridge regression. All these methods are available in the literature, proposed by various authors; see the References section.

Value

The function returns a list of the following biasing parameter estimates proposed by various researchers:

mHKB

By Thisted (1976), \frac{(p-2)\hat{\sigma}^2}{\sum\hat{\beta}_j^2}

LW

As in lm.ridge of MASS, \frac{(p-2)\hat{\sigma}^2 n}{\sum\hat{y}^2}

LW76

By Lawless and Wang (1976), \frac{p\hat{\sigma}^2}{\sum\lambda_j\hat{\alpha}_j^2}

CV

Value of Cross Validation (CV) for each biasing parameter K, CV_k=\frac{1}{n}\sum_{i=1}^n (y_i-x_i'\hat{\beta}_K)^2.

kCV

Value of the biasing parameter K at which the CV is smallest.

HKB

By Hoerl and Kennard (1970), \frac{p\hat{\sigma}^2}{\hat{\beta}'\hat{\beta}}

kibAM

By Kibria (2003), \frac{1}{p}\sum\frac{\hat{\sigma}^2}{\hat{\beta}_j^2}

GCV

Value of Generalized Cross Validation (GCV) for each biasing parameter K, \frac{\sum(y_i-x_i'\hat{\beta}_K)^2}{[n-(1+trace(H_{R,k}))]^2}.

kGCV

Value of the biasing parameter K at which the GCV is smallest.

DSK

By Dwivedi and Srivastava (1978), \frac{\hat{\sigma}^2}{\hat{\beta}'\hat{\beta}}

kibGM

By Kibria (2003), \frac{\hat{\sigma}^2}{(\prod\hat{\alpha}_j^2)^{1/p}}

kibMEd

By Kibria (2003), median(\frac{\hat{\sigma}^2}{\hat{\alpha}_j^2})

KM2

By Muniz and Kibria (2009), max[\frac{1}{\sqrt{\frac{\hat{\sigma}^2}{\hat{\alpha}_j^2}}}]

KM3

By Muniz and Kibria (2009), max[\sqrt{\frac{\hat{\sigma}^2}{\hat{\alpha}_j^2}}]

KM4

By Muniz and Kibria (2009), [\prod\frac{1}{\sqrt{\frac{\hat{\sigma}^2}{\hat{\alpha}_j^2}}}]^{\frac{1}{p}}

KM5

By Muniz and Kibria (2009), [\prod\sqrt{\frac{\hat{\sigma}^2}{\hat{\alpha}_j^2}}]^{\frac{1}{p}}

KM6

By Muniz and Kibria (2009), Median[\frac{1}{\sqrt{\frac{\hat{\sigma}^2}{\hat{\alpha}_j^2}}}]

KM8

By Muniz et al. (2012), max[\frac{1}{\sqrt{\frac{\lambda_{max}\hat{\sigma}^2}{(n-p)\hat{\sigma}^2+\lambda_{max}\hat{\alpha}_j^2}}}]

KM9

By Muniz et al. (2012), max[\sqrt{\frac{\lambda_{max}\hat{\sigma}^2}{(n-p)\hat{\sigma}^2+\lambda_{max}\hat{\alpha}_j^2}}]

KM10

By Muniz et al. (2012), [\prod\frac{1}{\sqrt{\frac{\lambda_{max}\hat{\sigma}^2}{(n-p)\hat{\sigma}^2+\lambda_{max}\hat{\alpha}_j^2}}}]^{\frac{1}{p}}

KM11

By Muniz et al. (2012), [\prod\sqrt{\frac{\lambda_{max}\hat{\sigma}^2}{(n-p)\hat{\sigma}^2+\lambda_{max}\hat{\alpha}_j^2}}]^{\frac{1}{p}}

KM12

By Muniz et al. (2012), Median[\frac{1}{\sqrt{\frac{\lambda_{max}\hat{\sigma}^2}{(n-p)\hat{\sigma}^2+\lambda_{max}\hat{\alpha}_j^2}}}]

KD

By Dorugade and Kashid (2010), max[0, \frac{p\hat{\sigma}^2}{\hat{\alpha}'\hat{\alpha}}-\frac{1}{n(VIF_j)_{max}}]

KAD4

By Dorugade (2014), HM[\frac{2p}{\lambda_{max}}\sum(\frac{\hat{\sigma}^2}{\hat{\alpha}_j^2})]

alphahat

The OLS estimator in canonical form, i.e., \hat{\alpha}=(P'X'XP)^{-1}X^{*'}y, where X^*=XP and P is the matrix of eigenvectors of X'X.
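As an illustration of one of the simpler estimators listed above, the following base-R sketch computes the HKB value \frac{p\hat{\sigma}^2}{\hat{\beta}'\hat{\beta}} from an OLS fit on simulated data; it sketches the formula itself, not kest's internals.

```r
## HKB biasing parameter from an OLS fit (illustrative sketch)
set.seed(1)
n <- 30; p <- 4
X <- scale(matrix(rnorm(n * p), n, p))
y <- drop(X %*% c(2, -1, 0.5, 0) + rnorm(n)); y <- y - mean(y)

beta_hat   <- solve(crossprod(X), crossprod(X, y))        # OLS on centered data
sigma2_hat <- sum((y - X %*% beta_hat)^2) / (n - p)       # residual variance
k_HKB      <- p * sigma2_hat / drop(crossprod(beta_hat))  # p*sigma2 / (beta'beta)
```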

Author(s)

Muhammad Imdad Ullah, Muhammad Aslam

References

Dorugade, A. and Kashid, D. (2010). Alternative Method for Choosing Ridge Parameter for Regression. Applied Mathematical Sciences, 4(9), 447-456.

Dorugade, A. (2014). New Ridge Parameters for Ridge Regression. Journal of the Association of Arab Universities for Basic and Applied Sciences, 15, 94-99. doi:10.1016/j.jaubas.2013.03.005.

Hoerl, A. E., Kennard, R. W., and Baldwin, K. F. (1975). Ridge Regression: Some Simulation. Communication in Statistics, 4, 105-123. doi:10.1080/03610927508827232.

Hoerl, A. E. and Kennard, R. W., (1970). Ridge Regression: Biased Estimation of Nonorthogonal Problems. Technometrics, 12, 55-67. doi:10.1080/00401706.1970.10488634.

Imdad, M. U. Addressing Linear Regression Models with Correlated Regressors: Some Package Development in R (Doctoral Thesis, Department of Statistics, Bahauddin Zakariya University, Multan, Pakistan), 2017.

Kibria, B. (2003). Performance of Some New Ridge Regression Estimators. Communications in Statistics-Simulation and Computation, 32(2), 419-435. doi:10.1081/SAC-120017499.

Lawless, J., and Wang, P. (1976). A Simulation Study of Ridge and Other Regression Estimators. Communications in Statistics-Theory and Methods, 5(4), 307-323. doi:10.1080/03610927608827353.

Muniz, G., and Kibria, B. (2009). On Some Ridge Regression Estimators: An Empirical Comparisons. Communications in Statistics-Simulation and Computation, 38(3), 621-630. doi:10.1080/03610910802592838.

Muniz, G., Kibria, B., Mansson, K., and Shukur, G. (2012). On developing Ridge Regression Parameters: A Graphical Investigation. SORT-Statistics and Operations Research Transactions, 36(2), 115–138.

Thisted, R. A. (1976). Ridge Regression, Minimax Estimation and Empirical Bayes Methods. Technical Report 28, Division of Biostatistics, Stanford University, California.

Venables, W. N. and Ripley, B. D. (2002). Modern Applied Statistics with S. Springer New York, 4th edition, ISBN 0-387-95457-0.

See Also

The ridge model fitting lmridge, Ridge Var-Cov matrix vcov

Examples

mod <- lmridge(y~., as.data.frame(Hald), K = seq(0, 0.2, 0.001))
kest(mod)

## GCV values
kest(mod)$GCV

## minimum GCV value at certain k
kest(mod)$kGCV

## CV Values
kest(mod)$CV

## minimum CV value at certain k
kest(mod)$kCV

## Hoerl and Kennard (1970)
kest(mod)$HKB

Linear Ridge Regression

Description

Fits a linear ridge regression model after scaling the regressors and returns an object of class "lmridge" (by calling the lmridgeEst function), designed to be used in plotting methods, testing of ridge coefficients and computation of different ridge-related statistics. The ridge biasing parameter K can be a scalar or a vector. See Hoerl et al., 1975 <doi:10.1080/03610927508827232>; Hoerl and Kennard, 1970 <doi:10.1080/00401706.1970.10488634>.

Usage

lmridge(formula, data, K = 0, scaling=c("sc", "scaled", "non", "centered"), ...)
lmridgeEst(formula, data, K=0, scaling=c("sc", "scaled", "non", "centered"), ...)
## Default S3 method:
lmridge(formula, data, K = 0, scaling=c("sc", "scaled", "non", "centered"), ...)
## S3 method for class 'lmridge'
coef(object, ...)
## S3 method for class 'lmridge'
print(x, digits = max(5,getOption("digits") - 5), ...)
## S3 method for class 'lmridge'
fitted(object, ...)

Arguments

formula

Standard R formula expression, that is, a symbolic representation of the model to be fitted, of the form response ~ predictors. For further details, see formula.

data

An optional data frame containing the variables in the model. If not found in data, the variables are taken from environment(formula), typically the environment from which lmridge or lmridgeEst is called.

K

Ridge biasing parameter (may be a vector).

scaling

The method used to scale the predictors. Option "sc" scales the predictors to correlation form, such that the correlation matrix has unit diagonal elements; "scaled" standardizes the predictors to have zero mean and unit variance; "non" applies no scaling or centering; "centered" only centers the predictors.

object

An "lmridge" object, typically generated by a call to lmridge, for the fitted.lmridge, predict.lmridge, vcov.lmridge, residuals.lmridge, infocr.lmridge, coef.lmridge, summary.lmridge and press.lmridge functions.

x

An object of class lmridge (for the hatr.lmridge, rstats1.lmridge, rstats2.lmridge, vif.lmridge, kest.lmridge, summary.lmridge, print.lmridge, print.summary.lmridge, print.klmridge, print.rstats1, print.rstats2, plot.lmridge, bias.plot, cv.plot, info.plot, isrm.plot, and rplots.plot functions).

digits

Minimum number of significant digits to be used.

...

Additional arguments to be passed to or from other methods.

Details

The lmridge or lmridgeEst function fits a linear ridge regression after scaling the regressors and centering the response. lmridge is the default function that calls lmridgeEst for computation of the ridge coefficients and returns an object of class "lmridge", designed to be used in plotting methods, testing of ridge coefficients and computation of different ridge-related statistics. If an intercept is present in the model, its coefficient is not penalized; instead, it is estimated from the relation \hat{\beta}_0=\overline{y}-\hat{\beta}'\overline{X}. print.lmridge tries to be smart about the formatting of the ridge coefficients.
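A base-R sketch of the estimation and intercept recovery just described: the ridge coefficients solve (X'X + kI)b = X'y on centered data, and the intercept is recovered as mean(y) minus the inner product of the slopes with the predictor means. At k = 0 this reproduces ordinary least squares, which the comparison with lm() illustrates. Data are simulated for illustration; this is not lmridgeEst's actual code.

```r
## ridge solve on centered data and intercept recovery (illustrative sketch)
set.seed(1)
n <- 30; p <- 4
X <- matrix(rnorm(n * p), n, p)
y <- drop(X %*% c(2, -1, 0.5, 0) + rnorm(n))

Xc <- scale(X, center = TRUE, scale = FALSE)               # centered predictors
k  <- 0                                                    # k = 0 gives OLS
b  <- drop(solve(crossprod(Xc) + k * diag(p),
                 crossprod(Xc, y - mean(y))))              # penalized slopes
b0 <- mean(y) - sum(b * colMeans(X))                       # unpenalized intercept

fit <- lm(y ~ X)                                           # reference OLS fit
```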

Value

The lmridge function returns an object of class "lmridge", a list of named components computed by the lmridgeEst function:

coef

A named vector of fitted coefficients.

call

The matched call.

Inter

Was an intercept included?

scaling

The scaling method used.

mf

Actual data used.

y

The response variable.

xs

The scaled matrix of predictors.

xm

The vector of means of the predictors.

terms

The terms object used.

xscale

Square root of the sum of squared deviations from the mean, according to the scaling option passed to lmridge or lmridgeEst.

rfit

The fitted values of the ridge regression for the given biasing parameter K.

K

The ridge regression biasing parameter K, which can be a scalar or a vector.

d

A vector of singular values of scaled X matrix.

div

Eigenvalues of scaled regressors.

Z

A list of matrices (X'X+KI)^{-1}X' for further computations.

Note

The function in its current form cannot handle missing values. The user has to deal with missing values before using this function.

Author(s)

Muhammad Imdad Ullah, Muhammad Aslam

References

Hoerl, A. E., Kennard, R. W., and Baldwin, K. F. (1975). Ridge Regression: Some Simulation. Communication in Statistics, 4, 105-123. doi:10.1080/03610927508827232.

Hoerl, A. E. and Kennard, R. W., (1970). Ridge Regression: Biased Estimation of Nonorthogonal Problems. Technometrics, 12, 55-67. doi:10.2307/1267351.

Imdad, M. U. Addressing Linear Regression Models with Correlated Regressors: Some Package Development in R (Doctoral Thesis, Department of Statistics, Bahauddin Zakariya University, Multan, Pakistan), 2017.

See Also

Testing of ridge coefficient summary.lmridge

Examples

data(Hald)
mod <- lmridge(y~., data = as.data.frame(Hald), K = seq(0, 0.1, 0.01), scaling = "sc")
## Scaled Coefficients
mod$coef

## Re-Scaled Coefficients
coef(mod)

## ridge predicted values
predict(mod)

## ridge residuals
residuals(mod)

##ridge and VIF trace
plot(mod)

## ridge VIF values
vif(mod)

## ridge Var-Cov matrix
vcov(mod)

## ridge biasing parameter by researchers
kest(mod)

## ridge fitted values
fitted(mod)

## ridge statistics 1
rstats1(mod)

## ridge statistics 2
rstats2(mod)

## list of objects from lmridgeEst function
lmridgeEst(y~., data = as.data.frame(Hald), K = seq(0, 0.1, 0.01), scaling = "sc")

lmridgeEst(y~., data = as.data.frame(Hald), K = seq(0, 0.1, 0.01), scaling = "non")

VIF and Ridge Trace Plot

Description

Plot of VIF values (VIF trace) and ridge coefficients (ridge trace) for scalar or vector values of the biasing parameter K.

Usage

## S3 method for class 'lmridge'
plot(x, type = c("ridge", "vif"), abline = TRUE, ...)

Arguments

x

An object of class "lmridge".

type

Either "ridge" for the ridge trace or "vif" for the VIF trace.

abline

Logical; if TRUE, horizontal and vertical lines show the minimum MSE and GCV values at certain values of the biasing parameter K on the ridge and VIF traces, respectively.

...

Not presently used in this implementation.

Details

A graphical way of selecting an optimal value of the biasing parameter K. On the ridge trace, the biasing parameter is selected where the coefficients become stable. On the VIF trace, K can be selected such that the VIF of each regressor is near one, or as the value of K at which the GCV is minimum. If the default arguments are used, all traces of the ridge coefficients are displayed. A vertical and a horizontal line are also displayed on the ridge trace graph to indicate the minimum ridge MSE (among all ridge MSEs computed for the provided vector of K) along with the value of the respective biasing parameter K. On the VIF trace, a vertical line marks the minimum GCV at a certain value of the biasing parameter K.

Value

Nothing returned

Author(s)

Muhammad Imdad Ullah, Muhammad Aslam

References

Hoerl, A. E., Kennard, R. W., and Baldwin, K. F. (1975). Ridge Regression: Some Simulation. Communication in Statistics, 4, 105-123. doi:10.1080/03610927508827232.

Hoerl, A. E. and Kennard, R. W., (1970). Ridge Regression: Biased Estimation of Nonorthogonal Problems. Technometrics, 12, 55-67. doi:10.1080/00401706.1970.10488634.

Imdad, M. U. Addressing Linear Regression Models with Correlated Regressors: Some Package Development in R (Doctoral Thesis, Department of Statistics, Bahauddin Zakariya University, Multan, Pakistan), 2017.

See Also

The ridge model fitting lmridge, ridge CV and GCV plots cv.plot, variance bias trade-off plot bias.plot, m-scale and isrm plots isrm.plot, ridge AIC and BIC plots info.plot, miscellaneous ridge plots rplots.plot

Examples

mod <- lmridge(y~., as.data.frame(Hald), K = seq(0, 0.15, 0.002))
## Ridge trace
plot(mod)
plot(mod, type = "ridge")

## VIF trace
plot(mod, type = "vif")
## Ridge trace without abline
plot(mod, type = "ridge", abline = FALSE)

Predict method for Linear Ridge Model Fits

Description

Predicted values based on the linear ridge regression model for scalar or vector values of the biasing parameter K.

Usage

## S3 method for class 'lmridge'
predict(object, newdata, na.action=na.pass, ...)

Arguments

object

An object of class "lmridge".

newdata

An optional data frame in which to look for variables with which to predict.

na.action

A function determining what should be done with missing values in newdata. The default is to predict NA.

...

Not presently used in this implementation.

Details

The predict.lmridge function produces predicted values, obtained by evaluating the regression function in the frame newdata, which defaults to model.frame(object). If newdata is omitted, the predictions are based on the data used for the fit. In that case, how cases with missing values in the original fit are handled is determined by the na.action argument of that fit. If na.action = na.omit, omitted cases will not appear in the predictions, whereas if na.action = na.exclude they will appear (in predictions), with value NA.

Value

predict.lmridge produces a vector of predictions or a matrix of predictions for scalar or vector values of biasing parameter.

Note

Variables are first looked for in newdata and then are searched for in the usual way (which will include the environment of the formula used in the fit). A warning will be given if the variables found are not of the same length as those in the newdata if it was supplied.

Author(s)

Muhammad Imdad Ullah, Muhammad Aslam

References

Cule, E. and De Iorio, M. (2012). A Semi-Automatic Method to Guide the Choice of Ridge Parameter in Ridge Regression. arXiv:1205.0686v1 [stat.AP].

Hoerl, A. E., Kennard, R. W., and Baldwin, K. F. (1975). Ridge Regression: Some Simulation. Communication in Statistics, 4, 105-123. doi:10.1080/03610927508827232.

Hoerl, A. E. and Kennard, R. W., (1970). Ridge Regression: Biased Estimation of Nonorthogonal Problems. Technometrics, 12, 55-67. doi:10.1080/00401706.1970.10488634.

Imdad, M. U. Addressing Linear Regression Models with Correlated Regressors: Some Package Development in R (Doctoral Thesis, Department of Statistics, Bahauddin Zakariya University, Multan, Pakistan), 2017.

See Also

The ridge model fitting lmridge, ridge residuals residuals, ridge PRESS press.lmridge

Examples

mod <- lmridge(y~., as.data.frame(Hald), K = seq(0, 0.2, 0.05))
predict(mod)
predict(mod, newdata = as.data.frame(Hald[1:5, -1]))

Predicted Residual Sum of Squares

Description

The press.lmridge function computes predicted residual sum of squares (PRESS) (see Allen, 1971).

Usage

press(object, ...)
## S3 method for class 'lmridge'
press(object, ...)

Arguments

object

An object of class "lmridge".

...

Not presently used in this implementation.

Details

All n leave-one-out predicted residuals are calculated from a single fit of the full regression model, and PRESS is their sum of squares: \sum(\frac{\hat{e}_{i,k}}{1-\frac{1}{n}-H_{ii_{R,k}}})^2, where H_{ii_{R,k}} is the ith diagonal element of the hat matrix from the ridge model fit and \hat{e}_{i,k} is the ith residual at a specific value of K.
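A base-R sketch of the PRESS expression above, with the hat diagonals taken from the centered ridge fit; at k = 0 the formula agrees with explicit leave-one-out OLS refitting, which the loop below performs for comparison. Data are simulated for illustration; this is not press.lmridge's actual code.

```r
## PRESS via the closed form vs explicit leave-one-out refits (k = 0)
set.seed(1)
n <- 20; p <- 3
X <- matrix(rnorm(n * p), n, p)
y <- drop(X %*% c(1, -2, 0.5) + rnorm(n))

Xc <- scale(X, center = TRUE, scale = FALSE)
yc <- y - mean(y)
k  <- 0
H  <- Xc %*% solve(crossprod(Xc) + k * diag(p), t(Xc))   # centered hat matrix
e  <- yc - H %*% yc                                      # ordinary residuals
press <- sum((e / (1 - 1/n - diag(H)))^2)                # closed-form PRESS

## explicit leave-one-out predicted residuals (valid check at k = 0)
loo <- sapply(1:n, function(i) {
  f <- lm(y[-i] ~ X[-i, ])
  y[i] - drop(c(1, X[i, ]) %*% coef(f))
})
```

The 1/n term accounts for the unpenalized intercept, so 1 - 1/n - H_ii is the full-model leverage complement.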

Value

The press.lmridge function produces a scalar or a vector of PRESS values for scalar or vector values of the biasing parameter K.

Author(s)

Muhammad Imdad Ullah, Muhammad Aslam

References

Allen, D. M. (1971). Mean Square Error of Prediction as a Criterion for Selecting Variables. Technometrics, 13, 469-475. doi:10.1080/00401706.1971.10488811.

Allen, D. M. (1974). The Relationship between Variable Selection and Data Augmentation and Method for Prediction. Technometrics, 16, 125-127. doi:10.1080/00401706.1974.10489157.

Hoerl, A. E., Kennard, R. W., and Baldwin, K. F. (1975). Ridge Regression: Some Simulation. Communication in Statistics, 4, 105-123. doi:10.1080/03610927508827232.

Hoerl, A. E. and Kennard, R. W., (1970). Ridge Regression: Biased Estimation of Nonorthogonal Problems. Technometrics, 12, 55-67. doi:10.1080/00401706.1970.10488634.

Imdad, M. U. Addressing Linear Regression Models with Correlated Regressors: Some Package Development in R (Doctoral Thesis, Department of Statistics, Bahauddin Zakariya University, Multan, Pakistan), 2017.

See Also

The ridge model fitting lmridge, ridge residual residuals, ridge predicted value predict

Examples

mod <- lmridge(y~., as.data.frame(Hald), K = seq(0, 0.5, 0.04))
press(mod)

Ridge Regression Residuals

Description

The residuals function computes the ridge residuals for scalar or vector values of the biasing parameter K.

Usage

## S3 method for class 'lmridge'
residuals(object, ...)

Arguments

object

An object of class "lmridge".

...

Not presently used in this implementation.

Details

The generic function residuals can be used to compute the residuals of a linear ridge regression object produced by the lmridge function.

Value

Returns a vector or a matrix of ridge residuals for the scalar or vector values of the biasing parameter K provided as argument to the lmridge function.

Author(s)

Muhammad Imdad Ullah, Muhammad Aslam

References

Berk, R. (2008). Statistical Learning from a Regression Perspective. Springer.

Hoerl, A. E., Kennard, R. W., and Baldwin, K. F. (1975). Ridge Regression: Some Simulation. Communication in Statistics, 4, 105-123. doi:10.1080/03610927508827232.

Hoerl, A. E. and Kennard, R. W., (1970). Ridge Regression: Biased Estimation of Nonorthogonal Problems. Technometrics, 12, 55-67. doi:10.1080/00401706.1970.10488634.

Imdad, M. U. Addressing Linear Regression Models with Correlated Regressors: Some Package Development in R (Doctoral Thesis, Department of Statistics, Bahauddin Zakariya University, Multan, Pakistan), 2017.

Lee, W. F. (1979). Model Estimation Using Ridge Regression with the Variance Normalization Criterion. Master thesis, Department of Educational Foundation, Memorial University of Newfoundland.

See Also

The ridge model fitting lmridge, ridge prediction predict, ridge PRESS values press

Examples

mod <- lmridge(y~., as.data.frame(Hald), K = seq(0, 1, 0.2))
residuals(mod)

Miscellaneous Ridge Plots

Description

Panel of three ridge related plots, df trace vs K, RSS vs K, and PRESS vs K, for graphical judgement of the optimal value of K.

Usage

rplots.plot(x, abline = TRUE, ...)

Arguments

x

An object of class "lmridge"

abline

Vertical line showing the minimum value of the ridge PRESS at a certain value of the biasing parameter K on the PRESS vs K plot.

...

Not presently used in this implementation.

Details

The function rplots.plot can be used to plot the values of df vs K, RSS vs K, and PRESS vs K for scalar or vector values of the biasing parameter K. By default, a vertical line is drawn on the ridge PRESS plot to mark the minimum value of PRESS at a certain K. The panel of these three plots can be helpful in selecting the optimal value of the biasing parameter K.

Value

Nothing; the function is called for its side effect of producing the panel of plots.

Author(s)

Muhammad Imdad Ullah, Muhammad Aslam

References

Allen, D. M. (1971). Mean Square Error of Prediction as a Criterion for Selecting Variables. Technometrics, 13, 469-475. doi:10.1080/00401706.1971.10488811.

Allen, D. M. (1974). The Relationship between Variable Selection and Data Augmentation and Method for Prediction. Technometrics, 16, 125-127. doi:10.1080/00401706.1974.10489157.

Berk, R. (2008). Statistical Learning from a Regression Perspective. Springer.

Hoerl, A. E., Kennard, R. W., and Baldwin, K. F. (1975). Ridge Regression: Some Simulation. Communication in Statistics, 4, 105-123. doi:10.1080/03610927508827232.

Hoerl, A. E. and Kennard, R. W., (1970). Ridge Regression: Biased Estimation of Nonorthogonal Problems. Technometrics, 12, 55-67. doi:10.1080/00401706.1970.10488634.

Imdad, M. U. Addressing Linear Regression Models with Correlated Regressors: Some Package Development in R (Doctoral Thesis, Department of Statistics, Bahauddin Zakariya University, Multan, Pakistan), 2017.

See Also

The ridge model fitting lmridge, ridge CV and GCV plots cv.plot, variance bias trade-off plot bias.plot, m-scale and isrm plots isrm.plot, ridge AIC and BIC plots info.plot, ridge and VIF trace plot.lmridge

Examples

mod <- lmridge(y~., as.data.frame(Hald), K = seq(0, 0.2, 0.005))
rplots.plot(mod)
rplots.plot(mod, abline = FALSE)

Ordinary Ridge Regression Statistics 1

Description

The rstats1 function computes ordinary ridge related statistics such as variance, squared bias, MSE, R-squared and condition number (CN), etc. (see Lee, 1979; Kalivas and Palmer, 2014 <doi:10.1002/cem.2555>).

Usage

rstats1(x, ...)
   ## S3 method for class 'lmridge'
rstats1(x, ...)
   ## S3 method for class 'rstats1'
print(x, digits = max(5,getOption("digits") - 5), ...)

Arguments

x

An object of class "lmridge" for the rstats1 method, or of class "rstats1" for the print method.

digits

Minimum number of significant digits to be used for most numbers.

...

Not presently used in this implementation.

Details

The rstats1 function computes ordinary ridge regression related statistics which may help in selecting the optimal value of the biasing parameter K. If the value of K is zero, these statistics are equivalent to the relevant OLS statistics.

Value

Following are the ridge related statistics computed for the given scalar or vector value of the biasing parameter K provided as argument to the lmridge or lmridgeEst function.

var

Variance of ridge regression for given biasing parameter K.

bias2

Squared bias of ridge regression for given biasing parameter K.

mse

Total MSE value for given biasing parameter K.

Fv

F-statistic value for testing the significance of the ordinary ridge regression estimator, computed for given biasing parameter K.

rfact

Shrinkage factor \frac{\lambda_j}{\lambda_j+K} for given biasing parameter K.

R2

R-squared for given biasing parameter K.

adjR2

Adjusted R-squared for given biasing parameter K.

eigval

Eigenvalues of the X'X matrix for K=0.

CN

Condition number after addition of the biasing parameter K to the X'X matrix.
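In the canonical form of Hoerl and Kennard (1970), several of these quantities can be written directly in terms of the eigenvalues \lambda_j of X'X. A hedged Python sketch with made-up eigenvalues and canonical coefficients (the exact scaling used by rstats1 may differ):

```python
# Hedged sketch of canonical-form ridge statistics
# (Hoerl and Kennard, 1970): eigvals are the eigenvalues of X'X,
# alphas the canonical coefficients, sigma2 the error variance.
# Toy numbers; the exact scaling used by rstats1 may differ.
def ridge_stats(eigvals, alphas, sigma2, k):
    var = sigma2 * sum(l / (l + k) ** 2 for l in eigvals)
    bias2 = k ** 2 * sum(a ** 2 / (l + k) ** 2
                         for l, a in zip(eigvals, alphas))
    mse = var + bias2                             # total MSE = variance + bias^2
    shrink = [l / (l + k) for l in eigvals]       # shrinkage factors
    cn = (max(eigvals) + k) / (min(eigvals) + k)  # condition number of X'X + kI
    return var, bias2, mse, shrink, cn

var, bias2, mse, shrink, cn = ridge_stats(
    eigvals=[4.0, 1.0, 0.01], alphas=[1.0, 0.5, 0.2], sigma2=1.0, k=0.1)
```

Note how adding k sharply reduces the condition number when the smallest eigenvalue is near zero, at the cost of a small squared bias.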

Author(s)

Muhammad Imdad Ullah, Muhammad Aslam

References

Cule, E. and De Iorio, M. (2012). A semi-automatic method to guide the choice of ridge parameter in ridge regression. arXiv:1205.0686v1 [stat.AP].

Hoerl, A. E., Kennard, R. W., and Baldwin, K. F. (1975). Ridge Regression: Some Simulation. Communication in Statistics, 4, 105-123. doi:10.1080/03610927508827232.

Hoerl, A. E. and Kennard, R. W., (1970). Ridge Regression: Biased Estimation of Nonorthogonal Problems. Technometrics, 12, 55-67. doi:10.1080/00401706.1970.10488634.

Imdad, M. U. Addressing Linear Regression Models with Correlated Regressors: Some Package Development in R (Doctoral Thesis, Department of Statistics, Bahauddin Zakariya University, Multan, Pakistan), 2017.

Kalivas, J. H., and Palmer, J. (2014). Characterizing multivariate calibration tradeoffs (bias, variance, selectivity, and sensitivity) to select model tuning parameters. Journal of Chemometrics, 28(5), 347–357. doi:10.1002/cem.2555.

See Also

Ridge related statistics rstats2, the ridge model fitting lmridge, ridge var-cov matrix vcov

Examples

data(Hald)
mod <- lmridge(y~., data = as.data.frame(Hald), K = seq(0,0.2, 0.005) )

rstats1(mod)

## Getting only Ridge MSE
rstats1(mod)[3]

rstats1(mod)$mse

Ordinary Ridge Regression Statistics 2

Description

The rstats2 function computes ordinary ridge related statistics such as C_k, \sigma^2, ridge degrees of freedom, effective degrees of freedom (EDF), and the prediction residual error sum of squares (PRESS) statistic for scalar or vector value of the biasing parameter K (see Allen, 1974 <doi:10.2307/1267500>; Lee, 1979; Hoerl and Kennard, 1970 <doi:10.2307/1267351>).

Usage

rstats2(x, ...)
   ## S3 method for class 'lmridge'
rstats2(x, ...)
   ## S3 method for class 'rstats2'
print(x, digits = max(5,getOption("digits") - 5), ...)

Arguments

x

For the rstats2 method, an object of class "lmridge", i.e., a fitted model.

digits

Minimum number of significant digits to be used.

...

Not presently used in this implementation.

Details

The rstats2 function computes different ridge regression related statistics which may help in selecting the optimal value of the biasing parameter K. If the value of K is zero, these statistics are equivalent to the relevant OLS statistics.

Value

Following are the ridge related statistics computed for the given scalar or vector value of the biasing parameter K provided as argument to the lmridge or lmridgeEst function.

CK

C_k statistic, similar to Mallows' C_p, for given biasing parameter K.

dfridge

Degrees of freedom of ridge for given biasing parameter K, i.e., Trace[Hat_{R,k}].

EP

Effective number of parameters for given biasing parameter K, i.e., Trace[2Hat_{R,k}-Hat_{R,k}t(Hat_{R,k})].

redf

Residual effective degrees of freedom for given biasing parameter K from Hastie and Tibshirani (1990), i.e., n-Trace[2Hat_{R,k}-Hat_{R,k}t(Hat_{R,k})].

EF

Effectiveness index for given biasing parameter K, i.e., the ratio of the reduction in total variance to the total squared bias of the ridge estimator, EF=\frac{\sigma^2 trace((X'X)^{-1})-\sigma^2 trace(VIF_R)}{Bias^2(\hat{\beta}_R)}.

ISRM

Quantification of the concept of a stable region proposed by Vinod and Ullah (1981), i.e., ISRM_k=\sum_{j=1}^p \left(\frac{p(\lambda_j/(\lambda_j+k))^2}{\sum_{j=1}^p \lambda_j^2/(\lambda_j+k)^2}-1\right)^2.

m

m-scale for given value of the biasing parameter, proposed by Vinod (1976) as an alternative to plotting the ridge coefficients, i.e., m=p-\sum_{j=1}^p \frac{\lambda_j}{\lambda_j+k}.

PRESS

PRESS statistic for ridge regression introduced by Allen (1971, 1974), i.e., PRESS_k=\sum_{i=1}^n e^2_{i,-i}, for scalar or vector value of the biasing parameter K.
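The ISRM and m-scale formulas above depend only on the eigenvalues of X'X, so they can be illustrated in plain Python (toy eigenvalues; an illustration, not the lmridge code):

```python
# Sketch of the ISRM (Vinod and Ullah, 1981) and m-scale (Vinod, 1976)
# formulas above, written out from the eigenvalues of X'X
# (toy values; an illustration, not the lmridge code).
def isrm_and_m(eigvals, k):
    p = len(eigvals)
    sq = [(l / (l + k)) ** 2 for l in eigvals]  # (lambda_j / (lambda_j + k))^2
    denom = sum(sq)                             # sum_j lambda_j^2 / (lambda_j + k)^2
    isrm = sum((p * s / denom - 1) ** 2 for s in sq)
    m = p - sum(l / (l + k) for l in eigvals)   # m-scale
    return isrm, m

isrm, m = isrm_and_m([4.0, 1.0, 0.01], k=0.1)
```

With distinct eigenvalues, both quantities are exactly zero at k = 0 and positive for k > 0, which is why they can serve as graphical guides for choosing K.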

Author(s)

Muhammad Imdad Ullah, Muhammad Aslam

References

Allen, D. M. (1971). Mean Square Error of Prediction as a Criterion for Selecting Variables. Technometrics, 13, 469-475. doi:10.1080/00401706.1971.10488811.

Allen, D. M. (1974). The Relationship between Variable Selection and Data Augmentation and Method for Prediction. Technometrics, 16, 125-127. doi:10.1080/00401706.1974.10489157.

Cule, E. and De Iorio, M. (2012). A semi-automatic method to guide the choice of ridge parameter in ridge regression. arXiv:1205.0686v1 [stat.AP].

Hastie, T. and Tibshirani, R. (1990). Generalized Additive Models. Chapman & Hall.

Hoerl, A. E., Kennard, R. W., and Baldwin, K. F. (1975). Ridge Regression: Some Simulation. Communication in Statistics, 4, 105-123. doi:10.1080/03610927508827232.

Hoerl, A. E. and Kennard, R. W., (1970). Ridge Regression: Biased Estimation of Nonorthogonal Problems. Technometrics, 12, 55-67. doi:10.1080/00401706.1970.10488634.

Imdad, M. U. Addressing Linear Regression Models with Correlated Regressors: Some Package Development in R (Doctoral Thesis, Department of Statistics, Bahauddin Zakariya University, Multan, Pakistan), 2017.

Kalivas, J. H., and Palmer, J. (2014). Characterizing Multivariate Calibration Tradeoffs (Bias, Variance, Selectivity, and Sensitivity) to Select Model Tuning Parameters. Journal of Chemometrics, 28(5), 347–357. doi:10.1002/cem.2555.

Lee, W. F. (1979). Model Estimation Using Ridge Regression with the Variance Normalization Criterion. Master thesis, Department of Educational Foundation, Memorial University of Newfoundland.

See Also

Ridge related statistics rstats1, ridge model fitting lmridge

Examples

data(Hald)
mod <- lmridge(y~., data=as.data.frame(Hald), K = seq(0,0.2, 0.001) )

rstats2(mod)

Summarizing Linear Ridge Regression Fits

Description

The summary method for class "lmridge" for scalar or vector biasing parameter K (Cule and De Iorio, 2012).

Usage

## S3 method for class 'lmridge'
summary(object, ...)
## S3 method for class 'summary.lmridge'
print(x, digits = max(3, getOption("digits") - 3),
           signif.stars = getOption("show.signif.stars"), ...)

Arguments

object

An "lmridge" object, typically generated by a call to lmridge.

x

An object of class "summary.lmridge" for the print.summary.lmridge method.

signif.stars

logical: if TRUE, p-values are additionally encoded visually as 'significance stars' in order to help scanning of long coefficient tables. It defaults to the show.signif.stars slot of options.

digits

The number of significant digits to use when printing.

...

Not presently used in this implementation.

Details

print.summary.lmridge tries to be smart about formatting the coefficients, standard errors etc. and additionally gives 'significance stars' if signif.stars is TRUE.

Value

The function summary computes and returns a list of summary statistics of the fitted linear ridge regression model for the scalar or vector value of the biasing parameter K given as argument to the lmridge function. All summary information can be accessed from the list element summaries.

coefficients

A p \times 5 matrix with columns for the scaled estimates, descaled estimates, scaled standard errors, scaled t-statistics, and the corresponding two-tailed p-values. The intercept term is computed from the relation \hat{\beta}_{R_{0K}}=\overline{y}-\sum_{j=1}^{p}\overline{X}_j \hat{\beta}_{R_{jK}}. The standard error of the intercept term is computed as SE(\hat{\beta}_{R_{0K}})=\sqrt{Var(\overline{y})+\overline{X}_j^2 diag[Cov(\hat{\beta}_{R_{jK}})]}.

stats

Ridge related statistics: R-squared, adjusted R-squared, F-statistic for testing of coefficients, and AIC and BIC values for given biasing parameter K.

rmse1

Minimum MSE value for given biasing parameter K.

rmse2

Value of K at which MSE is minimum.

K

Value of given biasing parameter.

df1

Numerator degrees of freedom for the p-value of the F-statistic.

df2

Denominator degrees of freedom for the p-value of the F-statistic.

fpvalue

p-value for each F-statistic.
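The intercept and its standard error in the coefficients table follow the two relations given above. A hedged sketch in Python, summing over the regressors (all numbers are made-up illustrative values, not lmridge output):

```python
# Hedged sketch of the intercept relations above: the slope estimates,
# variable means, and covariance diagonal are made-up illustrative
# numbers, not lmridge output.
def intercept_and_se(y_bar, x_bars, betas, var_y_bar, cov_diag):
    # Intercept: mean of y minus the weighted sum of regressor means
    b0 = y_bar - sum(xb * b for xb, b in zip(x_bars, betas))
    # SE: variance of y-bar plus the squared means times the coefficient
    # variances, summed over regressors, then square-rooted
    se0 = (var_y_bar
           + sum(xb ** 2 * v for xb, v in zip(x_bars, cov_diag))) ** 0.5
    return b0, se0

b0, se0 = intercept_and_se(
    y_bar=95.4, x_bars=[7.5, 48.2], betas=[1.5, 0.66],
    var_y_bar=0.8, cov_diag=[0.02, 0.001])
```

This mirrors why the intercept is reported even though it is not penalized: it is reconstructed from the (penalized) slope estimates and the variable means.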

Author(s)

Muhammad Imdad Ullah, Muhammad Aslam

References

Cule, E. and De Iorio, M. (2012). A semi-automatic method to guide the choice of ridge parameter in ridge regression. arXiv:1205.0686v1 [stat.AP].

Hoerl, A. E., Kennard, R. W., and Baldwin, K. F. (1975). Ridge Regression: Some Simulation. Communication in Statistics, 4, 105-123. doi:10.1080/03610927508827232.

Hoerl, A. E. and Kennard, R. W., (1970). Ridge Regression: Biased Estimation of Nonorthogonal Problems. Technometrics, 12, 55-67. doi:10.1080/00401706.1970.10488634.

Imdad, M. U. Addressing Linear Regression Models with Correlated Regressors: Some Package Development in R (Doctoral Thesis, Department of Statistics, Bahauddin Zakariya University, Multan, Pakistan), 2017.

See Also

The ridge model fitting lmridge

Examples

mod <- lmridge(y~., as.data.frame(Hald), K = c(0, 0.0132, 0.1))
summary(mod)

## coefficients for first biasing parameter
summary(mod)$summaries[[1]]$coefficients
summary(mod)$summaries[[1]][[1]]

## ridge related statistics from summary function
summary(mod)$summaries[[1]]$stats

## Ridge F-test's p-value
summary(mod)$summaries[[1]]$fpvalue

Variance-Covariance Matrix for Fitted Ridge Model

Description

The vcov function computes the variance-covariance matrix for the estimates of linear ridge regression model.

Usage

## S3 method for class 'lmridge'
vcov(object, ...)

Arguments

object

For the vcov method, an object of class "lmridge", i.e., a fitted model.

...

Not presently used in this implementation.

Details

The vcov function computes the variance-covariance matrix for scalar or vector value of the biasing parameter K provided as argument to the lmridge function.

Value

A list of matrices of estimated covariances in the linear ridge regression model for scalar or vector biasing parameter K is produced. Each list element has row and column names corresponding to the parameter names given by coef(mod). List elements are named according to the values of the biasing parameter K.

Note

The covariance matrix excludes the intercept term, as the intercept is not penalized in ridge regression.

Author(s)

Muhammad Imdad Ullah, Muhammad Aslam

References

Brown, G.W. and Beattie, B.R. (1975). Improving Estimates of Economic Parameters by use of Ridge Regression with Production Function Applications. American Journal of Agricultural Economics, 57(1), 21-32. doi:10.2307/1238836.

See Also

The ridge model fitting lmridge, ridge VIF values vif

Examples

data(Hald)
mod<- lmridge(y~., data=as.data.frame(Hald), scaling="sc", K=seq(0,1,.2) )

vcov.lmridge(mod)
vcov(mod)

Variance Inflation Factor for Linear Ridge Regression

Description

Computes VIF values for each scalar or vector value of the biasing parameter K (Marquardt, 1970).

Usage

vif(x, ...)
## S3 method for class 'lmridge'
vif(x, ...)

Arguments

x

For the vif method, an object of class "lmridge", i.e., a fitted model.

...

Not presently used in this implementation.

Details

The vif.lmridge function computes the VIF for each regressor in the data set after addition of the biasing parameter K provided as argument to the lmridge function. The VIFs are the diagonal elements of (X'X+KI)^{-1}X'X(X'X+KI)^{-1}, as given by Marquardt (1970).
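Marquardt's sandwich formula above can be evaluated by hand for a 2x2 correlation matrix X'X with off-diagonal element r. A plain-Python sketch (illustration only; the lmridge vif method handles the general p-regressor case):

```python
# Plain-Python sketch of Marquardt's ridge VIF formula for a 2x2
# correlation matrix X'X = [[1, r], [r, 1]] (illustration only; the
# lmridge vif method handles the general case).
def ridge_vif_2x2(r, k):
    """Diagonal of (X'X + kI)^{-1} X'X (X'X + kI)^{-1}."""
    a, b = 1.0 + k, r                      # X'X + kI = [[a, b], [b, a]]
    det = a * a - b * b                    # determinant of the 2x2 matrix
    inv = [[a / det, -b / det], [-b / det, a / det]]
    xtx = [[1.0, r], [r, 1.0]]

    def matmul(m1, m2):
        return [[sum(m1[i][t] * m2[t][j] for t in range(2)) for j in range(2)]
                for i in range(2)]

    v = matmul(matmul(inv, xtx), inv)      # the sandwich product
    return [v[0][0], v[1][1]]

vifs = ridge_vif_2x2(r=0.9, k=0.1)         # VIFs after ridging
```

At k = 0 this reduces to the usual OLS VIF, 1/(1-r^2); increasing k reduces the VIFs, which is exactly what the vif trace plot in plot.lmridge visualizes.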

Value

The vif function returns a matrix of VIF values for each regressor after adding the scalar or vector biasing parameter K to the X'X matrix. The columns of the returned matrix correspond to the regressor names and the rows to the values of the biasing parameter K provided as argument to the lmridge function.

Author(s)

Muhammad Imdad Ullah, Muhammad Aslam

References

Fox, J. and Monette, G. (1992). Generalized Collinearity Diagnostics. JASA, 87, 178–183.

Imdad, M. U. Addressing Linear Regression Models with Correlated Regressors: Some Package Development in R (Doctoral Thesis, Department of Statistics, Bahauddin Zakariya University, Multan, Pakistan), 2017.

Marquardt, D. (1970). Generalized Inverses, Ridge Regression, Biased Linear Estimation, and Nonlinear Estimation. Technometrics, 12(3), 591–612.

See Also

The ridge model fitting lmridge, ridge Var-Cov matrix vcov

Examples

data(Hald)
mod <- lmridge(y~., data = as.data.frame(Hald), scaling = "sc", K = seq(0,1,.2) )
vif(mod)