Selection of Tuning Parameter and Comparison of Lasso and Adaptive Lasso on ZYZ Condition: A Monte Carlo Study
DOI: https://doi.org/10.62345/jads.2024.13.3.21

Keywords: Lasso Regression, Adaptive Lasso Regression, Ridge Regression, Cross Validation

Abstract
One of the fundamental objectives of statistics is to achieve accurate predictions. In high-dimensional settings (where the number of variables, p, exceeds the number of observations, n), the performance of ordinary least squares (OLS) is often suboptimal due to its high variance, which leads to lower prediction accuracy. Shrinking the coefficients is a promising approach, and methods such as ridge regression, elastic net, lasso, and adaptive lasso are well-known techniques for this purpose. While shrinkage introduces a small bias, it substantially reduces the variance compared to OLS. The effectiveness of shrinkage methods largely depends on the selection of the tuning parameter; cross-validation and the Bayesian Information Criterion (BIC) are commonly used for this purpose, and an improved version of BIC has shown impressive results. The multiple regression models compared are defined by the coefficient vectors β1 = (5.6, 5.6, 5.6, 0), β2 = (3, 1.5, 0, 0, 2, 0, 0, 0), and β3 = (0.85, 0.85, 0.85, 0).
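The comparison described in the abstract can be sketched in a few lines. The snippet below is a minimal illustration, not the paper's actual simulation code: it generates data from the second coefficient vector, β2 = (3, 1.5, 0, 0, 2, 0, 0, 0), selects the lasso tuning parameter by cross-validation, and fits an adaptive lasso by rescaling the design with weights from an initial ridge fit. The sample size, noise level, ridge penalty, and weight exponent γ are all illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import LassoCV, Ridge

rng = np.random.default_rng(0)

# One of the coefficient vectors compared in the study.
beta = np.array([3.0, 1.5, 0.0, 0.0, 2.0, 0.0, 0.0, 0.0])
n, p = 100, beta.size  # illustrative sample size, not from the paper

# Simulate a standard Gaussian design with unit-variance noise.
X = rng.standard_normal((n, p))
y = X @ beta + rng.standard_normal(n)

# Lasso: tuning parameter lambda chosen by 5-fold cross-validation.
lasso = LassoCV(cv=5, random_state=0).fit(X, y)

# Adaptive lasso: weight each column by an initial ridge estimate,
# run an ordinary cross-validated lasso on the rescaled design,
# then map the coefficients back to the original scale.
gamma = 1.0                                  # assumed weight exponent
init = Ridge(alpha=1.0).fit(X, y).coef_      # initial estimator
w = 1.0 / (np.abs(init) ** gamma + 1e-8)     # adaptive weights
ada = LassoCV(cv=5, random_state=0).fit(X / w, y)
ada_coef = ada.coef_ / w

print("lasso coefficients:   ", np.round(lasso.coef_, 2))
print("adaptive lasso coefs: ", np.round(ada_coef, 2))
```

Because the adaptive weights penalize small initial estimates more heavily, the adaptive lasso tends to zero out the truly inactive coefficients more aggressively than the plain lasso, which is the behavior the Monte Carlo study evaluates.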
License
This work is licensed under a Creative Commons Attribution 4.0 International License.
License Terms
All articles published by Centre for Research on Poverty and Attitude are made immediately available worldwide under an open access license. This means:
- everyone has free and unlimited access to the full-text of all articles published in Centre for Research on Poverty and Attitude's journals;
- everyone is free to re-use the published material if proper accreditation/citation of the original publication is given.