|Title:||Least squares approximation with a diverging number of parameters|
|Authors:||Leng, C.; Li, B.|
|Source:||Leng, C., Li, B. (2010). Least squares approximation with a diverging number of parameters. Statistics and Probability Letters 80 (3-4) : 254-261. ScholarBank@NUS Repository. https://doi.org/10.1016/j.spl.2009.10.015|
|Abstract:||Regularized regression with the ℓ1 penalty is a popular approach for variable selection and coefficient estimation. For a unified treatment of the ℓ1-constrained model selection, Wang and Leng (2007) proposed the least squares approximation method (LSA) for a fixed dimension. LSA makes use of a quadratic expansion of the loss function and takes full advantage of the fast Lasso algorithm in Efron et al. (2004). In this paper, we extend the fixed-dimension LSA to the situation with a diverging number of parameters. We show that LSA possesses the oracle properties under appropriate conditions when the number of variables grows with the sample size. We propose a new tuning parameter selection method which achieves the oracle properties. Extensive simulation studies confirm the theoretical results. © 2009 Elsevier B.V. All rights reserved.|
|Source Title:||Statistics and Probability Letters|
|Appears in Collections:||Staff Publications|
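The abstract describes the core of LSA: replace the original loss by its quadratic expansion around an unpenalized estimate, then solve the resulting ℓ1-penalized quadratic program with a standard Lasso solver. A minimal sketch of that idea for the least squares case follows; the simulated data, the fixed penalty level `alpha`, and the use of scikit-learn's `Lasso` are illustrative assumptions, not the paper's actual implementation or its tuning-parameter selection method.

```python
import numpy as np
from sklearn.linear_model import Lasso

# Simulated sparse regression problem (illustrative, not from the paper).
rng = np.random.default_rng(0)
n, p = 200, 8
X = rng.normal(size=(n, p))
beta_true = np.array([3.0, 1.5, 0.0, 0.0, 2.0, 0.0, 0.0, 0.0])
y = X @ beta_true + rng.normal(size=n)

# Unpenalized (OLS) estimate and the Hessian of the scaled quadratic loss.
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
Sigma = X.T @ X / n

# LSA objective: (b - beta_hat)' Sigma (b - beta_hat) + penalty * ||b||_1.
# With the Cholesky factorization Sigma = L L', this is an ordinary Lasso
# problem with design L' and response L' beta_hat, so a fast Lasso solver
# (here scikit-learn's coordinate descent) applies directly.
L = np.linalg.cholesky(Sigma)
X_tilde = L.T
y_tilde = L.T @ beta_hat

lasso = Lasso(alpha=0.05, fit_intercept=False)  # alpha chosen arbitrarily
lasso.fit(X_tilde, y_tilde)
print(lasso.coef_)  # noise coefficients are shrunk to zero
```

For general likelihood-based losses, `Sigma` would instead be the estimated Hessian (or information matrix) at the unpenalized estimate; the transformation to a Lasso problem is the same.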
Files in This Item:
There are no files associated with this item.