|Title:||Shrinkage tuning parameter selection with a diverging number of parameters||Authors:||Wang, H.
Li, B.
Leng, C.
|Keywords:||Bayesian information criterion
Diverging number of parameters
Smoothly clipped absolute deviation
|Issue Date:||Jun-2009||Citation:||Wang, H., Li, B., Leng, C. (2009-06). Shrinkage tuning parameter selection with a diverging number of parameters. Journal of the Royal Statistical Society. Series B: Statistical Methodology 71 (3) : 671-683. ScholarBank@NUS Repository. https://doi.org/10.1111/j.1467-9868.2008.00693.x||Abstract:||Contemporary statistical research frequently deals with problems involving a diverging number of parameters. For those problems, various shrinkage methods (e.g. the lasso and smoothly clipped absolute deviation) are found to be particularly useful for variable selection. Nevertheless, the desirable performances of those shrinkage methods heavily hinge on an appropriate selection of the tuning parameters. With a fixed predictor dimension, Wang and co-workers have demonstrated that the tuning parameters selected by a Bayesian information criterion type criterion can identify the true model consistently. In this work, similar results are further extended to the situation with a diverging number of parameters for both unpenalized and penalized estimators. Consequently, our theoretical results further enlarge not only the scope of applicability of Bayesian information criterion type criteria but also that of those shrinkage estimation methods. © 2008 Royal Statistical Society.||Source Title:||Journal of the Royal Statistical Society. Series B: Statistical Methodology||URI:||http://scholarbank.nus.edu.sg/handle/10635/105363||ISSN:||13697412||DOI:||10.1111/j.1467-9868.2008.00693.x|
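The abstract describes selecting a shrinkage tuning parameter by minimizing a Bayesian information criterion over a grid of candidate values. The following is a minimal illustrative sketch of that general idea, not the authors' procedure: it assumes a standard lasso fitted by coordinate descent and the common BIC form n·log(RSS/n) + df·log(n), with df taken as the number of non-zero coefficients. The synthetic data, grid, and all variable names are hypothetical.

```python
import math
import random

random.seed(0)
n, p = 200, 8
beta_true = [3.0, 1.5, 0.0, 0.0, 2.0, 0.0, 0.0, 0.0]  # sparse truth
X = [[random.gauss(0, 1) for _ in range(p)] for _ in range(n)]
y = [sum(X[i][j] * beta_true[j] for j in range(p)) + random.gauss(0, 1)
     for i in range(n)]

def soft(a, t):
    # soft-thresholding operator: sign(a) * max(|a| - t, 0)
    return math.copysign(max(abs(a) - t, 0.0), a)

def lasso_cd(X, y, lam, max_iter=500, tol=1e-7):
    # coordinate descent for (1/2n)||y - Xb||^2 + lam * ||b||_1
    n, p = len(X), len(X[0])
    b = [0.0] * p
    r = y[:]                                   # residual y - X b
    col_ss = [sum(X[i][j] ** 2 for i in range(n)) / n for j in range(p)]
    for _ in range(max_iter):
        max_change = 0.0
        for j in range(p):
            # correlation of X_j with the partial residual r + X_j * b_j
            rho = sum(X[i][j] * (r[i] + X[i][j] * b[j]) for i in range(n)) / n
            new_bj = soft(rho, lam) / col_ss[j]
            delta = new_bj - b[j]
            if delta != 0.0:
                for i in range(n):
                    r[i] -= X[i][j] * delta
                b[j] = new_bj
                max_change = max(max_change, abs(delta))
        if max_change < tol:                   # converged
            break
    return b

def bic(X, y, b):
    # BIC-type criterion: n * log(RSS/n) + df * log(n)
    n = len(y)
    rss = sum((y[i] - sum(X[i][j] * b[j] for j in range(len(b)))) ** 2
              for i in range(n))
    df = sum(1 for bj in b if abs(bj) > 1e-8)  # size of the selected model
    return n * math.log(rss / n) + df * math.log(n)

# pick the tuning parameter minimizing BIC over a geometric grid
lams = [0.01 * (1.5 ** k) for k in range(15)]
best = min(lams, key=lambda lam: bic(X, y, lasso_cd(X, y, lam)))
b_hat = lasso_cd(X, y, best)
support = [j for j, bj in enumerate(b_hat) if abs(bj) > 1e-8]
print("selected lambda:", best, "selected support:", support)
```

In this toy setting the BIC-minimizing lambda tends to recover the three truly non-zero coefficients (indices 0, 1, 4) while zeroing out the noise coordinates, which is the model-selection-consistency behavior the paper establishes theoretically, there for a diverging number of parameters.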
|Appears in Collections:||Staff Publications|
Files in This Item:
There are no files associated with this item.
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.