Please use this identifier to cite or link to this item:
https://doi.org/10.1007/s11222-011-9279-3
Title: The predictive Lasso
Authors: Tran, M.-N.; Nott, D.J.; Leng, C.
Keywords: Generalized linear models; Kullback-Leibler divergence; Lasso; Optimal prediction; Variable selection
Issue Date: Sep-2012
Citation: Tran, M.-N., Nott, D.J., Leng, C. (2012-09). The predictive Lasso. Statistics and Computing 22 (5): 1069-1084. ScholarBank@NUS Repository. https://doi.org/10.1007/s11222-011-9279-3
Abstract: We propose a shrinkage procedure for simultaneous variable selection and estimation in generalized linear models (GLMs) with an explicit predictive motivation. The procedure estimates the coefficients by minimizing the Kullback-Leibler divergence of a set of predictive distributions to the corresponding predictive distributions for the full model, subject to an ℓ1 constraint on the coefficient vector. This results in the selection of a parsimonious model with predictive performance similar to that of the full model. Because it has a similar form to the original Lasso problem for GLMs, our procedure can benefit from available ℓ1-regularization path algorithms. Simulation studies and real data examples confirm the efficiency of our method in terms of predictive performance on future observations. © 2011 Springer Science+Business Media, LLC.
Source Title: Statistics and Computing
URI: http://scholarbank.nus.edu.sg/handle/10635/105428
ISSN: 0960-3174
DOI: 10.1007/s11222-011-9279-3
Appears in Collections: Staff Publications
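The abstract describes choosing coefficients that make a sub-model's predictive distributions close, in Kullback-Leibler divergence, to those of the full model under an ℓ1 constraint. As a rough illustration only (not the authors' GLM algorithm), in the Gaussian linear model with a common error variance the KL divergence between two normal predictive distributions is proportional to the squared difference of their means, so the criterion reduces to an ordinary Lasso fit with the full-model fitted values used as the response. A minimal sketch under that assumption, with simulated data and illustrative names (X, y, alpha):

# Illustrative sketch of the predictive-Lasso idea, Gaussian linear model case only.
# Assumption: with a common known error variance, KL divergence between the full-model
# predictive N(x'beta_full, sigma^2) and a sub-model predictive N(x'beta, sigma^2) is
# proportional to (x'beta_full - x'beta)^2, so KL matching under an l1 constraint is an
# ordinary Lasso with the full-model fitted values as the response. Data are simulated;
# the penalty level alpha=0.1 is arbitrary.
import numpy as np
from sklearn.linear_model import LinearRegression, Lasso

rng = np.random.default_rng(0)
n, p = 100, 20
X = rng.normal(size=(n, p))
beta_true = np.r_[np.array([2.0, -1.5, 0.0, 0.0, 1.0]), np.zeros(p - 5)]
y = X @ beta_true + rng.normal(scale=1.0, size=n)

# Step 1: fit the full model by least squares and record its fitted values,
# i.e. the full-model predictive means at the design points.
full_fit = LinearRegression().fit(X, y)
y_hat_full = full_fit.predict(X)

# Step 2: Lasso with the full-model fitted values as the response; this minimizes
# sum_i (x_i'beta_full_hat - x_i'beta)^2 plus an l1 penalty, the Gaussian special
# case of the KL criterion in the abstract.
pred_lasso = Lasso(alpha=0.1).fit(X, y_hat_full)

print("selected coefficients:", np.flatnonzero(pred_lasso.coef_ != 0))

In practice the penalty level would be tuned for predictive performance (for example by cross-validation), and the general GLM case in the paper replaces the squared-error criterion above with the KL divergence between the corresponding predictive distributions.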
Files in This Item:
There are no files associated with this item.
SCOPUS™ Citations: 9 (checked on Mar 22, 2023)
Web of Science™ Citations: 6 (checked on Mar 22, 2023)
Page view(s): 177 (checked on Mar 16, 2023)
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.