Please use this identifier to cite or link to this item: https://doi.org/10.1198/016214508000000418
Title: Sliced regression for dimension reduction
Authors: Wang, H.; Xia, Y.
Keywords: Cross-validation; Earnings forecast; Minimum average variance estimation; Sliced inverse regression; Sufficient dimension reduction
Issue Date: Jun-2008
Citation: Wang, H., & Xia, Y. (2008). Sliced regression for dimension reduction. Journal of the American Statistical Association, 103(482), 811-821. ScholarBank@NUS Repository. https://doi.org/10.1198/016214508000000418
Abstract: A new dimension-reduction method is proposed that slices the range of the response and applies local kernel regression within each slice. Compared with traditional inverse regression methods [e.g., sliced inverse regression (SIR)], the new method is free of the linearity condition and has much better estimation accuracy. Compared with direct estimation methods (e.g., MAVE), the new method is much more robust against extreme values and can capture the entire central subspace (CS) exhaustively. To determine the dimension of the CS, a consistent cross-validation criterion is developed. Extensive numerical studies, including a real example, confirm our theoretical findings. © 2008 American Statistical Association. (An illustrative sketch of the slicing-plus-local-kernel-regression idea appears after the record below.)
Source Title: Journal of the American Statistical Association
URI: http://scholarbank.nus.edu.sg/handle/10635/105372
ISSN: 01621459
DOI: 10.1198/016214508000000418
Appears in Collections: Staff Publications
Files in This Item: There are no files associated with this item.
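The abstract describes the method only at a high level. Below is a minimal, illustrative Python sketch of that general idea: slice the response at a few points, fit a kernel-weighted local linear regression of the slice indicator around each observation, pool the outer products of the estimated local gradients, and take the leading eigenvectors as a basis for the central subspace. This is an OPG-style approximation under assumed tuning choices (Gaussian kernel, rule-of-thumb bandwidth, quantile slice points, small ridge term, toy data); it is not the authors' estimator or code.

```python
# Illustrative sketch only (assumptions noted above), not the SR estimator of Wang & Xia (2008).
import numpy as np

def sliced_regression_sketch(X, y, d, n_slices=5):
    """Return a (p, d) basis estimate for the central subspace (illustration only)."""
    n, p = X.shape
    Z = (X - X.mean(0)) / X.std(0)                 # standardize predictors
    cuts = np.quantile(y, np.linspace(0.1, 0.9, n_slices))  # slice points (assumption)
    h = n ** (-1.0 / (p + 4))                      # rule-of-thumb bandwidth (assumption)
    M = np.zeros((p, p))                           # pooled outer products of gradients
    for c in cuts:
        t = (y <= c).astype(float)                 # indicator response for this slice
        for i in range(n):
            diff = Z - Z[i]                        # (n, p) deviations from the i-th point
            w = np.exp(-0.5 * (diff ** 2).sum(1) / h ** 2)   # Gaussian kernel weights
            # Local linear fit of the indicator around Z[i]: t_j ~ a + b'(Z_j - Z_i)
            D = np.hstack([np.ones((n, 1)), diff])
            WD = D * w[:, None]
            beta = np.linalg.lstsq(WD.T @ D + 1e-6 * np.eye(p + 1),
                                   WD.T @ t, rcond=None)[0]
            b = beta[1:]                           # local gradient estimate
            M += np.outer(b, b)
    # Leading eigenvectors of the pooled gradient matrix span the estimated subspace
    # (directions are in the standardized-predictor scale).
    vals, vecs = np.linalg.eigh(M)
    return vecs[:, np.argsort(vals)[::-1][:d]]

# Toy usage: y depends on X only through one linear combination.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = np.sin(X @ np.array([1.0, 1.0, 0.0, 0.0, 0.0])) + 0.1 * rng.normal(size=200)
B = sliced_regression_sketch(X, y, d=1)
print(B.round(2))
```

On this toy model the estimated direction should roughly align with (1, 1, 0, 0, 0) up to sign and scale; the number of slices, bandwidth, and ridge term are arbitrary illustration choices, not values recommended in the paper.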