Please use this identifier to cite or link to this item:
https://doi.org/10.1109/TNNLS.2018.2890117
Title: Fast Matrix Factorization With Nonuniform Weights on Missing Data
Authors: Xiangnan He; Jinhui Tang; Xiaoyu Du; Richang Hong; Tongwei Ren; Tat-Seng Chua
Keywords: Matrix Factorization; Missing Data; Element-wise Alternating Least Squares (eALS); Recommendation System
Issue Date: 23-Jan-2019
Publisher: Institute of Electrical and Electronics Engineers Inc.
Citation: Xiangnan He, Jinhui Tang, Xiaoyu Du, Richang Hong, Tongwei Ren, Tat-Seng Chua (2019-01-23). Fast Matrix Factorization With Nonuniform Weights on Missing Data. IEEE Transactions on Neural Networks and Learning Systems. ScholarBank@NUS Repository. https://doi.org/10.1109/TNNLS.2018.2890117
Abstract: Matrix factorization (MF) has been widely used to discover the low-rank structure of a data matrix and to predict its missing entries. In many real-world learning systems, the data matrix can be very high-dimensional but sparse. This poses an imbalanced learning problem, since the missing entries usually far outnumber the observed ones, yet they cannot be ignored because they carry a valuable negative signal. For efficiency, existing work typically applies a uniform weight to the missing entries to allow a fast learning algorithm. However, this simplification decreases modeling fidelity, resulting in suboptimal performance in downstream applications. In this paper, we weight the missing data nonuniformly and, more generally, allow any weighting strategy on the missing data. To address the efficiency challenge, we propose a fast learning method whose time complexity is determined by the number of observed entries in the data matrix rather than by the matrix size. The key idea is twofold: 1) we apply truncated singular value decomposition to the weight matrix to obtain a more compact representation of the weights, and 2) we learn the MF parameters with element-wise alternating least squares (eALS) and memoize the key intermediate variables to avoid unnecessary repeated computation. We conduct extensive experiments on two recommendation benchmarks, demonstrating the correctness, efficiency, and effectiveness of our fast eALS method. © IEEE
Source Title: IEEE Transactions on Neural Networks and Learning Systems
URI: https://scholarbank.nus.edu.sg/handle/10635/168411
ISSN: 2162-237X
DOI: 10.1109/TNNLS.2018.2890117
Appears in Collections: Staff Publications
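The following is a minimal NumPy sketch (not the authors' released code) of the two ideas the abstract names: compressing a nonuniform weight matrix with truncated SVD, and memoizing the resulting per-rank Gram matrices so an eALS-style update never has to iterate over all m·n missing entries. All variable names (W, P, A, B, S, ...) and the toy rank-1 weight matrix are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, k = 200, 300, 8        # users, items, latent dimension

# Toy nonuniform weight matrix over the missing entries (rank 1 here, so
# the rank-2 truncated SVD below reconstructs it essentially exactly).
W = np.outer(rng.random(m), rng.random(n))

# Idea 1: truncated SVD gives W ~= A @ B.T with small rank r, a compact
# representation of the m*n weights.
r = 2
U, s, Vt = np.linalg.svd(W, full_matrices=False)
A = U[:, :r] * s[:r]         # shape (m, r)
B = Vt[:r].T                 # shape (n, r)

# Idea 2: memoization. The per-item Gram matrix
#     M_i = sum_u W[u, i] * p_u p_u^T
# needed when solving for item i's factors decomposes into r cached
# k-by-k matrices S_l = P^T diag(A[:, l]) P, computed once per sweep.
P = rng.standard_normal((m, k))                   # user latent factors
S = [P.T @ (A[:, l, None] * P) for l in range(r)]

i = 7                                             # any item index
M_naive = P.T @ (W[:, i, None] * P)               # O(m k^2) per item
M_fast = sum(B[i, l] * S[l] for l in range(r))    # O(r k^2) per item
print("cache matches naive:", np.allclose(M_naive, M_fast))
```

With the cache in place, the per-sweep cost scales with the number of observed entries plus terms in (m + n), rather than with m·n, which is the efficiency property the abstract claims.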