Please use this identifier to cite or link to this item:
https://doi.org/10.1145/2911451.2911489
DC Field | Value
---|---
dc.title | Fast Matrix Factorization for Online Recommendation with Implicit Feedback
dc.contributor.author | He, Xiangnan
dc.contributor.author | Zhang, Hanwang
dc.contributor.author | Kan, Min-Yen
dc.contributor.author | Chua, Tat-Seng
dc.date.accessioned | 2022-07-29T02:41:35Z
dc.date.available | 2022-07-29T02:41:35Z
dc.date.issued | 2016-01-01
dc.identifier.citation | He, Xiangnan; Zhang, Hanwang; Kan, Min-Yen; Chua, Tat-Seng (2016). Fast Matrix Factorization for Online Recommendation with Implicit Feedback. In Proceedings of the 39th International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR '16): 549-558. ScholarBank@NUS Repository. https://doi.org/10.1145/2911451.2911489
dc.identifier.isbn | 9781450342902
dc.identifier.uri | https://scholarbank.nus.edu.sg/handle/10635/229395
dc.description.abstract | This paper contributes improvements to both the effectiveness and efficiency of Matrix Factorization (MF) methods for implicit feedback. We highlight two critical issues in existing work. First, due to the large space of unobserved feedback, most existing works resort to assigning a uniform weight to the missing data to reduce computational complexity. However, such a uniform assumption is invalid in real-world settings. Second, most methods are designed for an offline setting and fail to keep up with the dynamic nature of online data. We address both issues in learning MF models from implicit feedback. We first propose to weight the missing data based on item popularity, which is more effective and flexible than the uniform-weight assumption. However, such non-uniform weighting poses an efficiency challenge in learning the model. To address this, we design a new learning algorithm based on the element-wise Alternating Least Squares (eALS) technique for efficiently optimizing an MF model with variably weighted missing data. We then exploit this efficiency to devise an incremental update strategy that instantly refreshes an MF model given new feedback. Through comprehensive experiments on two public datasets under both offline and online protocols, we show that our eALS method consistently outperforms state-of-the-art implicit MF methods. Our implementation is available at https://github.com/hexiangnan/sigir16-eals. (A minimal code sketch of the popularity-weighting idea follows the record table below.)
dc.publisher | Association for Computing Machinery (ACM)
dc.source | Elements
dc.subject | Science & Technology
dc.subject | Technology
dc.subject | Computer Science, Information Systems
dc.subject | Computer Science
dc.subject | Matrix Factorization
dc.subject | Implicit Feedback
dc.subject | Item Recommendation
dc.subject | Online Learning
dc.subject | ALS
dc.subject | Coordinate Descent
dc.type | Conference Paper
dc.date.updated | 2022-07-19T07:55:50Z
dc.contributor.department | Department of Computer Science
dc.description.doi | 10.1145/2911451.2911489
dc.description.sourcetitle | 39th International ACM SIGIR Conference on Research and Development in Information Retrieval
dc.description.volume | abs/1708.05024
dc.description.page | 549-558
dc.published.state | Published
Appears in Collections: | Staff Publications; Elements
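To make the abstract's key idea concrete, the following is a minimal, illustrative Python/NumPy sketch of matrix factorization for implicit feedback in which missing entries are down-weighted by item popularity and the factors are learned by element-wise coordinate descent. This is not the authors' eALS implementation (their Java code is at https://github.com/hexiangnan/sigir16-eals): it deliberately materializes the full weight and residual matrices, and it omits both the caching scheme and the incremental online updates that give eALS its efficiency. All names and hyper-parameter values here (`naive_weighted_mf`, `popularity_weights`, `c0`, `alpha`, `w_obs`, ...) are placeholders chosen for this sketch, not values from the paper.

```python
# Illustrative sketch only: popularity-weighted whole-data MF for implicit
# feedback, optimized by element-wise coordinate descent. Not the authors'
# eALS code; hyper-parameter values below are placeholders.
import numpy as np


def popularity_weights(R, c0=64.0, alpha=0.4):
    """Per-item weight c_i for *missing* entries, proportional to popularity^alpha.

    f_i is the number of users who interacted with item i; c0 scales the
    overall strength of the missing-data term (both are placeholder values).
    """
    f = (R > 0).sum(axis=0).astype(float)
    f_alpha = f ** alpha
    return c0 * f_alpha / f_alpha.sum()


def naive_weighted_mf(R, n_factors=8, n_iters=30, w_obs=1.0, reg=0.01,
                      c0=64.0, alpha=0.4, seed=0):
    """Element-wise coordinate descent on a dense user-item matrix R
    (nonzero = observed interaction). Observed cells get target 1 and weight
    w_obs; missing cells get target 0 and popularity-based weight c_i.
    This naive version pays the full O(M*N) cost per sweep.
    """
    rng = np.random.default_rng(seed)
    M, N = R.shape
    P = rng.normal(scale=0.01, size=(M, n_factors))   # user factors
    Q = rng.normal(scale=0.01, size=(N, n_factors))   # item factors

    T = (R > 0).astype(float)                         # implicit-feedback targets
    c = popularity_weights(R, c0, alpha)
    W = np.where(R > 0, w_obs, c[None, :])            # full weight matrix

    E = T - P @ Q.T                                   # residuals (targets - predictions)
    for _ in range(n_iters):
        for f in range(n_factors):
            # Residual with factor f's contribution removed.
            Ef = E + np.outer(P[:, f], Q[:, f])
            # Closed-form update of every user's f-th coordinate:
            # p_uf = sum_i w_ui * e^f_ui * q_if / (sum_i w_ui * q_if^2 + reg)
            P[:, f] = ((W * Ef) @ Q[:, f]) / (W @ (Q[:, f] ** 2) + reg)
            # Symmetric update for items, using the refreshed user coordinates.
            Q[:, f] = ((W * Ef).T @ P[:, f]) / (W.T @ (P[:, f] ** 2) + reg)
            # Put factor f's (updated) contribution back into the residual.
            E = Ef - np.outer(P[:, f], Q[:, f])
    return P, Q


if __name__ == "__main__":
    # Tiny synthetic interaction matrix: 5 users x 4 items, 1 = observed.
    R = np.array([[1, 1, 0, 0],
                  [1, 0, 1, 0],
                  [0, 1, 1, 0],
                  [1, 1, 1, 0],
                  [0, 0, 1, 1]], dtype=float)
    # Keep the missing-data weight small for this tiny example (c0=1.0).
    P, Q = naive_weighted_mf(R, n_factors=2, n_iters=100, c0=1.0)
    print(np.round(P @ Q.T, 2))                       # predicted preference scores
```

The sketch forms the dense M x N weight and residual matrices only for clarity. The paper's contribution is to optimize this same kind of variably weighted objective without ever doing that: by caching item-side (and user-side) statistics over the popularity coefficients, each eALS sweep costs time proportional to the number of observed interactions plus a term quadratic in the factor dimension, which is what makes the instant online refresh described in the abstract practical.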
Files in This Item:
File | Description | Size | Format | Access Settings | Version
---|---|---|---|---|---
1708.05024v1.pdf | | 1.49 MB | Adobe PDF | OPEN | Published
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.