Please use this identifier to cite or link to this item: https://doi.org/10.1145/2911451.2911489
DC Field: Value
dc.title: Fast Matrix Factorization for Online Recommendation with Implicit Feedback
dc.contributor.author: He, Xiangnan
dc.contributor.author: Zhang, Hanwang
dc.contributor.author: Kan, Min-Yen
dc.contributor.author: Chua, Tat-Seng
dc.date.accessioned: 2022-07-29T02:41:35Z
dc.date.available: 2022-07-29T02:41:35Z
dc.date.issued: 2016-01-01
dc.identifier.citation: He, Xiangnan, Zhang, Hanwang, Kan, Min-Yen, Chua, Tat-Seng (2016-01-01). Fast Matrix Factorization for Online Recommendation with Implicit Feedback. 39th International ACM SIGIR conference on Research and Development in Information Retrieval abs/1708.05024: 549-558. ScholarBank@NUS Repository. https://doi.org/10.1145/2911451.2911489
dc.identifier.isbn: 9781450342902
dc.identifier.uri: https://scholarbank.nus.edu.sg/handle/10635/229395
dc.description.abstract: This paper contributes improvements to both the effectiveness and efficiency of Matrix Factorization (MF) methods for implicit feedback. We highlight two critical issues in existing works. First, due to the large space of unobserved feedback, most existing works resort to assigning a uniform weight to the missing data to reduce computational complexity. However, such a uniform assumption is invalid in real-world settings. Second, most methods are designed for an offline setting and fail to keep up with the dynamic nature of online data. We address these two issues in learning MF models from implicit feedback. We first propose to weight the missing data based on item popularity, which is more effective and flexible than the uniform-weight assumption. However, such non-uniform weighting poses an efficiency challenge in learning the model. To address this, we design a new learning algorithm based on the element-wise Alternating Least Squares (eALS) technique for efficiently optimizing an MF model with variably weighted missing data. We then exploit this efficiency to seamlessly devise an incremental update strategy that instantly refreshes an MF model given new feedback. Through comprehensive experiments on two public datasets in both offline and online protocols, we show that our eALS method consistently outperforms state-of-the-art implicit MF methods. Our implementation is available at https://github.com/hexiangnan/sigir16-eals.
dc.publisher: ASSOC COMPUTING MACHINERY
dc.source: Elements
dc.subject: Science & Technology
dc.subject: Technology
dc.subject: Computer Science, Information Systems
dc.subject: Computer Science
dc.subject: Matrix Factorization
dc.subject: Implicit Feedback
dc.subject: Item Recommendation
dc.subject: Online Learning
dc.subject: ALS
dc.subject: Coordinate Descent
dc.type: Conference Paper
dc.date.updated: 2022-07-19T07:55:50Z
dc.contributor.department: DEPARTMENT OF COMPUTER SCIENCE
dc.description.doi: 10.1145/2911451.2911489
dc.description.sourcetitle: 39th International ACM SIGIR conference on Research and Development in Information Retrieval
dc.description.volume: abs/1708.05024
dc.description.page: 549-558
dc.published.state: Published
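The abstract above describes two algorithmic ideas: weighting missing (unobserved) entries by item popularity rather than uniformly, and optimizing the MF model one coordinate at a time with element-wise ALS. The following is a minimal illustrative sketch of those two ideas, not the paper's actual implementation: the toy data, the values of `alpha` and `w0`, and the dense per-cell weight matrix are all assumptions made for clarity (the paper's algorithm avoids materializing the missing-data term by caching factor statistics).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy implicit-feedback matrix (hypothetical data, not from the paper):
# R[u, i] = 1 if user u interacted with item i, else 0 (missing).
R = (rng.random((8, 6)) < 0.3).astype(float)
R[:, 0] = 1.0  # make item 0 popular so the item weights actually differ

# Popularity-aware weights for missing data: item i's missing entries get
# confidence c_i proportional to its interaction frequency to the power
# alpha, normalized to sum to w0 (alpha and w0 are illustrative values).
freq = R.sum(axis=0)
alpha, w0 = 0.75, 0.5
c = w0 * freq**alpha / (freq**alpha).sum()

# Per-cell weight matrix: observed entries weight 1, missing entries c_i.
W = np.where(R > 0, 1.0, c)

k = 4                                    # number of latent factors
P = 0.1 * rng.standard_normal((8, k))    # user factors
Q = 0.1 * rng.standard_normal((6, k))    # item factors

def weighted_loss(R, W, P, Q, lam=0.01):
    """Weighted squared reconstruction error plus L2 regularization."""
    E = R - P @ Q.T
    return (W * E**2).sum() + lam * ((P**2).sum() + (Q**2).sum())

def eals_sweep(R, W, X, Y, lam=0.01):
    """One element-wise ALS pass over X: set each coordinate X[u, f] to
    its closed-form 1-D weighted least-squares optimum, all else fixed."""
    for u in range(X.shape[0]):
        for f in range(X.shape[1]):
            # Residual for row u with factor f's contribution removed.
            err = R[u] - X[u] @ Y.T + X[u, f] * Y[:, f]
            num = (W[u] * err * Y[:, f]).sum()
            den = (W[u] * Y[:, f] ** 2).sum() + lam
            X[u, f] = num / den

before = weighted_loss(R, W, P, Q)
for _ in range(5):
    eals_sweep(R, W, P, Q)        # update user factors
    eals_sweep(R.T, W.T, Q, P)    # update item factors symmetrically
after = weighted_loss(R, W, P, Q)
```

Because each coordinate update solves its one-dimensional subproblem exactly, the weighted loss is non-increasing across sweeps; the dense O(MN) weight matrix here is only for illustration, and the released code at the GitHub link above is the authoritative, efficient version.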
Appears in Collections: Staff Publications, Elements

Files in This Item:
1708.05024v1.pdf (1.49 MB, Adobe PDF) | Access: OPEN | Version: Published

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.