Please use this identifier to cite or link to this item: https://doi.org/10.1016/j.ipm.2019.102076
Title: HoAFM: A High-order Attentive Factorization Machine for CTR Prediction
Authors: Zhulin Tao
Xiang Wang 
Xiangnan He 
Xianglin Huang
Tat-Seng Chua 
Keywords: Factorization machines
High-order feature interactions
Attention mechanism
Deep neural network
Issue Date: 2-Jul-2019
Publisher: Elsevier Ltd
Citation: Zhulin Tao, Xiang Wang, Xiangnan He, Xianglin Huang, Tat-Seng Chua (2019-07-02). HoAFM: A High-order Attentive Factorization Machine for CTR Prediction. Information Processing & Management 57 (6) : 102076. ScholarBank@NUS Repository. https://doi.org/10.1016/j.ipm.2019.102076
Abstract: Modeling feature interactions is of crucial importance for predicting the click-through rate (CTR) in industrial recommender systems. However, manually crafting cross features requires extensive domain knowledge and labor-intensive feature engineering. To alleviate this problem, the factorization machine (FM) was proposed to model feature interactions automatically from raw features. In particular, it embeds each feature in a vector representation and discovers second-order interactions as the product of two feature representations. To learn nonlinear and complex patterns, recent works such as NFM, PIN, and DeepFM exploit deep learning techniques to capture higher-order feature interactions. However, these approaches offer no guarantee of the effectiveness of high-order patterns, as they model feature interactions in a rather implicit way. To address this limitation, xDeepFM was recently proposed to generate high-order feature interactions in an explicit fashion by stacking multiple interaction networks. Nevertheless, xDeepFM suffers from rather high complexity, which easily leads to overfitting. In this paper, we develop a more expressive yet lightweight solution based on FM, named High-order Attentive Factorization Machine (HoAFM), which accounts for higher-order sparse feature interactions in an explicit manner. Beyond the linearity of FM, we devise a cross interaction layer, which updates a feature's representation by aggregating the representations of co-occurring features. In addition, we apply a bit-wise attention mechanism to determine the importance of co-occurring features at the granularity of individual dimensions. By stacking multiple cross interaction layers, we inject high-order feature interactions into feature representation learning, in order to establish expressive and informative cross features.
Extensive experiments are performed on two benchmark datasets, Criteo and Avazu, to demonstrate the rationality and effectiveness of HoAFM. Empirical results show that HoAFM achieves significant improvements over state-of-the-art methods such as NFM and xDeepFM. We will make the code public upon acceptance of this paper. © 2019 Elsevier Ltd
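To illustrate the idea described in the abstract, the following is a minimal sketch (not the authors' implementation) of one cross interaction layer: each active feature's embedding is updated by aggregating its element-wise products with the other active features, weighted by a per-dimension ("bit-wise") attention. The use of the products themselves as attention scores and the residual connection are simplifying assumptions for illustration; the paper's learned scoring function is not reproduced here.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def cross_interaction_layer(E):
    """One cross interaction layer (illustrative simplification).

    E: (m, d) array holding the embeddings of the m features active
       in one instance.
    Returns an (m, d) array of updated feature representations.
    """
    # pairwise element-wise products: P[i, j] = E[i] * E[j], shape (m, m, d)
    P = E[:, None, :] * E[None, :, :]
    # bit-wise attention: softmax over co-occurring features j, computed
    # separately for each embedding dimension; the scores here are the
    # products themselves, a placeholder for the learned scoring function
    A = softmax(P, axis=1)
    # aggregate attended interactions and keep the original signal
    return E + (A * P).sum(axis=1)

# Stacking layers injects higher-order interactions:
E = np.random.randn(4, 8)           # 4 active features, 8-dim embeddings
H1 = cross_interaction_layer(E)     # encodes second-order interactions
H2 = cross_interaction_layer(H1)    # raises the interaction order again
```

Stacking k layers is what lets the model compose interactions of order up to k+1 explicitly, without the exhaustive enumeration that makes xDeepFM's interaction networks costly.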
Source Title: Information Processing & Management
URI: https://scholarbank.nus.edu.sg/handle/10635/168417
ISSN: 0306-4573
DOI: 10.1016/j.ipm.2019.102076
Appears in Collections:Elements
Staff Publications

Files in This Item:
HoAFM A High-order Attentive Factorization Machine for CTR Prediction.pdf (1.61 MB, Adobe PDF, Open Access)

SCOPUS Citations: 6 (checked on Jun 7, 2021)
Page view(s): 101 (checked on Jun 11, 2021)
Download(s): 2 (checked on Jun 11, 2021)


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.