Title: KGAT: Knowledge Graph Attention Network for Recommendation
Authors: Xiang Wang 
Xiangnan He 
Yixin Cao 
Meng Liu
Tat-Seng Chua 
Keywords: Collaborative Filtering
Embedding Propagation
Graph Neural Network
Higher-order Connectivity
Knowledge Graph
Issue Date: 4-Aug-2019
Publisher: Association for Computing Machinery
Citation: Xiang Wang, Xiangnan He, Yixin Cao, Meng Liu, Tat-Seng Chua (2019-08-04). KGAT: Knowledge Graph Attention Network for Recommendation. KDD 2019 : 950-958. ScholarBank@NUS Repository.
Abstract: To provide more accurate, diverse, and explainable recommendations, it is crucial to go beyond modeling user-item interactions and take side information into account. Traditional methods like factorization machines (FM) cast the task as a supervised learning problem, treating each interaction as an independent instance with side information encoded. Because they overlook the relations among instances or items (e.g., the director of one movie is also an actor in another), these methods are insufficient to distill the collaborative signal from the collective behaviors of users. In this work, we investigate the utility of the knowledge graph (KG), which breaks the independent-interaction assumption by linking items with their attributes. We argue that in such a hybrid structure of KG and user-item graph, high-order relations - which connect two items through one or multiple linked attributes - are an essential factor for successful recommendation. We propose a new method named Knowledge Graph Attention Network (KGAT), which explicitly models the high-order connectivities in the KG in an end-to-end fashion. It recursively propagates the embeddings from a node's neighbors (which can be users, items, or attributes) to refine the node's embedding, and employs an attention mechanism to discriminate the importance of the neighbors. KGAT is conceptually advantageous over existing KG-based recommendation methods, which exploit high-order relations either by extracting paths or by modeling them implicitly with regularization. Empirical results on three public benchmarks show that KGAT significantly outperforms state-of-the-art methods such as Neural FM [11] and RippleNet [29]. Further studies verify the efficacy of embedding propagation for high-order relation modeling and the interpretability benefits brought by the attention mechanism. We release the codes and datasets. © 2019 Association for Computing Machinery.
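The core idea described in the abstract - recursively refining a node's embedding from its neighbors, with attention weights discriminating their importance - can be illustrated with a minimal sketch. This is not the authors' implementation: it omits the relation-specific projection matrices and trainable parameters of the actual KGAT model, and the scoring function and `propagate` helper shown here are simplified illustrations.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    z = np.exp(x - x.max())
    return z / z.sum()

def propagate(ego, neighbor_embs, relation_embs):
    """One simplified attentive propagation step (illustrative only).

    ego:           (d,)   embedding of the node being refined
    neighbor_embs: (n, d) embeddings of its n neighbors (users/items/attributes)
    relation_embs: (n, d) embeddings of the relation linking ego to each neighbor
    """
    # Attention score for each (relation, neighbor) pair; a simplified stand-in
    # for the paper's relation-aware scoring with projection matrices W_r.
    scores = np.array(
        [t @ np.tanh(ego + r) for t, r in zip(neighbor_embs, relation_embs)]
    )
    weights = softmax(scores)                       # normalize over neighbors
    agg = (weights[:, None] * neighbor_embs).sum(axis=0)  # weighted aggregation
    combined = ego + agg                            # "sum" aggregator variant
    return np.maximum(combined, 0.2 * combined)     # LeakyReLU nonlinearity
```

Stacking several such steps lets information flow along multi-hop paths, which is how the model captures the high-order connectivities the abstract refers to.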
Source Title: KDD 2019
ISBN: 9781450362016
DOI: 10.1145/3292500.3330989
Appears in Collections:Elements
Staff Publications

Files in This Item:
File: KGAT.pdf (1.33 MB, Adobe PDF)
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.