Title: GraphTCN: Spatio-temporal interaction modeling for human trajectory prediction
Authors: Wang, C 
Cai, S
Tan, G 
Keywords: cs.CV
Issue Date: 1-Jan-2021
Publisher: IEEE
Citation: Wang, C, Cai, S, Tan, G (2021-01-01). GraphTCN: Spatio-temporal interaction modeling for human trajectory prediction. Proceedings - 2021 IEEE Winter Conference on Applications of Computer Vision, WACV 2021 abs/2003.07167 : 3449-3458. ScholarBank@NUS Repository.
Abstract: Predicting the future paths of an agent's neighbors accurately and in a timely manner is central to autonomous applications for collision avoidance. Conventional approaches, e.g., LSTM-based models, incur considerable computational cost in prediction, especially for long-sequence prediction. To support more efficient and accurate trajectory prediction, we propose GraphTCN, a novel CNN-based spatial-temporal graph framework that models spatial interactions as social graphs and captures spatio-temporal interactions with a modified temporal convolutional network. In contrast to conventional models, both the spatial and temporal modeling in our framework are computed within each local time window. It can therefore be executed in parallel for much higher efficiency, while achieving accuracy comparable to the best-performing approaches. Experimental results confirm that our model outperforms state-of-the-art models in both efficiency and accuracy on various trajectory prediction benchmark datasets.
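The efficiency claim in the abstract rests on the causal convolution at the heart of a temporal convolutional network: unlike an RNN's sequential recurrence, every time step of the output can be computed in parallel. The sketch below is a minimal NumPy illustration of a single dilated causal 1D convolution, not the authors' GraphTCN implementation; the kernel values, zero-padding scheme, and function name are assumptions for illustration.

```python
import numpy as np

def causal_conv1d(x, kernel, dilation=1):
    """Dilated causal 1D convolution: the output at time t depends only
    on inputs at times <= t. Because there is no recurrence, all time
    steps are independent and could be computed in parallel, which is
    the source of a TCN's speed advantage over LSTM-style models."""
    k = len(kernel)
    pad = (k - 1) * dilation          # left-pad so output keeps length
    x_padded = np.concatenate([np.zeros(pad), x])
    out = np.empty(len(x))
    for t in range(len(x)):
        # taps at times t, t - dilation, ..., t - (k-1)*dilation
        taps = x_padded[t + pad - np.arange(k) * dilation]
        out[t] = taps @ kernel
    return out

x = np.array([1.0, 2.0, 3.0, 4.0])
k = np.array([0.5, 0.5])              # averages current and previous step
print(causal_conv1d(x, k))            # → [0.5 1.5 2.5 3.5]
```

Stacking such layers with increasing dilation grows the receptive field exponentially, which is how a TCN covers a local time window without sequential computation.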
Source Title: Proceedings - 2021 IEEE Winter Conference on Applications of Computer Vision, WACV 2021
DOI: 10.1109/WACV48630.2021.00349
Appears in Collections: Staff Publications

Files in This Item:
File: 2003.07167v6.pdf (7.52 MB, Adobe PDF)
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.