Please use this identifier to cite or link to this item:
Title: Temporal Spiking Recurrent Neural Network for Action Recognition
Authors: Wang, W.
Hao, S.
Wei, Y.
Xiao, S.
Feng, J. 
Sebe, N.
Keywords: Action recognition
recurrent neural network
temporal spiking
Issue Date: 2019
Publisher: Institute of Electrical and Electronics Engineers Inc.
Citation: Wang, W., Hao, S., Wei, Y., Xiao, S., Feng, J., Sebe, N. (2019). Temporal Spiking Recurrent Neural Network for Action Recognition. IEEE Access 7 : 117165-117175. ScholarBank@NUS Repository.
Rights: Attribution-NonCommercial-NoDerivatives 4.0 International
Abstract: In this paper, we propose a novel temporal spiking recurrent neural network (TSRNN) to perform robust action recognition in videos. The proposed TSRNN employs a novel spiking architecture that uses local discriminative features from high-confidence, reliable frames as spiking signals. Conventional CNN-RNNs for this task treat all frames as equally important, which makes them error-prone on noisy frames. The TSRNN solves this problem with a temporal pooling architecture that helps the RNN select sparse, reliable frames and enhances its capability to model long-range temporal information. In addition, a message passing bridge is added between the spiking signals and the recurrent unit. In this way, the spiking signals can guide the RNN in protecting its long-term memory across multiple frames from contamination by noisy frames with distracting factors (e.g., occlusion, rapid scene transitions). With these two novel components, TSRNN achieves competitive performance compared with state-of-the-art CNN-RNN architectures on two large-scale public benchmarks, UCF101 and HMDB51. © 2013 IEEE.
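The two components described in the abstract — temporal pooling that keeps only sparse, high-confidence frames, and a message passing bridge that lets those frames correct the recurrent memory — can be sketched roughly as follows. This is a hypothetical NumPy illustration under assumed design choices (the top-k confidence selection, the function names `select_reliable_frames` and `tsrnn_step`, and the sigmoid gating scheme are all assumptions), not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def select_reliable_frames(confidences, k):
    """Temporal pooling (sketch): keep the k highest-confidence frames."""
    idx = np.argsort(confidences)[-k:]
    return np.sort(idx)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def tsrnn_step(h, x, spike, Wx, Wh, Ws):
    """One recurrent step with a message-passing bridge (sketch).

    `spike` is the feature of the most recent reliable frame; a gate
    decides how strongly it corrects the candidate hidden state.
    """
    h_cand = np.tanh(Wx @ x + Wh @ h)       # ordinary recurrent update
    g = sigmoid(Ws @ spike)                 # bridge gate from spiking signal
    return g * spike + (1.0 - g) * h_cand   # spike-corrected memory

# Toy video: 10 frames with 4-d per-frame CNN features and confidences.
T, D = 10, 4
feats = rng.standard_normal((T, D))
conf = rng.random(T)

reliable = select_reliable_frames(conf, k=3)   # sparse reliable frames

Wx, Wh, Ws = (rng.standard_normal((D, D)) * 0.1 for _ in range(3))
h = np.zeros(D)
for t in range(T):
    # Use the most recent reliable frame up to time t (fall back to frame 0).
    past = reliable[reliable <= t]
    spike = feats[past[-1]] if past.size else feats[0]
    h = tsrnn_step(h, feats[t], spike, Wx, Wh, Ws)

print(h.shape)
```

The key design idea this sketch tries to capture is that noisy frames still pass through the ordinary recurrent update, but the gated bridge repeatedly pulls the hidden state back toward features from frames the model trusts.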
Source Title: IEEE Access
ISSN: 21693536
DOI: 10.1109/ACCESS.2019.2936604
Appears in Collections:Staff Publications

Files in This Item:
File: 10_1109_ACCESS_2019_2936604.pdf (2.06 MB, Adobe PDF)
This item is licensed under a Creative Commons License.