Please use this identifier to cite or link to this item: https://doi.org/10.1109/WACV.2016.7477616
Title: Toward correlating and solving abstract tasks using convolutional neural networks
Authors: Peng K.-C.; Chen T.
Issue Date: 2016
Publisher: Institute of Electrical and Electronics Engineers Inc.
Citation: Peng K.-C., Chen T. (2016). Toward correlating and solving abstract tasks using convolutional neural networks. 2016 IEEE Winter Conference on Applications of Computer Vision, WACV 2016 : 7477616. ScholarBank@NUS Repository. https://doi.org/10.1109/WACV.2016.7477616
Abstract: Most work using convolutional neural networks (CNNs) demonstrates their efficacy on standard object-recognition tasks, but not on abstract tasks such as emotion classification and memorability prediction, which are of increasing importance: as machines become more autonomous, semantic understanding becomes essential. To verify whether CNN-based methods are effective for abstract tasks, we select 8 different abstract tasks in computer vision and evaluate the performance of 5 different CNN-based training approaches on them. We show that CNN-based approaches outperform the state-of-the-art results on all 8 tasks. Furthermore, we show that concatenating CNN features learned from different tasks can enhance performance on each task. We also show that concatenating the CNN features learned from all the tasks under experiment does not perform best, in contrast to what previous works typically report. Using CNNs as a tool to correlate different tasks, we suggest which CNN features researchers should use for each task.
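The feature-concatenation idea in the abstract can be sketched as follows. This is a minimal illustration, not the paper's pipeline: the 4096-dimensional vectors below are hypothetical placeholders standing in for per-image CNN features (e.g., fully-connected-layer activations) extracted by networks trained on two different abstract tasks.

```python
import numpy as np

# Hypothetical per-image CNN features from networks trained on two different
# abstract tasks (random placeholders; real features would come from the
# networks' fully-connected layers).
rng = np.random.default_rng(0)
feat_task_a = rng.standard_normal((10, 4096))  # 10 images, task-A features
feat_task_b = rng.standard_normal((10, 4096))  # same 10 images, task-B features

# Concatenating the two feature vectors per image gives a joint representation,
# which the paper reports can improve performance on each individual task.
joint = np.concatenate([feat_task_a, feat_task_b], axis=1)
print(joint.shape)  # (10, 8192)
```

A classifier for the target task would then be trained on `joint` instead of a single task's features; per the abstract, the best choice is a task-specific subset of features, not the concatenation of features from all tasks.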
Source Title: 2016 IEEE Winter Conference on Applications of Computer Vision, WACV 2016
URI: http://scholarbank.nus.edu.sg/handle/10635/146068
ISBN: 9781509006410
DOI: 10.1109/WACV.2016.7477616
Appears in Collections:Staff Publications
