Please use this identifier to cite or link to this item:
https://doi.org/10.1109/TIP.2011.2180916
Title: Semantic-gap-oriented active learning for multilabel image annotation
Authors: Tang, J.; Zha, Z.-J.; Tao, D.; Chua, T.-S.
Keywords: Active learning; image annotation; multilabel; semantic gap; sparse graph
Issue Date: 2012
Citation: Tang, J., Zha, Z.-J., Tao, D., Chua, T.-S. (2012). Semantic-gap-oriented active learning for multilabel image annotation. IEEE Transactions on Image Processing 21 (4): 2354-2360. ScholarBank@NUS Repository. https://doi.org/10.1109/TIP.2011.2180916
Abstract: User interaction is an effective way to handle the semantic gap problem in image annotation. To minimize user effort in these interactions, many active learning methods have been proposed. These methods treat the semantic concepts either individually or correlatively, but they neglect the key motivation of user feedback: tackling the semantic gap. The size of the semantic gap of each concept is an important factor that affects the effectiveness of user feedback: users should devote more effort to concepts with large semantic gaps, and less to those with small gaps. In this paper, we propose a semantic-gap-oriented active learning method that incorporates a semantic gap measure into the information-minimization-based sample selection strategy. The basic learning model used in the active learning framework is an extended multilabel version of the sparse-graph-based semisupervised learning method that incorporates semantic correlation. Extensive experiments conducted on two benchmark image data sets demonstrate the importance of bringing the semantic gap measure into the active learning process. © 2011 IEEE.
Source Title: IEEE Transactions on Image Processing
URI: http://scholarbank.nus.edu.sg/handle/10635/39557
ISSN: 1057-7149
DOI: 10.1109/TIP.2011.2180916
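Note: the abstract describes weighting sample selection by a per-concept semantic gap measure. The snippet below is a minimal illustrative sketch of that general idea, not the authors' actual formulation: it assumes hypothetical helpers (`entropy`, `select_pairs`), uses binary entropy as the informativeness measure, and takes per-concept gap scores as given inputs rather than estimating them as in the paper.

```python
# Illustrative sketch only (NOT the method from the paper): gap-weighted
# uncertainty sampling for multilabel active learning. Concepts with a larger
# assumed semantic-gap score receive proportionally more of the query budget.
import numpy as np

def entropy(p, eps=1e-12):
    """Binary entropy of predicted relevance probabilities p in (0, 1)."""
    p = np.clip(p, eps, 1.0 - eps)
    return -(p * np.log(p) + (1.0 - p) * np.log(1.0 - p))

def select_pairs(pred_probs, gap_scores, budget):
    """Pick (image, concept) pairs to present for user feedback.

    pred_probs : (n_images, n_concepts) relevance probabilities from the base
                 learner (e.g. a graph-based semisupervised model).
    gap_scores : (n_concepts,) nonnegative per-concept semantic-gap estimates.
    budget     : number of pairs the user is willing to label.
    """
    informativeness = entropy(pred_probs)              # uncertainty per pair
    weighted = informativeness * gap_scores[None, :]   # gap-weighted score
    top = np.argsort(weighted, axis=None)[::-1][:budget]
    return np.column_stack(np.unravel_index(top, weighted.shape))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    probs = rng.uniform(size=(100, 5))               # toy predictions, 5 concepts
    gaps = np.array([0.9, 0.2, 0.5, 0.7, 0.1])       # toy per-concept gap scores
    print(select_pairs(probs, gaps, budget=10))      # rows of (image, concept)
```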
Appears in Collections: Staff Publications
Files in This Item:
There are no files associated with this item.