|Field|Value|
|---|---|
|dc.title|Complex event detection via multi-source video attributes|
|dc.identifier.citation|Ma, Z., Yang, Y., Xu, Z., Yan, S., Sebe, N., Hauptmann, A.G. (2013). Complex event detection via multi-source video attributes. Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition: 2627-2633. ScholarBank@NUS Repository. https://doi.org/10.1109/CVPR.2013.339|
|dc.description.abstract|Complex events essentially comprise humans, scenes, objects and actions that can be summarized by visual attributes, so properly leveraging relevant attributes could help event detection. Many works have exploited attributes at the image level for various applications. However, image-level attributes may be insufficient for complex event detection in videos because of their limited capability to characterize the dynamic properties of video data. Hence, we propose to leverage attributes at the video level (termed video attributes in this work), i.e., the semantic labels of external videos are used as attributes. Compared to complex event videos, these external videos contain simple contents such as objects, scenes and actions, which are the basic elements of complex events. Specifically, building upon a correlation vector that relates the attributes to the complex event, we incorporate video attributes latently as extra informative cues into the event detector learned from complex event videos. Extensive experiments on a real-world large-scale dataset validate the efficacy of the proposed approach. © 2013 IEEE.|
|dc.contributor.department|ELECTRICAL & COMPUTER ENGINEERING|
|dc.description.sourcetitle|Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition|

Appears in Collections: Staff Publications
Files in This Item:
There are no files associated with this item.
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.