Please use this identifier to cite or link to this item: https://doi.org/10.1007/978-3-540-70517-8_17
Title: A two-step approach for detecting individuals within dense crowds
Authors: Sim, C.-H.
Rajmadhan, E.
Ranganath, S. 
Issue Date: 2008
Citation: Sim, C.-H., Rajmadhan, E., Ranganath, S. (2008). A two-step approach for detecting individuals within dense crowds. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) 5098 LNCS : 166-174. ScholarBank@NUS Repository. https://doi.org/10.1007/978-3-540-70517-8_17
Abstract: This paper proposes a two-step approach for detecting individuals within dense crowds. The first step applies an offline-trained Viola-type head detector to still color images of dense crowds against cluttered backgrounds. The second step, which aims to reduce the false alarm rate at the same detection rate, constructs color bin images from normalized rg color histograms of the windows detected in the first step. Haar-like features extracted from these color bin images are input to a trained cascade of boosted classifiers that separates correct detections from false alarms. Experimental results for both steps are presented as Receiver Operating Characteristic (ROC) curves and compared with recent related work. The proposed two-step approach attains a high detection rate of 90.0% while keeping the false alarm rate below 40.0%, whereas comparable work incurs a 70.0% false alarm rate at detection rates still below 90.0%. © 2008 Springer-Verlag.
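The pipeline in the abstract can be sketched roughly as follows. This is a minimal illustration using OpenCV and NumPy, not the authors' implementation: the cascade files "head_cascade.xml" and "color_bin_cascade.xml" are hypothetical placeholders, and the 32-bin histogram size, detectMultiScale parameters, and the verifier decision rule are assumptions rather than the paper's published settings.

import cv2
import numpy as np

def color_bin_image(window_bgr, bins=32):
    # Build the "color bin image": a 2-D histogram over normalized
    # rg chromaticities, r = R/(R+G+B) and g = G/(R+G+B), of the
    # pixels inside a detected window (the paper's second step).
    bgr = window_bgr.astype(np.float64)
    total = bgr.sum(axis=2) + 1e-9              # guard against all-black pixels
    r = (bgr[:, :, 2] / total).ravel()          # OpenCV stores BGR, so index 2 is R
    g = (bgr[:, :, 1] / total).ravel()
    hist, _, _ = np.histogram2d(r, g, bins=bins, range=[[0.0, 1.0], [0.0, 1.0]])
    # Rescale to 8 bits so Haar-like features can be computed on the
    # histogram exactly as on a grayscale image.
    return (255.0 * hist / max(hist.max(), 1.0)).astype(np.uint8)

# Step 1: offline-trained Viola-type head detector on a still color image.
head_cascade = cv2.CascadeClassifier("head_cascade.xml")   # hypothetical model file
image = cv2.imread("crowd.jpg")
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
candidates = head_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=3)

# Step 2: a second boosted cascade, trained on color bin images of true
# and false detections, filters the step-1 windows to cut false alarms.
verifier = cv2.CascadeClassifier("color_bin_cascade.xml")  # hypothetical model file
accepted = []
for (x, y, w, h) in candidates:
    bin_img = color_bin_image(image[y:y + h, x:x + w])
    # Treat any firing of the verifier cascade on the bin image as a
    # "correct detection"; the paper's exact decision rule may differ.
    if len(verifier.detectMultiScale(bin_img, minNeighbors=1)) > 0:
        accepted.append((x, y, w, h))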
Source Title: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
URI: http://scholarbank.nus.edu.sg/handle/10635/69115
ISBN: 3-540-70516-3
ISSN: 0302-9743
DOI: 10.1007/978-3-540-70517-8_17
Appears in Collections: Staff Publications
