Title: Experiential sampling based foreground/background segmentation for video surveillance
Source: Atrey, P.K., Kumar, V., Kumar, A., Kankanhalli, M.S. (2006). Experiential sampling based foreground/background segmentation for video surveillance. 2006 IEEE International Conference on Multimedia and Expo, ICME 2006 - Proceedings 2006: 1809-1812. ScholarBank@NUS Repository. https://doi.org/10.1109/ICME.2006.262904
Abstract: Segmentation of foreground and background is an important research problem arising in many applications, including video surveillance. A commonly used segmentation method is "background subtraction": thresholding the difference between the estimated background image and the current image. Adaptive Gaussian-mixture-based background modelling has been proposed by many researchers to increase robustness against environmental changes. However, all these methods are computationally intensive and need to be optimized for efficient, real-time performance, especially at higher image resolutions. In this paper, we propose an improved foreground/background segmentation method that uses the experiential sampling technique to restrict the computational effort to the region of interest. We exploit the fact that the region of interest generally occupies only a small part of the image; attention therefore needs to be focused only on those regions. The proposed method shows a significant gain in processing speed at the expense of a minor loss in accuracy. We provide experimental results and detailed analysis to show the utility of our method. © 2006 IEEE.
Source Title: 2006 IEEE International Conference on Multimedia and Expo, ICME 2006 - Proceedings
Appears in Collections: Staff Publications
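The baseline technique the abstract refers to, background subtraction, can be sketched in a few lines. The snippet below is an illustrative minimal version, not the authors' experiential-sampling algorithm: it thresholds the frame-to-background difference, uses a simple running-average background update in place of the adaptive Gaussian mixture model mentioned in the abstract, and accepts an optional mask to restrict computation to a region of interest, loosely echoing the paper's idea of focusing attention on a small part of the image. The function names and parameters are hypothetical.

```python
import numpy as np

def segment_foreground(frame, background, threshold=30.0, roi_mask=None):
    """Threshold the absolute difference between the current frame and the
    estimated background; pixels whose difference exceeds `threshold` are
    labelled foreground. If `roi_mask` is given, only pixels inside the
    region of interest are considered (illustrative sketch only)."""
    diff = np.abs(frame.astype(np.float64) - background.astype(np.float64))
    foreground = diff > threshold
    if roi_mask is not None:
        # Restrict attention to the region of interest.
        foreground &= roi_mask.astype(bool)
    return foreground

def update_background(background, frame, alpha=0.05):
    """Simple adaptive running-average background update; a stand-in for
    the per-pixel Gaussian mixture models discussed in the abstract."""
    return (1.0 - alpha) * background + alpha * frame.astype(np.float64)
```

Restricting the thresholding and model update to the masked region is what yields the speed-up the abstract describes: the per-pixel cost is paid only where foreground activity is expected, at the risk of missing objects that appear outside the attended region.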
Files in This Item:
There are no files associated with this item.
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.