Please use this identifier to cite or link to this item: https://scholarbank.nus.edu.sg/handle/10635/43157
DC Field: Value
dc.title: Robust identification of gradual shot-transition types
dc.contributor.author: Mittal, A.
dc.contributor.author: Cheong, L.-F.
dc.contributor.author: Tung Sing, L.
dc.date.accessioned: 2013-07-23T09:26:33Z
dc.date.available: 2013-07-23T09:26:33Z
dc.date.issued: 2002
dc.identifier.citation: Mittal, A., Cheong, L.-F., Tung Sing, L. (2002). Robust identification of gradual shot-transition types. IEEE International Conference on Image Processing 2: II/413-II/416. ScholarBank@NUS Repository.
dc.identifier.uri: http://scholarbank.nus.edu.sg/handle/10635/43157
dc.description.abstract: Many video-related applications require identification of shot transitions and their types. Several algorithms exist for detecting both abrupt and gradual transitions; however, there is no integrated, robust framework for identifying transition types accurately. We show in this paper that simply combining the individual works of different researchers for identifying different transition types yields poor performance. We present an approach that applies several algorithms to patterns of effective average gradient, double chromatic difference, etc., so that it is robust to false detections and false alarms. A comparison with the standard approaches over 3 hours of experimental data on non-trivial video classes such as commercials, MTV, and sports shows the superiority of our approach.
dc.source: Scopus
dc.type: Conference Paper
dc.contributor.department: COMPUTER SCIENCE
dc.contributor.department: ELECTRICAL & COMPUTER ENGINEERING
dc.description.sourcetitle: IEEE International Conference on Image Processing
dc.description.volume: 2
dc.description.page: II/413-II/416
dc.description.coden: 85QTA
dc.identifier.isiut: NOT_IN_WOS
Appears in Collections: Staff Publications

Files in This Item: There are no files associated with this item.


