Please use this identifier to cite or link to this item: https://scholarbank.nus.edu.sg/handle/10635/78943
Title: Affect Analysis in Video
Authors: XIANG XIAOHONG
Keywords: affect representation and modeling, home video presentation, subtle facial expression, social photo sharing
Issue Date: 24-Mar-2014
Citation: XIANG XIAOHONG (2014-03-24). Affect Analysis in Video. ScholarBank@NUS Repository.
Abstract: In this thesis, we first propose a computational framework that maps affective video content to categorical emotional states and provides a computational measure of the intensity of those states. Specifically, the framework uses a sparse vector representation: the intensity of an emotion is computed from the values of the sparse vector, and the modeling of affective video content addresses the problem of obtaining representative sparse vectors from the low-level features extracted from video. We then propose an approach that employs affective analysis to automatically create video presentations from home videos. Our method adaptively creates presentations for family, acquaintances and outsiders based on three properties: emotional tone, local main character and global main character. Beyond the adaptive presentation of home videos, this thesis also combines affective analysis (facial expression cues), eye-gaze data and previous emotional states in an online multimodal approach for estimating subtle facial expressions. Finally, the thesis applies affective analysis to a novel approach for sharing home photos based on aesthetic, affective and social features.
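As a rough illustration of the sparse-vector idea described in the abstract, the sketch below codes a low-level feature vector over a dictionary whose atoms are grouped by emotion category, predicts the category with the largest coefficient energy, and reports that energy as an intensity score. This is a minimal sketch under assumed details: the dictionary grouping, the ISTA solver, and all names and shapes are illustrative assumptions, not the representation or model actually developed in the thesis.

```python
import numpy as np

def sparse_code(feature, dictionary, lam=0.1, n_iter=200):
    """Solve min_x 0.5*||feature - D x||^2 + lam*||x||_1 with plain ISTA (illustrative choice)."""
    D = dictionary
    x = np.zeros(D.shape[1])
    step = 1.0 / np.linalg.norm(D, ord=2) ** 2  # 1 / Lipschitz constant of the gradient
    for _ in range(n_iter):
        grad = D.T @ (D @ x - feature)
        x = x - step * grad
        x = np.sign(x) * np.maximum(np.abs(x) - step * lam, 0.0)  # soft threshold
    return x

def emotion_and_intensity(x, groups):
    """Pick the emotion whose dictionary atoms carry the most coefficient energy."""
    energies = {label: np.linalg.norm(x[idx]) for label, idx in groups.items()}
    label = max(energies, key=energies.get)
    return label, energies[label]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    D = rng.standard_normal((64, 30))            # 10 hypothetical atoms per emotion category
    D /= np.linalg.norm(D, axis=0)
    groups = {"joy": np.arange(0, 10), "sadness": np.arange(10, 20), "fear": np.arange(20, 30)}
    feature = 2.0 * D[:, 3] + 0.05 * rng.standard_normal(64)  # feature dominated by a "joy" atom
    x = sparse_code(feature, D)
    print(emotion_and_intensity(x, groups))       # e.g. ('joy', <intensity score>)
```

The design choice illustrated here is that category prediction and intensity come from the same sparse vector: the grouped coefficient magnitudes both select the emotional state and quantify how strongly it is expressed.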
URI: http://scholarbank.nus.edu.sg/handle/10635/78943
Appears in Collections:Ph.D Theses (Open)

Files in This Item:
File: XiangXX.pdf | Size: 14.83 MB | Format: Adobe PDF | Access Settings: OPEN | Version: None

