Please use this identifier to cite or link to this item: https://scholarbank.nus.edu.sg/handle/10635/19245
DC Field: Value
dc.title: Facial expression animation based on conversational text
dc.contributor.author: HELGA MAZYAR
dc.date.accessioned: 2011-02-16T18:00:36Z
dc.date.available: 2011-02-16T18:00:36Z
dc.date.issued: 2009-05-29
dc.identifier.citation: HELGA MAZYAR (2009-05-29). Facial expression animation based on conversational text. ScholarBank@NUS Repository.
dc.identifier.uri: http://scholarbank.nus.edu.sg/handle/10635/19245
dc.description.abstract: Real-time expressive communication is important because it provides some of the visual cues that are present in face-to-face interaction but absent from text-based communication. In this Master's thesis report, we propose a new text-to-facial-expression (T2FE) system capable of real-time expressive communication based on short text. This text is the kind of conversational, informal text commonly used by users of online messaging systems. The system contains two main components. The first is the text processing component, which analyzes text-based messages of the kind exchanged in typical online messaging systems to detect emotional sentences and identify the type of emotion each sentence conveys. The second is the animation component, which uses the detected emotional content to render the relevant facial expressions. These animated facial expressions are presented on a sample 3D face model as the output of the system. The proposed system differs from existing T2FE systems in that it uses fuzzy text classification, which enables rendering facial expressions for mixed emotions. To find out whether the rendered results are interesting and useful from the users' point of view, we conducted a user study, whose results are provided in this report. In this report, we first survey the main work done in the areas of text classification and facial expression synthesis; advantages and disadvantages of different techniques are presented in order to select the most suitable techniques for our T2FE system. The results of the two main components of the system, together with a discussion of those results, are provided separately in this report. The results of the user study are also presented; this study was conducted to estimate whether potential users of such a system find the rendered animations effective and useful.
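The abstract does not give implementation details of the thesis's classifier or face model, so the following is only a minimal sketch of the fuzzy-classification idea it describes: a sentence receives a membership degree for every emotion rather than a single label, and those degrees are used to blend per-emotion expression weights for a mixed-emotion face. The emotion set, keyword lexicon, and blendshape names below are hypothetical placeholders, not the author's data.

# Illustrative sketch only: emotion categories, lexicon, and blendshape weights are assumed.
EMOTIONS = ["happiness", "sadness", "anger", "surprise"]

# Toy lexicon: each keyword contributes evidence toward one emotion.
LEXICON = {
    "happy": "happiness", "great": "happiness", ":)": "happiness",
    "sad": "sadness", "miss": "sadness", ":(": "sadness",
    "angry": "anger", "hate": "anger",
    "wow": "surprise",
}

# Hypothetical blendshape weights for each pure emotion on a 3D face model.
EXPRESSIONS = {
    "happiness": {"mouth_smile": 1.0, "brow_raise": 0.2},
    "sadness":   {"mouth_frown": 0.8, "brow_inner_up": 0.6},
    "anger":     {"brow_lower": 0.9, "mouth_press": 0.5},
    "surprise":  {"brow_raise": 1.0, "jaw_open": 0.7},
}

def fuzzy_classify(sentence):
    """Return a membership degree in [0, 1] for every emotion."""
    counts = {e: 0 for e in EMOTIONS}
    for token in sentence.lower().split():
        if token in LEXICON:
            counts[LEXICON[token]] += 1
    total = sum(counts.values())
    if total == 0:
        return {e: 0.0 for e in EMOTIONS}  # neutral sentence
    return {e: counts[e] / total for e in EMOTIONS}

def blend_expression(memberships):
    """Mix per-emotion blendshape weights in proportion to the fuzzy memberships."""
    blended = {}
    for emotion, degree in memberships.items():
        for shape, weight in EXPRESSIONS[emotion].items():
            blended[shape] = blended.get(shape, 0.0) + degree * weight
    return blended

if __name__ == "__main__":
    msg = "wow i am so happy but i miss you :("
    m = fuzzy_classify(msg)
    print(m)                    # mixed emotions: happiness, sadness and surprise all non-zero
    print(blend_expression(m))  # blendshape weights that could drive the 3D face model

Because the memberships are continuous rather than a single hard label, two or more emotions can contribute to the final expression at once, which is the property the thesis uses to render mixed emotions.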
dc.language.iso: en
dc.subject: Computer animation, Text processing, Emotion recognition, Human computer interaction
dc.type: Thesis
dc.contributor.department: COMPUTER SCIENCE
dc.contributor.supervisor: SIM MONG CHENG, TERENCE
dc.description.degree: Master's
dc.description.degreeconferred: MASTER OF SCIENCE
dc.identifier.isiut: NOT_IN_WOS
Appears in Collections: Master's Theses (Open)

Files in This Item:
File: Helga_thesis.pdf
Size: 1.12 MB
Format: Adobe PDF
Access Settings: OPEN
Version: None
