Please use this identifier to cite or link to this item: http://scholarbank.nus.edu.sg/handle/10635/13454
Title: Automated human activity recognition in smart room
Authors: HENRY TAN@TAN CHIN CHYE
Keywords: Human activity recognition, human head tracking, spatial temporal pattern recognition, digital color image sequence analysis, Elman, NN & HMM hybrids
Issue Date: 29-Dec-2003
Source: HENRY TAN@TAN CHIN CHYE (2003-12-29). Automated human activity recognition in smart room. ScholarBank@NUS Repository.
Abstract: Traditional approaches to human activity recognition are statistical pattern recognition techniques, e.g. the Nearest Neighbor Rule (NNR), and state-space methods, e.g. the Hidden Markov Model (HMM). We propose three novel connectionist-based approaches: the Elman Network (EN), and two hybrids of Neural Network (NN) and HMM, i.e. HMM-NN and NN-HMM, to recognize ten distinct human activities in a smart-room environment. A three-level framework is also suggested: it first detects and verifies the presence of a person, then tracks the subject's head movement over consecutive frames to extract the difference in coordinates as a feature vector invariant to the person's sex, race and physique, and finally classifies the activities performed using the three proposed classifiers. Comparisons of recognition accuracy and time complexity show that all three proposed classifiers perform more robustly than the conventional methods, demonstrating their potential for recognizing continuous and complex activities in increasingly popular human-activity-based applications.
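The feature extraction described in the abstract can be illustrated with a minimal sketch. This is not the thesis code: the function name and the sample track are assumptions for illustration only. It computes the per-frame differences in the tracked head's (x, y) coordinates, which depend only on relative motion, not absolute position, and are therefore invariant to the subject's physique as the abstract claims.

```python
# Illustrative sketch (hypothetical helper, not the author's implementation):
# turn a sequence of tracked head centroids into the motion feature vector
# described in the abstract -- coordinate differences between consecutive frames.

def head_motion_features(head_positions):
    """head_positions: list of (x, y) head centroids, one per frame.

    Returns one (dx, dy) pair per consecutive frame pair.
    """
    return [
        (x2 - x1, y2 - y1)
        for (x1, y1), (x2, y2) in zip(head_positions, head_positions[1:])
    ]

# Example: a head moving steadily right and slightly downward.
track = [(100, 50), (104, 51), (108, 52)]
print(head_motion_features(track))  # [(4, 1), (4, 1)]
```

Such a sequence of (dx, dy) vectors is the kind of spatio-temporal input that sequence classifiers like the Elman Network or an HMM hybrid can consume frame by frame.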
URI: http://scholarbank.nus.edu.sg/handle/10635/13454
Appears in Collections:Master's Theses (Open)

Files in This Item:
HenryTan_MEng_ECE_2003.pdf (5.49 MB, Adobe PDF, open access)

Page view(s): 266 (checked on Dec 11, 2017)
Download(s): 181 (checked on Dec 11, 2017)


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.