Title: LUI: Lip in multimodal mobile GUI interaction
Authors: Azh, M.
Zhao, S. 
Keywords: Gesture input
Multimodal mobile interaction
Issue Date: 2012
Citation: Azh, M., & Zhao, S. (2012). LUI: Lip in multimodal mobile GUI interaction. ICMI'12 - Proceedings of the ACM International Conference on Multimodal Interaction: 551-553. ScholarBank@NUS Repository.
Abstract: Gesture-based interactions are commonly used in mobile and ubiquitous environments. Multimodal interaction techniques use lip gestures to enhance speech recognition or control mouse movement on the screen. In this paper we extend the previous work to explore LUI: lip gestures as an alternative input technique for controlling user interface elements in a ubiquitous environment. In addition to using lips to control cursor movement, we use lip gestures to control music players and activate menus. A LUI Motion-Action library is also provided to guide future interaction design using lip gestures. Copyright 2012 ACM.
Source Title: ICMI'12 - Proceedings of the ACM International Conference on Multimodal Interaction
ISBN: 9781450314671
DOI: 10.1145/2388676.2388792
Appears in Collections:Staff Publications

Files in This Item:
There are no files associated with this item.