Title: LUI: Lip in multimodal mobile GUI interaction
Keywords: Multimodal mobile interaction
Citation: Azh, M., Zhao, S. (2012). LUI: Lip in multimodal mobile GUI interaction. ICMI'12 - Proceedings of the ACM International Conference on Multimodal Interaction: 551-553. ScholarBank@NUS Repository. https://doi.org/10.1145/2388676.2388792
Abstract: Gesture-based interactions are commonly used in mobile and ubiquitous environments. Multimodal interaction techniques use lip gestures to enhance speech recognition or to control mouse movement on the screen. In this paper we extend previous work to explore LUI: lip gestures as an alternative input technique for controlling user interface elements in a ubiquitous environment. In addition to using the lips to control cursor movement, we use lip gestures to control music players and activate menus. A LUI Motion-Action library is also provided to guide future interaction design using lip gestures. Copyright 2012 ACM.
Source Title: ICMI'12 - Proceedings of the ACM International Conference on Multimodal Interaction
Appears in Collections: Staff Publications