Please use this identifier to cite or link to this item: https://doi.org/10.1109/CW.2006.6
Title: A framework for multiple-view product representation using augmented reality
Authors: Shen, Y. 
Ong, S.K. 
Nee, A.Y.C. 
Issue Date: 2006
Source: Shen, Y., Ong, S.K., Nee, A.Y.C. (2006). A framework for multiple-view product representation using augmented reality. 2006 International Conference on Cyberworlds, CW'06 : 157-164. ScholarBank@NUS Repository. https://doi.org/10.1109/CW.2006.6
Abstract: In this paper, an Augmented Reality (AR) collaborative design system to support multiple-view product representation in manufacturing is presented. This system can be used to enhance design discussions using real objects and natural communication behaviors, such as voice input and gestures, and also supports communication among a geographically dispersed multi-disciplinary team over a network in a concurrent engineering environment. In addition, during the life cycle development of a product, users performing different operations in this life cycle can view, through head-mounted displays (HMDs), different product information relevant to their tasks. This information, which is filtered using a client's interest list, i.e., the process information that is of the greatest interest to a client, and deposited in the users' respective databases, can be displayed as computer graphics, video, text, etc. Furthermore, users can freely make changes to the product model in this AR environment. © 2006 IEEE.
Source Title: 2006 International Conference on Cyberworlds, CW'06
URI: http://scholarbank.nus.edu.sg/handle/10635/73022
ISBN: 0769526713
DOI: 10.1109/CW.2006.6
Appears in Collections:Staff Publications

Files in This Item:
There are no files associated with this item.

Scopus™ Citations: 5 (checked on Dec 6, 2017)
Web of Science™ Citations: 2 (checked on Nov 21, 2017)
Page view(s): 18 (checked on Dec 10, 2017)


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.