Please use this identifier to cite or link to this item: https://doi.org/10.1007/s11548-013-0897-4
dc.title: Projection-based visual guidance for robot-aided RF needle insertion
dc.contributor.author: Wen, R.
dc.contributor.author: Chui, C.-K.
dc.contributor.author: Ong, S.-H.
dc.contributor.author: Lim, K.-B.
dc.contributor.author: Chang, S.K.-Y.
dc.date.accessioned: 2014-06-17T03:02:39Z
dc.date.available: 2014-06-17T03:02:39Z
dc.date.issued: 2013-11
dc.identifier.citation: Wen, R., Chui, C.-K., Ong, S.-H., Lim, K.-B., Chang, S.K.-Y. (2013-11). Projection-based visual guidance for robot-aided RF needle insertion. International Journal of Computer Assisted Radiology and Surgery 8 (6): 1015-1025. ScholarBank@NUS Repository. https://doi.org/10.1007/s11548-013-0897-4
dc.identifier.issn: 1861-6410
dc.identifier.uri: http://scholarbank.nus.edu.sg/handle/10635/57133
dc.description.abstract: Purpose: The use of projector-based augmented reality (AR) in surgery may enable surgeons to directly view anatomical models and surgical data from the patient's surface (skin). It has the advantages of a consistent viewing focus on the patient, an extended field of view and augmented interaction. This paper presents an AR guidance mechanism with a projector-camera system to provide the surgeon with direct visual feedback for supervision of robotic needle insertion in radiofrequency (RF) ablation treatment. Methods: The registration of target organ models to specific positions on the patient body is performed using a surface-matching algorithm and point-based registration. An algorithm based on the extended Kalman filter and spatial transformation is used to intraoperatively compute the virtual needle's depth in the patient's body for AR display. Results: Experiments of this AR system on a mannequin were conducted to evaluate AR visualization and accuracy of virtual RF needle insertion. The average accuracy of 1.86 mm for virtual needle insertion met the clinical requirement of 2 mm or better. The feasibility of augmented interaction with a surgical robot using the proposed open AR interface with active visual feedback was demonstrated. Conclusions: The experimental results demonstrate that this guidance system is effective in assisting a surgeon to perform a robot-assisted radiofrequency ablation procedure. The novelty of the work lies in establishing a navigational procedure for percutaneous surgical augmented intervention integrating a projection-based AR guidance and robotic implementation for surgical needle insertion. © 2013 CARS.
dc.description.uri: http://libproxy1.nus.edu.sg/login?url=http://dx.doi.org/10.1007/s11548-013-0897-4
dc.source: Scopus
dc.subject: Augmented interaction
dc.subject: Augmented reality
dc.subject: Image-guided surgery
dc.subject: Projector-camera system
dc.subject: Radiofrequency ablation
dc.subject: Visual guidance
dc.type: Article
dc.contributor.department: MECHANICAL ENGINEERING
dc.contributor.department: ELECTRICAL & COMPUTER ENGINEERING
dc.description.doi: 10.1007/s11548-013-0897-4
dc.description.sourcetitle: International Journal of Computer Assisted Radiology and Surgery
dc.description.volume: 8
dc.description.issue: 6
dc.description.page: 1015-1025
dc.identifier.isiut: 000326455900015
Appears in Collections: Staff Publications

Files in This Item:
There are no files associated with this item.
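
Note: the Methods part of the abstract refers to a surface-matching algorithm and point-based registration for aligning the target organ models with the patient. The paper's own implementation is not available in this record; the sketch below is a minimal, generic illustration of point-based rigid registration (the standard Kabsch/Horn SVD solution), assuming the fiducial points are already paired. Function and variable names are illustrative and do not come from the paper.

import numpy as np

def rigid_registration(model_pts, patient_pts):
    """Least-squares rigid transform (rotation R, translation t) mapping
    model_pts onto patient_pts, given paired Nx3 fiducial coordinates.
    Generic Kabsch/Horn SVD solution, not the paper's specific algorithm."""
    model_pts = np.asarray(model_pts, dtype=float)
    patient_pts = np.asarray(patient_pts, dtype=float)

    # Centre both point sets on their centroids.
    mc = model_pts.mean(axis=0)
    pc = patient_pts.mean(axis=0)
    A = model_pts - mc
    B = patient_pts - pc

    # SVD of the cross-covariance matrix gives the optimal rotation.
    U, _, Vt = np.linalg.svd(A.T @ B)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = pc - R @ mc
    return R, t

def registration_rmse(R, t, model_pts, patient_pts):
    """Root-mean-square fiducial registration error, in the input units (e.g. mm)."""
    mapped = np.asarray(model_pts) @ R.T + t
    diff = mapped - np.asarray(patient_pts)
    return float(np.sqrt(np.mean(np.sum(diff ** 2, axis=1))))

With paired landmarks measured by a projector-camera system on the patient's skin and the corresponding points on the preoperative organ/skin model, R and t would map model coordinates into the patient workspace; the RMS error is the kind of figure a system of this type would compare against the 2 mm clinical requirement cited in the abstract.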
