Please use this identifier to cite or link to this item: https://scholarbank.nus.edu.sg/handle/10635/248143
Title: NON-PARAMETRIC 3D HAND SHAPE RECONSTRUCTION FROM MONOCULAR IMAGE
Authors: YU ZIWEI
ORCID iD: orcid.org/0009-0003-6415-0251
Keywords: 3D Hand Pose Estimation, Hand Shape Reconstruction, Hand Object Reconstruction, Self-Supervised
Issue Date: 28-Sep-2023
Citation: YU ZIWEI (2023-09-28). NON-PARAMETRIC 3D HAND SHAPE RECONSTRUCTION FROM MONOCULAR IMAGE. ScholarBank@NUS Repository.
Abstract: This thesis focuses on 3D hand shape estimation from monocular RGB input, a task central to human-computer interaction and AR/VR. Advances in deep learning have driven progress in this field, with most methods relying on parametric hand models such as MANO. However, such models diverge from real hand surfaces and demand extensive annotations. To address this, we explore non-parametric methods based on point cloud and UV map representations, and propose a novel UV-map-based pipeline that improves reconstruction accuracy while accounting for realistic hand-object interactions. Template-based hand-object methods perform well but require predefined object models; we therefore propose a category-level 3D hand-object reconstruction framework that incorporates shape priors to generalize to unseen objects. Previous approaches fall into either the parametric or the non-parametric category, each with its own limitations. This thesis integrates both for accurate reconstruction and introduces a VAE-based correction module. In addition, a weakly-supervised pipeline that leverages only 3D joint labels further advances 3D hand shape estimation.
URI: https://scholarbank.nus.edu.sg/handle/10635/248143
Appears in Collections:Ph.D Theses (Open)

Files in This Item:
File: Yu Ziwei.pdf | Size: 36.93 MB | Format: Adobe PDF | Access Settings: Open | Version: None

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.