Please use this identifier to cite or link to this item: https://scholarbank.nus.edu.sg/handle/10635/231420
Title: A GAN-BASED HUMAN UV COORDINATES ESTIMATION
Authors: MAO HUIQI
Keywords: human uv estimation, GAN, virtual try-on, deep learning, artificial intelligence
Issue Date: 3-Mar-2022
Citation: MAO HUIQI (2022-03-03). A GAN-BASED HUMAN UV COORDINATES ESTIMATION. ScholarBank@NUS Repository.
Abstract: In image or video generation tasks that involve people, obtaining accurate representations of 3D human shape and appearance is crucial for efficient generation of modified content. Human UV coordinate estimation establishes correspondences between 3D human body surface representations and 2D texture pixels, and is widely used in image/video editing, augmented reality, and human-computer interaction. This thesis focuses on accurate human UV coordinate estimation for loose clothing. Unlike in 3D human datasets, where people usually wear tight clothing, loose clothing is very common in real-life scenarios and can be tricky to capture and generate with generative models. We propose a novel method to capture loose clothing and hair with temporally coherent UV coordinates. The method first uses a differentiable pipeline to learn the UV mapping between a sequence of RGB inputs and textures via UV coordinates; a GAN model is then trained to balance spatial quality and temporal stability. Our experiments show that the trained model generates high-quality UV coordinates for loose clothing and generalizes to new poses. The inferred UV coordinate sequences can be used to synthesize new looks and modified visual styles with a significantly reduced computational workload.
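The core idea the abstract describes is that each foreground pixel receives a UV coordinate pointing into a 2D texture, so appearance can be regenerated by a texture lookup. As a minimal illustration of that lookup (a sketch only, not the thesis pipeline; the function name and nearest-neighbor sampling are my assumptions), in NumPy:

```python
import numpy as np

def sample_texture(texture, uv):
    """Look up texture colors at given UV coordinates (nearest neighbor).

    texture: (H, W, 3) array of texel colors.
    uv: (..., 2) array of coordinates in [0, 1], where u indexes width
        and v indexes height.
    Returns an array of shape uv.shape[:-1] + (3,).
    """
    h, w = texture.shape[:2]
    # Map normalized UVs to integer texel indices, clamped to the valid range.
    x = np.clip(np.round(uv[..., 0] * (w - 1)).astype(int), 0, w - 1)
    y = np.clip(np.round(uv[..., 1] * (h - 1)).astype(int), 0, h - 1)
    return texture[y, x]

# Tiny example: a 2x2 texture; UV (0, 0) picks the top-left texel,
# UV (1, 1) picks the bottom-right texel.
tex = np.array([[[255, 0, 0], [0, 255, 0]],
                [[0, 0, 255], [255, 255, 255]]], dtype=np.uint8)
uv = np.array([[0.0, 0.0], [1.0, 1.0]])
colors = sample_texture(tex, uv)
```

Because the lookup is cheap compared to running a generative model per frame, editing the texture once and re-sampling it with the inferred per-frame UV coordinates is what enables restyling a whole sequence at low cost. A differentiable version of this step (e.g. bilinear rather than nearest-neighbor sampling) is what allows the UV mapping itself to be learned end to end.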
URI: https://scholarbank.nus.edu.sg/handle/10635/231420
Appears in Collections: Master's Theses (Open)

Files in This Item:
MaoHQ.pdf (17.44 MB, Adobe PDF, Access: OPEN, View/Download)


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.