3D model reconstruction from a single image has made great progress with recent deep generative models. However, conventional reconstruction approaches based on template mesh deformation or implicit fields have difficulty reconstructing non-watertight 3D mesh models, such as garments. In contrast to image-based modeling, a sketch-based approach can help users generate 3D models that meet the design intentions of hand-drawn sketches. In this study, we propose Sketch2Cloth, a sketch-based 3D garment generation system that uses unsigned distance fields estimated from the user's sketch input. Sketch2Cloth first estimates the unsigned distance function of the target 3D model from the sketch input, and then extracts a mesh from the estimated field with Marching Cubes. We also provide a model editing function to modify the generated mesh. We verified Sketch2Cloth with quantitative evaluations of garment generation and editing, in comparison with a state-of-the-art approach.
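The mesh-extraction step described above can be sketched as follows. This is a minimal, hypothetical illustration, not the paper's implementation: since an unsigned distance field (UDF) is non-negative everywhere and has no zero crossing, one common workaround is to run Marching Cubes at a small positive iso-level. Here an analytic sphere shell stands in for the network-predicted field; the grid resolution and epsilon are arbitrary choices.

```python
import numpy as np
from skimage import measure

# Build a voxel grid over [-1, 1]^3 (resolution is an arbitrary choice).
res = 64
lin = np.linspace(-1.0, 1.0, res)
x, y, z = np.meshgrid(lin, lin, lin, indexing="ij")

# Toy unsigned distance field: distance to a sphere of radius 0.5.
# In Sketch2Cloth this field would instead be predicted from the sketch.
udf = np.abs(np.sqrt(x**2 + y**2 + z**2) - 0.5)

# Marching Cubes needs a level crossing; a UDF is >= 0 everywhere,
# so extract at a small epsilon (about one voxel) instead of level 0.
eps = 2.0 / res
verts, faces, normals, _ = measure.marching_cubes(udf, level=eps)

print(verts.shape[1], faces.shape[1])  # triangle mesh: 3D vertices, 3 indices per face
```

Note that extracting at a positive epsilon yields a thin two-sided shell around the true surface, which is one way non-watertight geometry such as garments can be represented by an iso-surface extractor designed for closed surfaces.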