Robust Body Shape Correspondence with Anthropometric Landmarks - 22.17

Y. Jiao et al., "Robust Body Shape Correspondence with Anthropometric Landmarks", Proc. of 3DBODY.TECH 2022 - 13th Int. Conf. and Exh. on 3D Body Scanning and Processing Technologies, Lugano, Switzerland, 25-26 Oct. 2022, #17, https://doi.org/10.15221/22.17.

Title:

Robust Body Shape Correspondence with Anthropometric Landmarks

Authors:

Yibo JIAO 1, Chang SHU 2, Dinesh K. PAI 1

1 University of British Columbia, Vancouver BC, Canada;
2 National Research Council Canada, Canada

Abstract:

We propose a method to improve the robustness of state-of-the-art learning-based methods for finding point-to-point correspondences of 3D human models with anthropometric landmarks. Current deep learning-based methods generally focus on intrinsic, local properties of body shapes and lack extrinsic, global information. As a result, these methods are challenged by matching ambiguities, for instance those caused by the bilateral symmetry of human body shapes. We demonstrate our method with an unsupervised learning-based method, DeepShells. Our work introduces a landmark supervision method based on the Shells by adding linear soft constraints to minimize this problem, which we term the "intrinsic feature ambiguity problem." To that end, we derive a simple but efficient pipeline that better distinguishes self-similarities while maintaining similar overall matching quality.

Keywords:

shape matching, deep learning, anthropometry

Full paper:

PDF

Details:

Proceedings: 3DBODY.TECH 2022, 25-26 Oct. 2022, Lugano, Switzerland
Paper id#: 17
DOI: 10.15221/22.17

License/Copyright notice

Proceedings: © Hometrica Consulting - Dr. Nicola D'Apuzzo, Switzerland, hometrica.ch.
Authors retain all rights to individual papers, which are licensed under a Creative Commons Attribution 4.0 International License permitting unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
The papers appearing in the proceedings reflect the authors' opinions. Their inclusion in the proceedings does not necessarily constitute endorsement by the editor or by the publisher.
