Evaluation of 3D Registration Deep Learning Methods using Iterative Transformation Estimations - 20.31
D. Bojanic et al., "Evaluation of 3D Registration Deep Learning Methods using Iterative Transformation Estimations", Proc. of 3DBODY.TECH 2020 - 11th Int. Conf. and Exh. on 3D Body Scanning and Processing Technologies, Online/Virtual, 17-18 Nov. 2020, #31, https://doi.org/10.15221/20.31.
Title:
Evaluation of 3D Registration Deep Learning Methods using Iterative Transformation Estimations
Authors:
David BOJANIC 1, Kristijan BARTOL 1, Tomislav PETKOVIC 1, Nicola D'APUZZO 2, Tomislav PRIBANIC 1
1 University of Zagreb, Faculty of Electrical Engineering and Computing, Croatia;
2 Hometrica Consulting, Ascona, Switzerland
Abstract:
3D registration is the process of aligning multiple three-dimensional (3D) data structures (such as point clouds or meshes) and merging them into one consistent and seamless 3D data structure. In the context of 3D reconstruction, 3D human body scans from multiple views need to be registered into a single point cloud to create a seamless 3D representation.
Following current state-of-the-art deep learning approaches, we argue that an encoder-decoder approach, where the decoder part of the architecture uses a recursive layer that iteratively estimates the rigid transformation, should provide the best results. We adapt an approach created for the task of 3D segmentation, called RSNets, to the task of 3D registration and compare it to the current state-of-the-art algorithm PCRNet.
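The iterative estimation scheme described in the abstract can be illustrated with a minimal sketch: at each iteration a rigid transform is estimated between the (currently transformed) source and the target, the source is updated, and the incremental transforms are composed into a single pose. As an assumption for illustration only, the learned regressor used by PCRNet-style networks is replaced here by a closed-form Kabsch/Procrustes step on known correspondences; the paper's actual method uses a neural network and does not assume correspondences.

```python
import numpy as np

def estimate_rigid(src, tgt):
    """One-step rigid estimate (Kabsch/Procrustes) mapping src onto tgt.
    In a PCRNet-style network this role is played by a learned layer that
    regresses the transform from global features of the two point clouds."""
    sc, tc = src.mean(axis=0), tgt.mean(axis=0)
    H = (src - sc).T @ (tgt - tc)          # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T)) # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = tc - R @ sc
    return R, t

def iterative_register(src, tgt, n_iters=4):
    """Iteratively refine the pose: re-estimate on the transformed source
    and compose the increments into one accumulated rotation/translation."""
    R_acc, t_acc = np.eye(3), np.zeros(3)
    cur = src
    for _ in range(n_iters):
        R, t = estimate_rigid(cur, tgt)
        cur = cur @ R.T + t                # apply increment to source
        R_acc = R @ R_acc                  # compose: x -> R(R_acc x + t_acc) + t
        t_acc = R @ t_acc + t
    return R_acc, t_acc
```

The composition step is the key point: a recursive decoder outputs a small corrective transform per iteration, and the final pose is the product of all increments applied in order.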
Keywords:
3D computer vision, 3D registration, deep learning
Details:
Proceedings: 3DBODY.TECH 2020, 17-18 Nov. 2020, Online/Virtual
Paper id#: 31
DOI: 10.15221/20.31
License/Copyright notice
Proceedings: © Hometrica Consulting - Dr. Nicola D'Apuzzo, Switzerland, hometrica.ch.
Authors retain all rights to individual papers, which are licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
The papers appearing in the proceedings reflect the authors' opinions. Their inclusion in the proceedings does not necessarily constitute endorsement by the editor or by the publisher.