Why Does the Apparel Use Case Need a Different Approach to the Rig/Weight Systems? - 25.12

Title:

Why Does the Apparel Use Case Need a Different Approach to the Rig/Weight Systems?

Authors:

Katy SCHILDMEYER 1, Carol MCDONALD 2, Amelia SCHILDMEYER DC 3

1 Design Cycle Solutions, New York NY, USA;
2 Gneiss Concept, Washougal WA, USA;
3 Somaticpdx, Portland OR, USA

Keywords:

3D body scanning, rig systems, weighting, posture descriptors

Abstract:

Current rig systems were developed for use in games, film, and virtual environments, and are based on 2D images. The use of these rig systems contributes to fashion waste. The rig systems examined are focused on art and speed for imaginary environments, and unfortunately do not serve the design use cases of apparel. Current rig systems are useful for posing humanoids; however, overlooking the natural posture, shape, mass, movement, and curvature of the humanoid causes mathematical fidelity to suffer, degrades the quality of digital garment displays, and therefore reduces the fit accuracy of the modeled coveroids (garments or footwear). This is essential for humanoids based on body data of an actual human obtained by 3D body scanning or data input. Utilizing the PARCS posture descriptors (placement, alignment, rotation, curvature, and symmetry) to accurately describe the rig obtained for a humanoid allows for a better understanding of the differences between the modeled rig and the actual skeletal requirements. In the rig/weight system presented, rigs can be adjusted for gender and/or body mass differences, which is essential for intimate apparel design.
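The paper does not publish a schema for the PARCS descriptors, so the following is a hypothetical sketch of how per-joint posture descriptors of that kind might be recorded and compared; all field names, types, and the tolerance check are assumptions, not the authors' implementation.

```python
from dataclasses import dataclass

@dataclass
class PARCSDescriptor:
    """Hypothetical record of the five PARCS posture descriptors for one joint.
    Field types and units are assumptions; the paper does not publish a schema."""
    placement: tuple   # (x, y, z) joint position in body space
    alignment: tuple   # unit direction of the bone axis
    rotation: float    # axial rotation in degrees
    curvature: float   # local spine/limb curvature measure
    symmetry: float    # left/right deviation, 0.0 = symmetric

def asymmetry_flags(left: PARCSDescriptor, right: PARCSDescriptor, tol: float = 0.05):
    """Compare a paired (left/right) joint and flag descriptor fields whose
    difference exceeds tol -- the kind of check that could expose gaps between
    a stylized rig and the scanned body's actual posture."""
    flags = []
    if abs(left.rotation - right.rotation) > tol:
        flags.append("rotation")
    if abs(left.curvature - right.curvature) > tol:
        flags.append("curvature")
    return flags

# Example: a small left/right rotation mismatch is flagged.
l = PARCSDescriptor((0.0, 1.4, 0.0), (0.0, 0.0, 1.0), 5.0, 0.10, 0.0)
r = PARCSDescriptor((0.0, 1.4, 0.0), (0.0, 0.0, 1.0), 5.2, 0.10, 0.0)
flags = asymmetry_flags(l, r)
```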
After creating the rig for the humanoid, weighting of the vertices follows. Weighting is the technique of defining how much influence each joint has on each mesh vertex. It binds the mesh skin to the joints (rig) and keeps the mesh aligned with the joints as it deforms during movement. If the weighting restricts the number of vertices assigned to each joint, the subtlety of movement cannot be fully described. In the weighting system presented, weighting can be graded for body mass or body movement.
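The weighting described above is conventionally implemented as linear blend skinning: each skinned vertex position is the weight-blended sum of what each influencing joint's transform would do to it. The sketch below is a minimal 2D illustration of that standard technique, not the system presented in the paper; the joint names and values are invented for the example.

```python
import math

def rotate2d(p, angle, pivot):
    """Rotate point p around pivot by angle (radians)."""
    x, y = p[0] - pivot[0], p[1] - pivot[1]
    c, s = math.cos(angle), math.sin(angle)
    return (pivot[0] + c * x - s * y, pivot[1] + s * x + c * y)

def skin_vertex(rest_pos, joint_transforms, weights):
    """Linear blend skinning: the deformed vertex is the weighted sum of
    each joint's rigid transform applied to the rest position.
    Weights must sum to 1."""
    x = sum(w * t(rest_pos)[0] for t, w in zip(joint_transforms, weights))
    y = sum(w * t(rest_pos)[1] for t, w in zip(joint_transforms, weights))
    return (x, y)

# Two hypothetical joints: the shoulder stays fixed, the elbow at (1, 0)
# bends 90 degrees.
shoulder = lambda p: p
elbow = lambda p: rotate2d(p, math.pi / 2, (1.0, 0.0))

# A vertex near the elbow crease, influenced 50/50 by both joints, lands
# halfway between the two rigid results, smoothing the bend.
v = skin_vertex((1.5, 0.0), [shoulder, elbow], [0.5, 0.5])
```

Capping how many joints may influence a vertex (a common real-time optimization) forces weights like these toward one or two joints, which is exactly the loss of subtlety the abstract argues against for apparel-grade accuracy.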
Multi-person models use an average to configure and calculate folding variables and to form a skin (mesh). However, the modeled fabric of a coveroid needs to interact properly with both the humanoid and the coveroid. It is critical to be able to attach the humanoid's rig system to the coveroid for proper movement of the coveroid, in addition to setting up collision values that relate to the shape, mass, movement, and curvature of the humanoid.
Apparel needs a new system that works as an apparel engineering tool rather than a simulation tool, so that designers can reduce waste and deliver mathematically sound products. Appropriate programming for deep learning and AI is not possible without better data.

Full paper:

PDF

Presentation:

VIDEO will be available here in Q3.2026.
VIDEO available in proceedings (purchase order)

How to Cite (MLA):

K. Schildmeyer et al., "Why Does the Apparel Use Case Need a Different Approach to the Rig/Weight Systems?", Proceedings of 3DBODY.TECH 2025 - 16th International Conference and Expo on 3D/4D Body Scanning, Data and Processing Technologies, Lugano, Switzerland, 21-22 Oct. 2025, #12, https://doi.org/10.15221/25.12

Details:

Proceedings: 3DBODY.TECH 2025, 21-22 Oct. 2025, Lugano, Switzerland
Paper/Presentation: #12
DOI: https://doi.org/10.15221/25.12

License/Copyright notice

Proceedings: © Hometrica Consulting - Dr. Nicola D'Apuzzo, Switzerland, hometrica.ch.
Authors retain all rights to individual papers, which are licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
The papers appearing in the proceedings reflect the authors' opinions. Their inclusion in the proceedings does not necessarily constitute endorsement by the editor or by the publisher.
