Digital Assessment of Anthropometric and Kinematic Parameters for the Individualization of Direct Human-Robot Collaborations - 16.171

D. Bonin et al., "Digital Assessment of Anthropometric and Kinematic Parameters for the Individualization of Direct Human-Robot Collaborations", in Proc. of 7th Int. Conf. on 3D Body Scanning Technologies, Lugano, Switzerland, 2016, pp. 171-181, https://doi.org/10.15221/16.171.

Title:

Digital Assessment of Anthropometric and Kinematic Parameters
for the Individualization of Direct Human-Robot Collaborations

Authors:

Dominik BONIN 1, Lukas STANKIEWICZ 2, Carsten THOMAS 3,
Jochen DEUSE 2, Bernd KUHLENKOETTER 3, Sascha WISCHNIEWSKI 1

1 Unit Human Factors, Ergonomics, Federal Institute for Occupational Safety and Health (BAuA),
Dortmund, Germany;
2 Institute of Production Systems, TU Dortmund University, Dortmund, Germany;
3 Chair of Production Systems, Ruhr-Universitaet Bochum, Bochum, Germany

Abstract:

For the human-centered design of ergonomic work systems, population-based anthropometric percentile data tables, e.g. ISO 7250, are usually used. Due to the recent trend towards complex, individualized and small-batch production, product and process variability is increasing. Thus, the need for robots that work in direct interaction with humans without separating safety devices (human-robot collaboration, HRC) is growing. This form of direct collaboration between humans and robots is a major challenge for human-centered and safe workplace design. Advances in sensor technology and data processing, as well as the interconnection with collaborative robots, generally enable a flexible adjustment of the robot's trajectory to the individual characteristics of the human. Yet, to enhance the individualization of direct human-robot collaborations, a more detailed knowledge of the anthropometric and kinematic profile of the employee would be beneficial. Manual anthropometric and kinematic measurements are time-consuming and expensive and therefore not suitable as a standard process. To overcome this issue, the presented research project focuses on the optimization of this process by using markerless motion capture. The processing and possible use of the captured data are shown using the example of a use case in which the human parameters are used for the virtual simulation and optimization of an HRC workplace. Afterwards, a custom-developed software tool for the Microsoft Kinect v2 sensor is presented for the digital assessment of anthropometric and kinematic human parameters. In the current implementation, the anthropometric parameters are captured from a static T-pose. To determine the kinematic profile, the employee successively performs pre-defined maximum range-of-motion movements for each joint and degree of freedom. The motions were designed in line with the neutral-zero method. The data is stored in a comma-separated value (CSV) file. For use with other systems, the values can be exported with calculated offsets relative to the standard T-pose. Further, preliminary results of a validation study for the digital assessment of the anthropometric parameters will be presented. The main objective of the presented work with markerless motion capture is to enhance the digital collection of individual anthropometric and kinematic data. In addition, the possibilities and constraints of using these digitally assessed parameters for customizable HRC workplace designs are examined.
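To illustrate the kind of processing the abstract describes, the following is a minimal sketch, not the authors' tool: it assumes skeleton joint positions have already been captured from a Kinect v2 in a static T-pose, derives a few segment lengths as anthropometric estimates, computes one range-of-motion angle in the spirit of the neutral-zero convention, and exports the values to a CSV file. All joint names, coordinates and file names are illustrative placeholders.

import csv
import math

# Hypothetical T-pose joint positions in metres (camera coordinate system).
T_POSE_JOINTS = {
    "ShoulderLeft": (-0.20, 1.40, 2.00),
    "ElbowLeft":    (-0.48, 1.40, 2.00),
    "WristLeft":    (-0.73, 1.40, 2.00),
    "HipLeft":      (-0.10, 0.95, 2.00),
    "KneeLeft":     (-0.10, 0.50, 2.00),
    "AnkleLeft":    (-0.10, 0.10, 2.00),
}

def distance(a, b):
    """Euclidean distance between two 3-D joint positions."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def segment_lengths(joints):
    """Derive body segment lengths (anthropometric estimates) from joint pairs."""
    pairs = {
        "upper_arm_left": ("ShoulderLeft", "ElbowLeft"),
        "forearm_left":   ("ElbowLeft", "WristLeft"),
        "thigh_left":     ("HipLeft", "KneeLeft"),
        "lower_leg_left": ("KneeLeft", "AnkleLeft"),
    }
    return {name: distance(joints[a], joints[b]) for name, (a, b) in pairs.items()}

def rom_angle(neutral_vec, max_vec):
    """Angle in degrees between a segment's neutral-zero reference direction
    and its direction at maximum excursion."""
    dot = sum(n * m for n, m in zip(neutral_vec, max_vec))
    norm = math.sqrt(sum(n * n for n in neutral_vec)) * math.sqrt(sum(m * m for m in max_vec))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

if __name__ == "__main__":
    lengths = segment_lengths(T_POSE_JOINTS)

    # Example: left elbow flexion, measured from the extended (neutral-zero)
    # forearm direction to the forearm direction at maximum flexion (made-up values).
    neutral_forearm = (-1.0, 0.0, 0.0)
    flexed_forearm = (0.77, 0.64, 0.0)
    elbow_flexion = rom_angle(neutral_forearm, flexed_forearm)

    # Store the individual profile as comma-separated values, analogous to the
    # CSV export described in the abstract.
    with open("subject_profile.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["parameter", "value", "unit"])
        for name, value in lengths.items():
            writer.writerow([name, round(value, 3), "m"])
        writer.writerow(["elbow_flexion_left_max", round(elbow_flexion, 1), "deg"])

In practice, such values would come from the sensor's body-tracking output rather than hard-coded coordinates, and the CSV could then feed a digital human model for the workplace simulation mentioned above.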

Details:

Full paper: 16.171.pdf
Proceedings: 3DBST 2016, 30 Nov.-1 Dec. 2016, Lugano, Switzerland
Pages: 171-181
DOI: 10.15221/16.171

License/Copyright notice:

Proceedings: © Hometrica Consulting - Dr. Nicola D'Apuzzo, Switzerland, hometrica.ch.
Authors retain all rights to individual papers, which are licensed under a Creative Commons Attribution 4.0 International License that permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
The papers appearing in the proceedings reflect the authors' opinions. Their inclusion in the proceedings does not necessarily constitute endorsement by the editor or by the publisher.
