A Novel 3D Sensing Framework for Safety Monitoring in Human-Robot Collaboration Work Cells


Published Oct 26, 2025
Tarek Yahia, Kody Haubeil, Alex Suer, Yongzhi Qu, Janet Dong, Xiaodong Jia

Abstract

The demand for work safety protection in Human-Robot Interaction (HRI) work cells is rapidly increasing, driven by the projected 34.3% Compound Annual Growth Rate (CAGR) of the global Collaborative Robot (Cobot) market from 2020 to 2030 [1]. According to the IFR World Robotics 2023 report, nearly 4 million industrial robots are in operation worldwide, approximately 10% of which are cobots [2]. A NIOSH report highlighted 61 robot-related fatalities between 1992 and 2015, with further increases expected due to the growing use of industrial robots and cobots in the US work environment [3]. A recent study in [4] analyzed 355 robot accidents documented by KOSHA between 2009 and 2019, revealing that 95% occurred in manufacturing businesses. Pinch and crush incidents accounted for 52% of the accidents, impacts and collisions for 36%, and the remaining 12% involved falls, flying objects, trips/slips, cuts, burns, etc. These findings align with US data reported in [5].
The rising integration of cobot units among major manufacturers underscores the critical need to enhance cobot safety in manufacturing. Owing to safety considerations and regulatory requirements, existing cobots frequently operate at significantly reduced speeds and are restricted from undertaking complex interaction tasks in shared workspaces. This limitation has curtailed the full utilization and productivity of cobots in manufacturing. This paper introduces a novel 3D sensing framework designed to address these limitations by enabling safety assurance in workspaces requiring close human-robot interaction. The framework generates 3D human pose information and relays it to the robot for real-time safety monitoring. Our methodology begins with data collection from a single RGB-D camera capturing human-robot interactions in a manufacturing environment. Human shape and pose are predicted using deep neural networks; these predictions are then combined with depth information and 3D geometric transformations to recover size, shape, and translation. This process produces a reconstructed 3D avatar with pose, size, and location. The estimated 3D human posture is then integrated into a virtual environment with a real robot for monitoring. Results demonstrate successful reconstruction of 3D human geometry within human-robot collaboration settings. By integrating both the reconstructed mesh and the real-time robot state into a unified virtual environment, we achieved continuous monitoring, in both real-time and offline modes, of the critical distance between robot and human throughout operation. These distance measurements provide crucial data for developing collision detection, prediction, and avoidance capabilities when incorporated into the robot control feedback loop.
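The distance-monitoring step described above can be illustrated with a minimal sketch: given a sampled human mesh and sampled points along the robot links (both expressed in the same world frame), compute the minimum pairwise distance and flag a safety-threshold breach. The function name, point representations, and threshold value are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def min_human_robot_distance(human_vertices, robot_points, threshold=0.5):
    """Minimum distance between two sampled point sets (illustrative sketch).

    human_vertices: (N, 3) array of points sampled from the reconstructed human mesh.
    robot_points:   (M, 3) array of points sampled along the robot links.
    threshold:      safety distance in meters (assumed value, for illustration).
    Returns (d_min, breach): the minimum separation and whether it is below threshold.
    """
    human_vertices = np.asarray(human_vertices, dtype=float)
    robot_points = np.asarray(robot_points, dtype=float)
    # Broadcast to an (N, M) matrix of pairwise Euclidean distances.
    diffs = human_vertices[:, None, :] - robot_points[None, :, :]
    dists = np.linalg.norm(diffs, axis=2)
    d_min = float(dists.min())
    return d_min, d_min < threshold
```

In a monitoring loop, this check would run each frame on the freshly reconstructed avatar and the current robot joint-state forward kinematics; a breach could then trigger a slowdown or stop command in the robot control feedback loop. For large meshes, a spatial index (e.g. a k-d tree) would replace the brute-force pairwise computation.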

How to Cite

Yahia, T., Haubeil, K., Suer, A., Qu, Y., Dong, J., & Jia, X. (2025). A Novel 3D Sensing Framework for Safety Monitoring in Human-Robot Collaboration Work Cells. Annual Conference of the PHM Society, 17(1). https://doi.org/10.36001/phmconf.2025.v17i1.4371


Keywords

Human-Robot Collaboration, Machine Vision, Visuospatial Processing, Pose Estimation, Artificial Intelligence

References
[1] Giallanza, A., La Scalia, G., Micale, R., & La Fata, C. M. (2024). Occupational health and safety issues in human-robot collaboration: State of the art and open challenges. Safety Science, 169, 106313. https://doi.org/10.1016/j.ssci.2023.106313
[2] International Federation of Robotics (IFR). (2023). World Robotics 2023 Report: Asia ahead of Europe and the Americas. https://ifr.org/ifr-press-releases/news/world-robotics-2023-report-asia-ahead-of-europe-and-the-americas
[3] National Institute for Occupational Safety and Health (NIOSH). (n.d.). Robotics. Centers for Disease Control and Prevention. https://www.cdc.gov/niosh/topics/robotics/aboutthecenter.html
[4] Lee, K., Shin, J., & Lim, J.-Y. (2021). Critical hazard factors in the risk assessments of industrial robots: Causal analysis and case studies. Safety and Health at Work, 12(4), 496–504. https://doi.org/10.1016/j.shaw.2021.06.006
[5] Jiang, B. C., & Gainer, C. A., Jr. (1987). A cause-and-effect analysis of robot accidents. Journal of Occupational Accidents, 9(1), 27–45. https://doi.org/10.1016/0376-6349(87)90018-6
Section
Technical Research Papers