Simple Science

Cutting edge science explained simply


Improving Safety in Human-Robot Workspaces

Combining 3D cameras and IMUs ensures safe interactions between humans and robots.

― 4 min read


Safer Human-Robot Cooperation: combining sensors improves safety and efficiency in shared workspaces.

Robots are becoming more common in workplaces, working alongside humans. However, there are safety concerns regarding how humans and robots interact in the same space. A key challenge is accurately sensing where a person is and how they are positioned. This information is essential for ensuring that robots can operate safely without bumping into workers.

Current systems for human-robot cooperation often struggle to track human movements accurately. Occlusions, such as people or objects passing in front of the cameras, can disrupt tracking and make it hard to guarantee safety. This paper discusses a solution that combines two types of tracking technology: a 3D camera system and small sensors worn on the body called Inertial Measurement Units (IMUs).

The Need for Reliable Tracking

In workplaces where robots and humans share tasks, knowing where each person is at all times is crucial. Robots need to be able to stop or slow down if a person gets too close. A reliable system for detecting a person's position helps calculate safe distances to avoid accidents. Moreover, understanding gestures and movements can lead to more effective communication between humans and robots.
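To make the "stop or slow down" idea concrete, here is a minimal sketch of distance-based speed scaling. The thresholds (`stop_dist`, `slow_dist`) and the linear ramp are illustrative assumptions, not values from the paper:

```python
import numpy as np

def robot_speed_scale(human_pos, robot_pos, stop_dist=0.5, slow_dist=1.5):
    """Scale robot speed by human-robot separation (hypothetical thresholds, in metres).

    Returns 0.0 (full stop) below stop_dist, 1.0 (full speed) beyond slow_dist,
    and a linear ramp in between.
    """
    d = np.linalg.norm(np.asarray(human_pos) - np.asarray(robot_pos))
    if d <= stop_dist:
        return 0.0
    if d >= slow_dist:
        return 1.0
    return (d - stop_dist) / (slow_dist - stop_dist)

# A person 1.0 m from the robot falls on the ramp between the two thresholds.
scale = robot_speed_scale([1.0, 0.0, 0.0], [0.0, 0.0, 0.0])
```

The key point is that this calculation is only as good as the human position estimate fed into it, which is why reliable tracking matters.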

While there are various systems available for tracking people, they often rely on one method, which can lead to issues. For example, camera systems can lose sight of a person due to obstacles, leading to gaps in information. On the other hand, IMUs might drift over time, making their readings less reliable.

Merging Technologies for Better Results

The proposed solution combines a 3D vision sensor with IMUs placed on the human body. The 3D camera tracks the person's movements, while the IMUs help fill in any gaps when the camera loses sight of them. When a person moves, the IMUs work to keep track of their position, even if the 3D camera cannot see them at that moment.
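The fallback logic above can be sketched as follows. This is a simplified illustration, assuming the IMU contribution is plain double integration of acceleration (the paper's actual pose estimation is more involved):

```python
import numpy as np

def imu_dead_reckoning(pos, vel, accel, dt):
    """One step of double integration of IMU acceleration.
    Such estimates drift over time, so they are only trusted briefly."""
    vel = vel + accel * dt
    pos = pos + vel * dt
    return pos, vel

def fused_position(camera_pos, pos, vel, accel, dt):
    """Prefer the 3D camera fix; fall back to IMU dead reckoning
    when the camera is occluded (camera_pos is None)."""
    if camera_pos is not None:
        # Camera sees the limb: adopt its position directly.
        return np.asarray(camera_pos, dtype=float), vel
    return imu_dead_reckoning(pos, vel, np.asarray(accel, dtype=float), dt)
```

During occlusion the IMU keeps the estimate alive; as soon as the camera regains sight of the person, its measurement takes over again.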

To keep the data accurate, the system continuously checks and adjusts for any errors in the IMU readings. This method improves tracking accuracy and ensures that the robot can operate safely without risking a collision with the human partner.
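One simple way to realise such a continuous correction, shown here purely as an illustration, is to update an offset estimate whenever the camera can see the limb, then apply it to the raw IMU reading during occlusion. The exponential moving average and its gain `alpha` are assumptions for this sketch, not the paper's exact method:

```python
import numpy as np

def update_offset(offset, camera_pos, imu_pos, alpha=0.1):
    """Whenever the camera sees the limb, nudge the stored IMU offset
    toward the current camera-IMU discrepancy (EMA; alpha is a hypothetical gain)."""
    offset = np.asarray(offset, dtype=float)
    discrepancy = np.asarray(camera_pos, dtype=float) - np.asarray(imu_pos, dtype=float)
    return (1.0 - alpha) * offset + alpha * discrepancy

def corrected_imu(imu_pos, offset):
    """Apply the latest offset to the raw IMU position during occlusion."""
    return np.asarray(imu_pos, dtype=float) + np.asarray(offset, dtype=float)
```

Because the offset is refreshed every time the camera has a clear view, the IMU-only estimate starts each occlusion from a recently calibrated baseline instead of accumulating drift indefinitely.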

Applications in Real Life

In practice, this system can be used in various environments where humans and robots work together. For example, it can assist in assembly lines where a worker may need to reach for tools or parts. If a worker drops something, the system helps the robot detect their movements and react appropriately, such as moving away to give the worker space.

The tests conducted showed promising results. The system tracked the forearm's position with errors in the millimetre range, even when the 3D camera couldn't see it. The IMUs provided a steady stream of measurements that kept the tracking continuous.

Challenges with Tracking Technologies

While the combination of 3D cameras and IMUs appears to work well, there are still challenges to address. For instance, the cost of IMUs can vary, and using cheaper models might lead to less accurate readings. Additionally, both systems must be correctly set up and maintained to ensure they function effectively in real-time.

Future Developments

Looking ahead, there are plans to improve the current system further. One goal is to extend tracking from individual limbs to the entire body. This would make it easier to understand how people move and interact in spaces shared with robots.

Another area of focus is integrating these sensor technologies into everyday clothing. This would make tracking even more natural, as people wouldn’t have to wear cumbersome devices.

Conclusion

Reliable human tracking is vital in environments where robots and humans work together. Combining 3D vision sensors with IMUs has shown promise in improving tracking accuracy and safety. This approach ensures better interactions between humans and robots, making workplaces safer and more efficient.

As technology progresses, ongoing research will continue to refine these systems, making them more effective in real-world applications. The goal is to create safe and intuitive environments where humans and robots can collaborate seamlessly.

Original Source

Title: Robust human position estimation in cooperative robotic cells

Abstract: Robots are increasingly present in our lives, sharing the workspace and tasks with human co-workers. However, existing interfaces for human-robot interaction / cooperation (HRI/C) have limited levels of intuitiveness to use and safety is a major concern when humans and robots share the same workspace. Many times, this is due to the lack of a reliable estimation of the human pose in space which is the primary input to calculate the human-robot minimum distance (required for safety and collision avoidance) and HRI/C featuring machine learning algorithms classifying human behaviours / gestures. Each sensor type has its own characteristics resulting in problems such as occlusions (vision) and drift (inertial) when used in an isolated fashion. In this paper, it is proposed a combined system that merges the human tracking provided by a 3D vision sensor with the pose estimation provided by a set of inertial measurement units (IMUs) placed in human body limbs. The IMUs compensate the gaps in occluded areas to have tracking continuity. To mitigate the lingering effects of the IMU offset we propose a continuous online calculation of the offset value. Experimental tests were designed to simulate human motion in a human-robot collaborative environment where the robot moves away to avoid unexpected collisions with the human. Results indicate that our approach is able to capture the human's position, for example the forearm, with a precision in the millimetre range and robustness to occlusions.

Authors: António Amorim, Diana Guimarães, Tiago Mendonça, Pedro Neto, Paulo Costa, António Paulo Moreira

Last Update: 2023-04-17

Language: English

Source URL: https://arxiv.org/abs/2304.08379

Source PDF: https://arxiv.org/pdf/2304.08379

Licence: https://creativecommons.org/licenses/by/4.0/

Changes: This summary was created with assistance from AI and may have inaccuracies. For accurate information, please refer to the original source documents linked here.

Thank you to arxiv for use of its open access interoperability.
