Simple Science

Cutting-edge science explained simply

Computer Science / Human-Computer Interaction

Hands-Free Interaction in Augmented Reality

This study explores new methods for hands-free control of AR systems using AI.

― 5 min read



Augmented Reality (AR) technology is changing how we use devices in many fields, especially healthcare and industry. AR overlays helpful information onto what we see in the real world, which can make tasks easier and safer and help users work more effectively. However, many tasks in these areas occupy both hands, which limits how people can control AR applications. This is a particular challenge for individuals who cannot use their hands effectively, such as those with certain disabilities.

The focus of this study is to develop new hands-free ways of interacting with AR technology, supported by Artificial Intelligence (AI) to improve the experience for users.

Challenges of Existing Input Methods

In AR, people can use different input methods to control applications. One method is hand tracking, which lets users interact with virtual objects using their hands. However, this isn't always practical when both hands are needed for tasks. Dedicated controllers can be added, but these can be expensive or uncomfortable to use.

This study looks into how users can control AR applications without using their hands. By using natural movements, like tilting the head or giving voice commands, users can keep their hands free for other tasks. However, voice commands often don't work well in noisy environments.

New Input Methods

The research focuses on methods that use head movement along with image-based solutions. These new methods are checked against traditional input methods like a mouse or gamepad.
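To make the head-movement idea concrete, here is a minimal sketch of how head orientation could be turned into an on-screen crosshair. The linear mapping, field of view, and screen resolution below are illustrative assumptions, not details from the study.

```python
def head_to_crosshair(yaw_deg, pitch_deg, fov_deg=60.0, width=1920, height=1080):
    """Map head yaw/pitch (in degrees) to a screen-space crosshair position.

    Looking straight ahead (0, 0) centres the crosshair; turning the head
    by half the assumed field of view reaches the screen edge. A purely
    illustrative linear mapping, not the one used in the study.
    """
    half_fov = fov_deg / 2.0
    # Clamp extreme head poses so the crosshair stays on screen.
    yaw = max(-half_fov, min(half_fov, yaw_deg))
    pitch = max(-half_fov, min(half_fov, pitch_deg))
    x = (yaw / half_fov + 1.0) / 2.0 * width
    y = (1.0 - (pitch / half_fov + 1.0) / 2.0) * height  # screen y grows downward
    return x, y
```

With these assumed defaults, a neutral head pose `head_to_crosshair(0.0, 0.0)` centres the crosshair at `(960.0, 540.0)`.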

Related Research

Human-Computer Interaction (HCI) is crucial in AR. Many studies have aimed to figure out the best ways to interact with AR systems. For example, some research looked into how virtual keyboards can be positioned and how users can get feedback while typing. These methods use different positions for keyboard interaction based on the user's viewpoint or the placement of their non-dominant hand.

Other studies have examined freehand methods for manipulating 3D objects in virtual environments. Various approaches, like using head gaze, voice commands, and even foot movements, have been explored. However, some methods, like using the feet, may not be practical for everyone.

Eye tracking is another emerging method in HCI, especially in tele-operation systems where users need to keep their eyes on multiple things at once. This is becoming popular in roles that require hands-free operation. Researchers are developing more affordable eye-tracking systems for tele-operation, which can help people with disabilities too.

Proposed AI-Supported Solutions

This study presents a system that uses AI support to help with hands-free operation in AR. When users look at a specific spot, extra information is displayed on their device. How long someone must keep looking at that spot before the feature triggers is called temporal activation, while spatial activation refers to how large the area they need to gaze at is.
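These two thresholds can be sketched as a small dwell-activation helper: the gaze (or head) pointer must stay inside a circular target region (spatial activation) for a minimum duration (temporal activation) before the extra information appears. The class name and default threshold values here are illustrative assumptions, not the study's parameters.

```python
import math

class DwellActivator:
    """Fire once the pointer has stayed inside a circular gaze region
    (spatial activation) for a minimum duration (temporal activation).
    Default thresholds are illustrative, not the study's values."""

    def __init__(self, dwell_s=1.0, radius_px=40.0):
        self.dwell_s = dwell_s      # temporal activation threshold (seconds)
        self.radius_px = radius_px  # spatial activation radius (pixels)
        self._enter_t = None        # timestamp when the pointer entered the region

    def update(self, t, px, py, tx, ty):
        """Feed one pointer sample (px, py) at time t for a target at (tx, ty);
        return True once the dwell time has elapsed inside the region."""
        inside = math.hypot(px - tx, py - ty) <= self.radius_px
        if not inside:
            self._enter_t = None    # leaving the region resets the timer
            return False
        if self._enter_t is None:
            self._enter_t = t
        return (t - self._enter_t) >= self.dwell_s
```

Leaving the region at any point resets the dwell timer, so only a sustained gaze triggers the activation.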

The devices considered include smart glasses and even smartphones. These AR devices can present real-time information to workers, allowing them to access important data while keeping their hands free.

This research looks at cases where hands can't be used due to the nature of the work or because of physical disabilities. In noisy environments, voice commands may not be reliable, and hands-free options become essential.

Methodology for Testing

To test these new methods, a study was conducted involving 20 participants using a special application. The app provided three modes with various tasks to complete.

  1. Locate Mode: Participants aimed to point a crosshair at static targets in a 3D environment. The aim was to measure how quickly they could reach the target.

  2. Select Mode: Here, participants needed to hold the crosshair on a target for a set time. This mode tested the accuracy of different input methods.

  3. Follow Mode: In this mode, participants tracked moving targets to evaluate how well they could keep up with dynamic movements.

Metrics for Evaluation

Different metrics were recorded to evaluate performance in each mode.

  • In Locate Mode, the average time to reach a target and the success rate were measured.
  • In Select Mode, extra time taken to select targets accurately was recorded.
  • In Follow Mode, the ability to track moving targets was analyzed, focusing on how well participants kept in contact with the target.

Data were collected and tabulated to compare the efficiency of each method with and without AI support.
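As a sketch of how such per-mode metrics might be computed from trial logs (the log format here is hypothetical, not taken from the study):

```python
def locate_metrics(times_to_target):
    """Locate Mode: mean time to reach the target and the success rate.
    Each entry is the time in seconds, or None if the trial timed out."""
    hits = [t for t in times_to_target if t is not None]
    success_rate = len(hits) / len(times_to_target)
    mean_time = sum(hits) / len(hits) if hits else float("nan")
    return mean_time, success_rate

def follow_contact_ratio(on_target_samples):
    """Follow Mode: fraction of samples where the crosshair touched the
    moving target (1 = on target, 0 = off target)."""
    return sum(on_target_samples) / len(on_target_samples)
```

For example, three Locate trials logged as `[1.0, 2.0, None]` give a mean time of 1.5 s over the successful trials and a success rate of two out of three.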

Results

Results showed that traditional input methods, like the mouse, performed best overall, but the AI-supported methods, such as head-movement tracking with Gravity-Map assistance, showed promise.

In Locate Mode, the mouse outperformed all others by a significant margin, although head movement solutions with AI support were not far behind.

In Select Mode, the mouse again had the best performance, but head-based input using Gravity-Map support improved significantly, leading to a better overall experience.

Follow Mode revealed that the head movement with Gravity-Map support also performed well, showing that this method can be an effective alternative to traditional input devices.

Discussion

The study confirms that while traditional input devices are still dominant, using AI to enhance alternative input methods can significantly improve their usability. Head tracking, in particular, with AI support, provides a realistic way to interact with AR environments.

Participants who were new to head tracking improved as they became familiar with the system, suggesting that performance would continue to rise with practice. Familiarity with a technology often leads to better results.

The findings confirm the potential for AI-assisted inputs to create natural interactions in AR, especially where traditional methods may not be feasible.

Conclusion

In conclusion, there is a strong potential for AI-supported methods to enhance how we interact with AR systems. This study shows that hands-free options, especially those using head tracking, can match the effectiveness of traditional input devices. As AR technology continues to grow, exploring these alternatives can make it more accessible for various users, including those with physical disabilities.

Funding from the European Union's Horizon Europe program highlights the importance of this research in promoting innovation in AR technology.
