Bare-Hand Interaction for XRverse

Network Intelligence Research Center
State Key Laboratory of Networking and Switching Technology Beijing University of Posts and Telecommunications

Today, Extended Reality (XR) systems that rely on device-based interactions, such as controllers or gloves, often face significant usability challenges. These devices can feel unnatural, requiring users to learn complex control schemes that disrupt immersion. Moreover, the reliance on hardware introduces logistical issues like cost, weight, and battery life, creating barriers to widespread adoption. For users, these devices may also cause fatigue during prolonged sessions and restrict accessibility in scenarios where quick setup or minimal equipment is desirable. Such limitations hinder the vision of seamless, intuitive XR experiences and restrict XR's applicability across diverse fields such as education, healthcare, and entertainment.

Figure 1: Demonstration of Bare-Hand Interaction.

Rather than persisting with traditional device-based interactions, we argue that a more intuitive and immersive approach lies in bare-hand interaction, where users engage with the XR environment using their natural hand gestures, without additional physical tools. This shift from device-dependent to device-free interaction is becoming increasingly feasible thanks to advances in hand-tracking technology and computer vision, which allow hand movements to be detected with high precision. Our research focuses on creating seamless bare-hand interaction systems for XR. These systems go beyond simply recognizing hand gestures: they aim to replicate the natural, fluid movements of human hands within virtual environments, offering a richer, more intuitive way to interact with digital spaces. This can involve a combination of visual, auditory, and haptic feedback, all working together to create a cohesive and engaging experience. The goal is to eliminate the need for controllers and allow users to directly manipulate objects, navigate environments, and communicate in ways that feel more natural and instinctive.
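To make the hand-tracking building block concrete, the sketch below shows one common primitive: detecting a pinch gesture from tracked hand keypoints. The joint names, data format, and 2 cm threshold are illustrative assumptions, not part of our system; real hand-tracking runtimes report per-frame joint poses in their own formats.

```python
import math

# Hypothetical data format: a tracked hand is a dict mapping joint
# names to (x, y, z) positions in meters, as a hand-tracking runtime
# might report each frame.

PINCH_THRESHOLD_M = 0.02  # ~2 cm between thumb tip and index tip (assumed)

def distance(a, b):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def is_pinching(hand):
    """Detect a pinch gesture from thumb-tip / index-tip proximity."""
    return distance(hand["thumb_tip"], hand["index_tip"]) < PINCH_THRESHOLD_M

# Example frame: fingertips about 1.1 cm apart, so this counts as a pinch.
hand = {"thumb_tip": (0.10, 0.00, 0.30), "index_tip": (0.11, 0.005, 0.30)}
print(is_pinching(hand))  # True
```

In practice such per-frame predicates are usually smoothed over several frames before driving interactions, so that a single noisy tracking sample does not trigger or cancel a gesture.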

Whiteboard Collaboration

Our recent research has achieved notable advances in VR bare-hand interaction, particularly in whiteboard collaboration within virtual environments. We addressed the challenge of recognizing pen-drop, writing, and pen-lift intentions, proposing two methods: Air-writing, which adjusts the virtual whiteboard's position for consistent writing without physical surfaces, and Physical-writing, which provides passive haptic feedback by aligning the virtual whiteboard with real-world planes. User studies demonstrated an 8% improvement in communication efficiency over controllers, with Physical-writing achieving higher accuracy and user satisfaction. These results highlight the potential of bare-hand interaction to deliver immersive, natural, and efficient VR experiences. For more details, please refer to our paper (coming soon) or watch the video below.
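To illustrate the pen-drop/pen-lift problem, here is a minimal sketch of one plausible approach: thresholding the index fingertip's distance to the whiteboard plane with hysteresis, so tracking jitter near the surface does not flicker between writing and not writing. The thresholds, plane representation, and class names are hypothetical and not taken from the paper.

```python
# Hysteresis thresholds (illustrative values, not from the paper):
DROP_DIST_M = 0.005   # start writing when closer than 5 mm to the board
LIFT_DIST_M = 0.015   # stop writing when farther than 15 mm from the board

def signed_distance(point, plane_point, plane_normal):
    """Signed distance from a 3D point to a plane (unit normal assumed)."""
    return sum((p - q) * n for p, q, n in zip(point, plane_point, plane_normal))

class WritingIntent:
    """Tracks pen-drop / pen-lift state across frames with hysteresis."""

    def __init__(self):
        self.writing = False

    def update(self, fingertip, plane_point, plane_normal):
        d = abs(signed_distance(fingertip, plane_point, plane_normal))
        if not self.writing and d < DROP_DIST_M:
            self.writing = True   # pen-drop: fingertip reached the board
        elif self.writing and d > LIFT_DIST_M:
            self.writing = False  # pen-lift: fingertip clearly left the board
        return self.writing

# Fingertip approaching, hovering near, then leaving a board at z = 0.5 m:
intent = WritingIntent()
plane_pt, plane_n = (0.0, 0.0, 0.5), (0.0, 0.0, 1.0)
for z in (0.52, 0.503, 0.507, 0.52):
    print(intent.update((0.0, 0.0, z), plane_pt, plane_n))
# Prints: False, True, True, False — the 7 mm hover stays "writing"
# because lifting requires exceeding the larger 15 mm threshold.
```

The two-threshold design is the key point: a single cut-off would toggle the stroke on and off as the fingertip jitters around the surface, producing broken ink.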

Figure 2: Bare-Hand Interaction vs. Controller-Based Interaction in Whiteboard Collaboration Teaching Scenarios.

Publication

"Towards Bare-Hand Interaction for Whiteboard Collaboration in Virtual Reality", ACM SIGCHI Conference on Computer-Supported Cooperative Work & Social Computing (CSCW) [CCF A], Bergen, Norway, Oct. 2025.

Video Figure