Master's Thesis: Development and Evaluation of Supernumerary Robotic Limb Control Strategies and Sense of Embodiment in a Virtual Reality Environment
A comprehensive user study in a custom-built UE5 environment, investigating how wearable robotic limbs compare to external cobots and how users can feel 'in control' of an autonomous partner.
At a Glance
- Situation: Research into Supernumerary Robotic Limbs (SRLs) is hampered by the lack of standardized, cost-effective platforms for testing control schemes and measuring user embodiment.
- Task: To design, develop, and validate a modular Virtual Reality testbed in Unreal Engine 5 to systematically compare different robotic assistance paradigms and their effect on user performance, workload, and subjective experience.
- Action: I built a complete VR application from scratch, implementing realistic physics-based assembly mechanics and multiple SRL control modes. The core of the thesis was a user study (N=24) comparing unassisted performance against collaboration with an autonomous SRL and an autonomous external cobot, both driven by a Finite State Machine (FSM).
- Result: The study showed that robotic assistance significantly improved performance and reduced workload. Crucially, it revealed a clear user preference for the wearable SRL, which correlated strongly with a higher sense of embodiment (Ownership and Agency), indicating that a robot's form factor is a key driver of user satisfaction.
Technical Deep Dive
Platform & Architecture (Unreal Engine 5)
The entire simulation was built from the ground up in Unreal Engine 5. UE5 was chosen for several features critical to this research:
- High-Fidelity Physics Engine: Essential for simulating realistic object interactions, collisions, and dynamic behavior required for the complex assembly task.
- Blueprints Visual Scripting: Allowed for rapid prototyping and iteration of complex interaction logic and control strategies without lengthy C++ compilation cycles.
- Native OpenXR Support: Ensured broad compatibility with VR hardware like the Meta Quest 3 and followed industry standards for XR development.
The core architecture was modular, centered on a VR Pawn that served as the user's avatar. The Supernumerary Robotic Limbs were implemented as attachable components, with their movement driven by Inverse Kinematics (IK) for natural motion.
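In the thesis the limb motion was driven by the engine's own IK tooling; purely to illustrate the kind of solve involved, here is a minimal analytic two-bone IK sketch in Python. The function name, the two-segment simplification, and the returned angles are illustrative assumptions, not the thesis implementation.

```python
import numpy as np

def two_bone_ik(root, target, len_upper, len_lower):
    """Illustrative analytic two-bone IK: returns the elbow angle, the angle
    between the upper segment and the root-to-target line, and the unit
    direction from the root towards the target."""
    to_target = np.asarray(target, dtype=float) - np.asarray(root, dtype=float)
    raw_dist = np.linalg.norm(to_target)
    direction = to_target / max(raw_dist, 1e-9)
    # Clamp the distance so the target stays reachable and arccos stays defined.
    dist = np.clip(raw_dist, abs(len_upper - len_lower) + 1e-6,
                   len_upper + len_lower - 1e-6)
    # Law of cosines: interior angle at the elbow joint.
    cos_elbow = (len_upper**2 + len_lower**2 - dist**2) / (2 * len_upper * len_lower)
    elbow = np.arccos(np.clip(cos_elbow, -1.0, 1.0))
    # Angle by which the upper segment deviates from the root-to-target line.
    cos_shoulder = (len_upper**2 + dist**2 - len_lower**2) / (2 * len_upper * dist)
    shoulder = np.arccos(np.clip(cos_shoulder, -1.0, 1.0))
    return elbow, shoulder, direction
```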
Core Implementation: Interaction & Assembly Mechanics
A significant portion of the work was dedicated to creating a robust and believable interaction system for the virtual assembly task (building a stool).
- Physics-Based Assembly: The attachment mechanic relied on a system of box collisions and pre-placed helper meshes. When a user brought a component close to its target location, it would "snap" into place. A key challenge was solving physics inheritance issues to ensure that, once parts were connected, they moved as a single rigid body (a simplified sketch of this snap-and-grasp logic follows this list).
- Natural Grasping: To prevent visual disconnects during the snapping process, the virtual hand's position was smoothly interpolated (lerped) to maintain a continuous grasp on the object, enhancing the sense of presence.
- Functional Virtual Tools: Tools like a hammer, screwdriver, and Allen key were implemented with custom logic. The hammer's function was based on impact velocity, while the screwdriver and Allen key used physics constraints to lock onto a screw and permit only the correct rotational motion.
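The snap-and-grasp behaviour above was implemented in Blueprints inside UE5; the Python sketch below only restates the core logic in a readable, engine-free form. The snap radius, interpolation factor, and class/field names are illustrative assumptions rather than the thesis values.

```python
import numpy as np
from dataclasses import dataclass, field

SNAP_RADIUS = 0.05      # metres; stand-in for the box-collision overlap check
HAND_LERP_ALPHA = 0.25  # per-frame interpolation factor for the grasping hand

@dataclass
class Part:
    """A part that moves freely until attached, then follows its parent."""
    position: np.ndarray
    parent: "Part | None" = None
    local_offset: np.ndarray = field(default_factory=lambda: np.zeros(3))

    def world_position(self) -> np.ndarray:
        if self.parent is None:
            return self.position
        return self.parent.world_position() + self.local_offset

def try_snap(part: Part, socket_pos: np.ndarray, assembly: Part) -> bool:
    """Snap a held part onto its target socket when close enough, attaching it
    to the assembly so both subsequently move as a single rigid body."""
    if np.linalg.norm(part.world_position() - socket_pos) <= SNAP_RADIUS:
        part.parent = assembly
        part.local_offset = socket_pos - assembly.world_position()
        return True
    return False

def update_hand(hand_pos: np.ndarray, grasp_point: np.ndarray) -> np.ndarray:
    """Lerp the virtual hand towards the grasp point each frame so the snap
    never visually tears the hand away from the grasped object."""
    return hand_pos + HAND_LERP_ALPHA * (grasp_point - hand_pos)
```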
Core Implementation: Control & Autonomy
To demonstrate the platform's flexibility, several control modes were implemented, with the main study focusing on a task-aware autonomous system.
- FSM-Based Autonomy: The autonomous behavior of both the SRL and the external cobot was governed by a Finite State Machine (FSM) created using the Logic Driver Lite plugin. This FSM executed pre-defined, multi-step action sequences, such as the "Reach-Grab-Bring Sequence," which commanded the robot to retrieve a required part or tool and bring it to a convenient hand-off location for the user, anticipating their needs (a minimal sketch of this sequence follows this list).
- Other Implemented Modes: To showcase the testbed's modularity, other manual control modes were also developed, including direct control via polar coordinates, a one-to-one "Mirroring" mode, and an advanced "Retargeting" mode based on the virtual fixture concept.
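In the thesis the autonomy logic was authored as a Blueprint state machine with Logic Driver Lite; the snippet below is a minimal, engine-free Python sketch of the same Reach-Grab-Bring pattern. The state names and boolean transition conditions are assumptions chosen for illustration.

```python
from enum import Enum, auto

class SRLState(Enum):
    IDLE = auto()
    REACH = auto()
    GRAB = auto()
    BRING = auto()
    HANDOFF = auto()

class ReachGrabBringFSM:
    """Minimal sketch of a task-aware Reach-Grab-Bring sequence."""

    def __init__(self):
        self.state = SRLState.IDLE

    def step(self, *, part_needed, at_part, part_grasped, at_handoff, user_took_part):
        # Each tick, transition on simple boolean conditions; a real system
        # would derive these from task progress and proximity checks.
        if self.state is SRLState.IDLE and part_needed:
            self.state = SRLState.REACH      # move the end effector towards the part
        elif self.state is SRLState.REACH and at_part:
            self.state = SRLState.GRAB       # close the gripper
        elif self.state is SRLState.GRAB and part_grasped:
            self.state = SRLState.BRING      # carry the part to the hand-off pose
        elif self.state is SRLState.BRING and at_handoff:
            self.state = SRLState.HANDOFF    # hold the part out for the user
        elif self.state is SRLState.HANDOFF and user_took_part:
            self.state = SRLState.IDLE       # sequence finished
        return self.state
```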
Experiment Design & Data Analysis
A formal user study was the centerpiece of the thesis. A rich set of both objective and subjective data was collected and analyzed using custom Python scripts with the pandas and SciPy libraries.
- Objective Metrics: The system logged detailed kinematic data, including the 3D coordinates of the user's hands. From this, I calculated metrics such as total path length, average movement speed, and average jerk (a measure of movement smoothness). Workspace volume was also computed using a Convex Hull algorithm to quantify the user's interaction space (see the analysis sketch after this list).
- Subjective Metrics: Validated questionnaires such as the NASA-TLX (workload), UEQ-S (user experience), and the Virtual Embodiment Questionnaire (VEQ) were used to capture the user's subjective experience.
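As an indication of how the objective metrics above could be computed with the pandas/SciPy stack named earlier, here is a minimal analysis sketch; the column names, units, and finite-difference scheme are illustrative assumptions rather than the exact thesis scripts.

```python
import numpy as np
import pandas as pd
from scipy.spatial import ConvexHull

def kinematic_metrics(df: pd.DataFrame) -> dict:
    """Compute path length, mean speed, mean jerk magnitude, and workspace
    volume from a log of hand positions. Expects columns t, x, y, z
    (seconds / metres); these are assumed names, not the thesis log format."""
    pos = df[["x", "y", "z"]].to_numpy()
    t = df["t"].to_numpy()

    # Total path length: sum of distances between successive samples.
    path_length = np.linalg.norm(np.diff(pos, axis=0), axis=1).sum()

    # Finite-difference velocity, acceleration, and jerk over time.
    vel = np.gradient(pos, t, axis=0)
    acc = np.gradient(vel, t, axis=0)
    jerk = np.gradient(acc, t, axis=0)

    # Workspace volume via the convex hull of all visited hand positions.
    hull = ConvexHull(pos)

    return {
        "path_length_m": path_length,
        "mean_speed_m_s": np.linalg.norm(vel, axis=1).mean(),
        "mean_jerk_m_s3": np.linalg.norm(jerk, axis=1).mean(),
        "workspace_volume_m3": hull.volume,
    }
```

Per-trial dictionaries like this could then be aggregated across participants and conditions for the statistical comparisons.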