One Body, Two Minds: Alternating VR Perspective During Remote Teleoperation of Supernumerary Limbs
Active · Virtual Reality · Teleoperation · Supernumerary Limbs · Embodiment

This project explores how two users co-embody a single avatar with virtual supernumerary limbs in VR, introducing guest-driven perspective switching to balance coordination, comfort, and embodiment during remote teleoperation.

Overview

One Body, Two Minds explores how two remote users — a host and a guest — can co-embody a single virtual avatar equipped with virtual supernumerary limbs (VSLs) in VR. The host navigates and controls the avatar's primary arms, while the guest remotely teleoperates the back-mounted VSLs. A key challenge is viewpoint coupling: locking the guest's camera to the host's head causes disorientation, control loss, and cybersickness. We introduce guest-driven perspective switching between three viewpoints and evaluate their trade-offs in a controlled study with 24 pairs (N=48).

Vision

Inspired by shared embodiment scenarios like those in Pacific Rim, this project envisions a future where two people inhabit the same body to tackle tasks neither could handle alone. Emerging robotic platforms already enable remote operators to control humanoid bodies, but sharing a single viewpoint creates friction when the host moves and the guest manipulates. By decoupling visual perspective from physical coupling, we aim to make shared-body teleoperation feel natural, efficient, and comfortable — paving the way for collaborative robotic control, remote assistance, and skill-sharing across distance.

How It Works

Three Perspective Modes

  • Shared Embodied View (SEV): The baseline — the guest's camera is co-located with the host's head. The guest can rotate independently but not translate. This provides strong embodiment but causes discomfort during host locomotion.
  • Embedded Anchored View (EAV): A stabilised stereoscopic portal near the host's body. The guest sees a steady, body-anchored window with independent rotation, reducing visual instability during fine manipulation.
  • Out-of-Body View (OOB): A fully decoupled, 6-DoF third-person drone camera. The guest freely positions their viewpoint in the world, enabling spatial planning and navigation oversight independent of the host's movement.
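
The Unity C# sketch below illustrates one way the guest camera could resolve its pose in each of these three modes. The component and field names (GuestCameraRig, hostHead, bodyAnchor, guestLocalInput) are illustrative assumptions, not the project's actual implementation.

```csharp
using UnityEngine;

// Hypothetical sketch: resolving the guest camera pose per perspective mode.
public enum GuestView { SharedEmbodied, EmbeddedAnchored, OutOfBody }

public class GuestCameraRig : MonoBehaviour
{
    public GuestView mode = GuestView.SharedEmbodied;
    public Transform hostHead;        // host HMD pose, streamed over the network (assumed)
    public Transform bodyAnchor;      // stabilised anchor near the host's torso (assumed)
    public Transform guestLocalInput; // guest's own HMD-driven pose (assumed)

    void LateUpdate()
    {
        switch (mode)
        {
            case GuestView.SharedEmbodied:
                // Co-located with the host's head; the guest rotates freely but cannot translate.
                transform.position = hostHead.position;
                transform.rotation = guestLocalInput.rotation;
                break;
            case GuestView.EmbeddedAnchored:
                // Body-anchored portal: translation follows the stabilised anchor, rotation stays independent.
                transform.position = bodyAnchor.position;
                transform.rotation = guestLocalInput.rotation;
                break;
            case GuestView.OutOfBody:
                // Fully decoupled 6-DoF "drone" camera driven by the guest alone.
                transform.SetPositionAndRotation(guestLocalInput.position, guestLocalInput.rotation);
                break;
        }
    }
}
```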

Perspective Switching Mechanism

The guest can switch from Shared Embodied View to either Embedded Anchored View or Out-of-Body View at any time using VR controller input. Short cross-fades, input debounce, and on-screen indicators ensure smooth transitions. The host always remains in Shared Embodied View throughout.
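
A minimal sketch of how this switching logic could combine input debounce with a short cross-fade, building on the GuestCameraRig sketch above. PerspectiveSwitcher, fadeOverlay, and the button-name strings are hypothetical placeholders; the study's actual controller bindings are not specified here.

```csharp
using UnityEngine;

// Hypothetical sketch of guest-driven perspective switching with debounce and cross-fade.
public class PerspectiveSwitcher : MonoBehaviour
{
    public GuestCameraRig rig;
    public CanvasGroup fadeOverlay;   // full-screen fade canvas (assumed)
    public float debounceSeconds = 0.5f;
    public float fadeSeconds = 0.2f;
    float lastSwitchTime = -999f;

    void Update()
    {
        // Placeholder input mapping; the system uses VR controller buttons.
        if (Input.GetButtonDown("SwitchToEAV")) RequestSwitch(GuestView.EmbeddedAnchored);
        if (Input.GetButtonDown("SwitchToOOB")) RequestSwitch(GuestView.OutOfBody);
        if (Input.GetButtonDown("SwitchToSEV")) RequestSwitch(GuestView.SharedEmbodied);
    }

    void RequestSwitch(GuestView target)
    {
        // Ignore repeated presses within the debounce window.
        if (Time.time - lastSwitchTime < debounceSeconds) return;
        lastSwitchTime = Time.time;
        StartCoroutine(CrossFade(target));
    }

    System.Collections.IEnumerator CrossFade(GuestView target)
    {
        // Fade out, switch modes while the view is occluded, then fade back in.
        for (float t = 0f; t < fadeSeconds; t += Time.deltaTime)
        { fadeOverlay.alpha = t / fadeSeconds; yield return null; }
        rig.mode = target;
        for (float t = 0f; t < fadeSeconds; t += Time.deltaTime)
        { fadeOverlay.alpha = 1f - t / fadeSeconds; yield return null; }
        fadeOverlay.alpha = 0f;
    }
}
```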

VSL Stabilisation

To prevent host movement from disrupting VSL control, the system decouples VSL actuation from host head/body transforms when significant motion is detected, easing VSLs back to a neutral pose via Unity SmoothDamp.
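
A rough sketch of this stabilisation idea: detect significant host head motion and ease the VSL root back toward a neutral pose with Unity's Vector3.SmoothDamp. The motion threshold, VslStabiliser name, and neutralLocalPosition field are assumptions for illustration, not the system's actual parameters.

```csharp
using UnityEngine;

// Hypothetical sketch: decouple VSL actuation from host motion and ease limbs to a neutral pose.
public class VslStabiliser : MonoBehaviour
{
    public Transform hostHead;
    public Transform vslRoot;            // root of the back-mounted virtual limbs (assumed)
    public Vector3 neutralLocalPosition; // resting pose relative to the host's back (assumed)
    public float motionThreshold = 0.8f; // head speed (m/s) treated as "significant motion" (assumed)
    public float smoothTime = 0.3f;

    Vector3 lastHeadPos;
    Vector3 velocity;                    // internal state required by SmoothDamp

    void Start() { lastHeadPos = hostHead.position; }

    void Update()
    {
        // Estimate host motion from head translation speed.
        float headSpeed = (hostHead.position - lastHeadPos).magnitude / Mathf.Max(Time.deltaTime, 1e-5f);
        lastHeadPos = hostHead.position;

        if (headSpeed > motionThreshold)
        {
            // Suspend guest actuation and ease the VSLs back toward the neutral pose.
            vslRoot.localPosition = Vector3.SmoothDamp(
                vslRoot.localPosition, neutralLocalPosition, ref velocity, smoothTime);
        }
        // Otherwise, guest teleoperation drives vslRoot elsewhere in the control pipeline.
    }
}
```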

Evaluation

A within-subjects study with 24 participant pairs (N=48) across two collaborative tasks (Transportation and Factory) measured:

  • Performance: Out-of-Body View yielded fewer errors in the Factory task with comparable completion times.
  • Workload & Fatigue: Guests reported higher subjective workload in Out-of-Body View but lower physiological stress (higher HRV).
  • Embodiment: Embedded Anchored View maintained stronger self-embodiment scores, while Out-of-Body View reduced embodiment.
  • Switching Behaviour: Participants preferred Embedded Anchored View for precision subtasks and Out-of-Body View for navigation and spatial planning.

Applications

  • Remote Collaborative Robotics: Enabling an expert to operate supernumerary manipulators on a remote worker's body with appropriate visual framing.
  • Telepresence & Training: Supporting skill transfer where a trainee (host) performs primary actions while a mentor (guest) assists via extra limbs.
  • Assistive Technology: Allowing caregivers to remotely co-control wearable robotic arms for people with motor impairments.
  • Multi-User VR Experiences: Designing shared-body experiences for entertainment, education, and collaborative design.

Related Publications

Conference / CHI '26 / 2026

One Body, Two Minds: Alternating VR Perspective During Remote Teleoperation of Supernumerary Limbs

H Zhou, X Huang, W Wijaya, YF Cheng, D Lindlbauer, E Velloso, A Bianchi, Z Sarsenbayeva, A Withana

Project Details

Timeline

Started: January 1, 2025

External Collaborators

  • Xincheng Huang (University of British Columbia)
  • Winston Wijaya (University of Sydney)
  • Yi Fei Cheng (Carnegie Mellon University)
  • David Lindlbauer (Carnegie Mellon University)
  • Andrea Bianchi (KAIST)
