Human-Robot Teaming Study

2023–present

Introduction

Following our implementation of the spatial augmented reality (SAR) system as a means of operating a manipulator on the scooter project [1], [2], [3], [4], we wanted to explore other applications of the approach. We went on to implement the system to communicate intent and improve robot explainability for a manipulator [5] and a mobile robot [6].

Driven by a desire to understand how this type of intent communication shapes interactions between humans and mobile robots, we set out to design a study measuring whether using SAR to communicate a mobile robot's intent improves the throughput of a human-robot team.

Robot implementation

We implemented our SAR system on a mobile robot with several improvements over our prior work:

- We started with a Pioneer 3-DX mobile robot base, but its limited payload capacity and our desire to mount the projector higher led us to switch to a Fetch mobile manipulator.
- The projector is mounted above Fetch's head, and rendering of the projected image is handled by an NVIDIA Jetson AGX Xavier.
- A wide-angle photography lens is mounted in front of the Epson EF12 projector's front element with a 3D-printed bracket.
- Power to the projector and computer is provided by a lithium-ion battery bank separate from Fetch's onboard batteries, enabling a runtime of about 2 hours at a power consumption of approximately 100 W.
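Those last two figures imply the usable capacity the separate battery bank must supply; as a back-of-the-envelope check (assuming the roughly 100 W draw is sustained for the full runtime):

$$E = P \times t \approx 100\ \mathrm{W} \times 2\ \mathrm{h} = 200\ \mathrm{Wh}$$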

Path projection demo

This video shows the robot’s intended path being projected onto the floor as a green line. The robot’s goal is shown as a larger green circle at the end of the path.
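A minimal sketch of that rendering step, assuming the planner provides (x, y) waypoints in floor coordinates and that a floor-to-projector homography is available from projector-camera calibration (e.g., via a method like [7]); the names and values below are illustrative, not our actual implementation:

```python
import cv2
import numpy as np

PROJ_W, PROJ_H = 1920, 1080       # projector output resolution (assumed)
GREEN = (0, 255, 0)               # BGR
H_FLOOR_TO_PROJ = np.eye(3)       # placeholder; a real H comes from calibration

def render_path(waypoints_m):
    """Render the planned path as a green line with a filled circle at the goal."""
    frame = np.zeros((PROJ_H, PROJ_W, 3), dtype=np.uint8)
    pts = np.asarray(waypoints_m, dtype=np.float32).reshape(-1, 1, 2)
    # Map floor coordinates (metres) into projector pixel coordinates.
    px = cv2.perspectiveTransform(pts, H_FLOOR_TO_PROJ).astype(np.int32)
    cv2.polylines(frame, [px], isClosed=False, color=GREEN, thickness=8)
    goal = (int(px[-1, 0, 0]), int(px[-1, 0, 1]))
    cv2.circle(frame, goal, 40, GREEN, thickness=-1)   # larger goal marker
    return frame

frame = render_path([(0.2, 0.2), (0.8, 1.5), (1.5, 3.5)])
```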

Study design

Our study investigates whether our system improves the throughput of a human-robot team compared to a control condition in which the robot does not communicate its intent. We constructed a 2 m × 4 m workspace with a series of goals placed around the perimeter. The human and robot are assigned goals synchronously, and specific types of path conflicts are created intentionally through careful goal selection. A smartphone app guides the human to each goal and prompts them to scan a QR code there.
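As one illustration of that goal selection, the sketch below picks a pair of goals whose straight-line approach paths cross, producing a head-on path conflict. The goal labels, coordinates, and selection policy here are hypothetical, not the study's actual assignment logic:

```python
from itertools import permutations

# Hypothetical goal positions (metres) around the 2 m x 4 m workspace perimeter.
GOALS = {"A": (0.0, 1.0), "B": (2.0, 1.0), "C": (2.0, 3.0), "D": (0.0, 3.0)}

def _ccw(p, q, r):
    """Signed area test: positive if p -> q -> r turns counter-clockwise."""
    return (q[0] - p[0]) * (r[1] - p[1]) - (q[1] - p[1]) * (r[0] - p[0])

def segments_cross(a1, a2, b1, b2):
    """True if segments a1-a2 and b1-b2 properly intersect."""
    return (_ccw(a1, a2, b1) * _ccw(a1, a2, b2) < 0
            and _ccw(b1, b2, a1) * _ccw(b1, b2, a2) < 0)

def conflicting_pair(human_pos, robot_pos):
    """Choose distinct goals whose direct paths from the current positions cross."""
    for hg, rg in permutations(GOALS, 2):
        if segments_cross(human_pos, GOALS[hg], robot_pos, GOALS[rg]):
            return hg, rg
    return None

# Human at the bottom of the workspace, robot at the top: their assigned
# paths are forced to cross somewhere in the middle.
print(conflicting_pair((1.0, 0.0), (1.0, 4.0)))
```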

The shared workspace, with goals around its periphery
A goal: lifting a flap reveals a QR code to be scanned with the smartphone app
The smartphone app, prompting the participant to scan the QR code at goal B

A ROS node assigns goals to both the human and the robot. The smartphone app we developed retrieves the human's goals through a REST API integrated into this goal-assignment node. Each completed goal is logged to a MongoDB database with its goal ID and duration, which lets us calculate completion times for different types of interactions.
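A minimal sketch of what that REST API and logging path could look like, assuming Flask and pymongo; the route names, database schema, and round-robin assignment policy are illustrative stand-ins, not the study's actual code (in the real system, goals come from the ROS assignment node):

```python
import time
from itertools import cycle
from flask import Flask, jsonify
from pymongo import MongoClient

app = Flask(__name__)
completions = MongoClient("mongodb://localhost:27017")["study"]["completions"]

goal_cycle = cycle(["A", "B", "C", "D"])   # placeholder assignment policy
active = {}                                # participant_id -> (goal_id, start_time)

@app.route("/goal/<participant_id>")
def next_goal(participant_id):
    """Hand the smartphone app its next goal and start timing."""
    goal_id = next(goal_cycle)
    active[participant_id] = (goal_id, time.time())
    return jsonify({"goal": goal_id})

@app.route("/complete/<participant_id>", methods=["POST"])
def complete(participant_id):
    """Called after a QR scan: log goal ID and duration for later analysis."""
    goal_id, start = active.pop(participant_id)
    completions.insert_one({
        "participant": participant_id,
        "goal": goal_id,
        "duration_s": time.time() - start,
    })
    return jsonify({"ok": True})

if __name__ == "__main__":
    app.run(port=5000)
```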

The execution of this study design remains future work as of the time of writing.

References

[1] D. Wang et al., "Towards assistive robotic pick and place in open world environments," in The International Symposium of Robotics Research, Springer, 2019, pp. 360–375.
[2] A. Wilkinson, A. Sinclaire, and H. Yanco, "Spatial augmented reality user interface for assistive robot manipulation," in ACM/IEEE HRI 2023 Workshop on Virtual, Augmented, and Mixed Reality for Human-Robot Interactions (VAM-HRI), 2023.
[3] A. Wilkinson et al., "Design guidelines for human–robot interaction with assistive robot manipulation systems," Paladyn, vol. 12, no. 1, pp. 392–401, 2021.
[4] A. Sinclaire, A. Wilkinson, B. Kim, and H. A. Yanco, "Comparison of user interface paradigms for assistive robotic manipulators," in 2025 IEEE International Conference on Robotics and Automation (ICRA), IEEE, 2025.
[5] Z. Han, A. Wilkinson, J. Parrillo, J. Allspaw, and H. A. Yanco, "Projection mapping implementation: Enabling direct externalization of perception results and action intent to improve robot explainability," in Proceedings of the AI-HRI Symposium at AAAI-FSS 2020, 2020.
[6] Z. Han, J. Parrillo, A. Wilkinson, H. A. Yanco, and T. Williams, "Projecting robot navigation paths: Hardware and software for projected AR," in 2022 17th ACM/IEEE International Conference on Human-Robot Interaction (HRI), IEEE, 2022, pp. 623–628.
[7] S. Audet and M. Okutomi, "A user-friendly method to geometrically calibrate projector-camera systems," in 2009 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops, IEEE, 2009, pp. 47–54.