HRI 2026 — Proceedings of the 21st ACM/IEEE International Conference on Human-Robot Interaction, 2026

The RUSH Checklist: A Standardized Framework for Reporting User Studies in Human-Robot Interaction

Shruti Chandra, Katie Seaborn, Giulia Barbareschi, Wing-Yue Geoffrey Louie, Shelly Bagchi, Sara Cooper, Zhao Han, Daniel Tozadore

Theory Paper Honourable Mention (top 2 in theory track; top 16% in full papers, 21/128)

Abstract

Transparent and consistent reporting of user studies is essential for advancing scientific knowledge. In human-robot interaction (HRI), studies are often reported incompletely, even in top-tier venues, limiting proper evaluation, replication, and the practical application of findings. This study aimed to generate expert consensus on a reporting checklist for HRI user studies and to provide a validated tool that improves transparency, reproducibility, and methodological rigor in the field, easing the translation of research into practice. A two-round Delphi study was conducted with 34 HRI experts from academia and industry across more than 12 countries. An international panel of nine interdisciplinary experts first synthesized a preliminary list of 116 reporting items from the literature. Experts rated the importance of each item and provided qualitative feedback. Consensus was defined as 70% agreement, and items were iteratively refined through anonymous online surveys. Overall, consensus was achieved on 106 items, encompassing both essential and context-dependent elements across nine domains. The resulting RUSH checklist (Reporting User Studies in Human-Robot Interaction) provides the first community-endorsed, consensus-based reporting guideline for HRI user studies.
