Reality, Autonomy, and Robot Experience (RARE) Lab

Mission

Our mission is to solve the grand challenges in human-centered robotics by designing and studying preferred robot interactions and capable robotic systems powered by AI.

Vision

We envision an era of interactive robots that provide superior experiences for the humans who interact, collaborate, team up, and live with them.


News

Learn about the RARE Lab’s latest research, outreach, and members’ achievements!


Ph.D. students Jingjing Li and Yixi Chen joined the RARE Lab. Welcome!

NSF Funding

The National Science Foundation (NSF) has awarded $411,578 to Dr. Zhao Han (PI) for “Collaborative Research: FRR: Enabling Robots with Adaptive Projector-Based AR in Dynamic Environments” with Dr. Arie Kaufman from Stony Brook University!

Congrats to RARE Lab PhD student Xiangfei Kong on being awarded the Rada Scholarship in AI and Healthcare for the 2025–2026 school year!

2025 OneUSF Summer Undergraduate Research Symposium

Our preliminary work on the RARE Lab’s Bloom robot, featuring an application for older adults, was presented at the OneUSF Summer Undergraduate Research Symposium! Good job, Andrew, Rex, and Sofia!

2025 Summer CRA UR2PhD

Congrats to our 11 undergraduate researchers on completing the UR2PhD research course: Aarav, Paramveer, Thao, Aditi, Abrar, Sofia, Anjali, Rex, Miguel, and Thuc Anh!

Also, congrats to Amanual and Xiangfei on finishing the Graduate Student Mentorship Training!

Summer 2025 Lab Lunch

June is here, and summer has begun. Welcome, new and returning lab members!

Xiangfei Kong MS Thesis Defense

Congrats to Xiangfei Kong on successfully defending her MS thesis!

RARE Lab at ICRA 2025

We presented the RARE Lab’s preliminary work on explainable robot learning and a storytelling robot at ICRA 2025, the top robotics conference, in Atlanta. Stay tuned for our full papers and more!

Research

Our research focuses on interdisciplinary human-robot interaction (HRI), drawing on robotics, AI, augmented reality (AR), cognitive science, and psychology… See Research Overview →

Recent Publications

Implementing LLM-Integrated Storytelling Robot
ICRA 2025 FMNS

Towards Embodied Agent Intent Explanation in Human-Robot Collaboration: ACT Error Analysis and Solution Conceptualization
ICRA 2025 HCRL

A Controller for Robots to Autonomously Control Fog Machine
VAM-HRI 2025

Exploring Familiar Design Strategies to Explain Robot Vision Capabilities
XHRI 2025

Anywhere Projected AR for Robot Communication: A Mid-Air Fog Screen-Robot System
HRI 2025

Introduction to the Special Issue on Artificial Intelligence for Human-Robot Interaction (AI-HRI)
THRI

Do Results in Experiments with Virtual Robots in Augmented Reality Transfer To Physical Robots? An Experiment Design
WYSD 2024

To Understand Indicators of Robots’ Vision Capabilities
VAM-HRI 2024