Directed by Dr. Zhao Han, the RARE Lab conducts research broadly in human-robot interaction (HRI), artificial intelligence (AI), augmented reality, and robotics. Our mission is to solve the grand challenges in human-centered robotics by designing and studying preferred robot interactions and capable robotic systems powered by AI. For example, we enable robots to communicate efficiently and explain themselves so that they can fluently collaborate and interact with humans.
Robots stand to benefit society by providing solutions for industry, our daily lives, and the aging population, and our mission is essential for robots to gain wider acceptance and retain these benefits. In particular, we envision an era of interactive robots that offer superior experiences to the humans who interact, collaborate, team up, and live with them.
Learn about RARE Lab’s research areas, robot platforms, and projects below.

Preferred Human-Robot Interaction (HRI)
Designing, developing, and evaluating preferred robot interactions with humans

AI-Enabled Robotics
Developing capable robotic systems powered by AI for human environments with navigation, perception, manipulation, and language capabilities

AI and Cognitive Science
Developing human-centered AI algorithms, machine learning and foundation models (e.g., LLMs), and cognitive models to facilitate HRI

Augmented Reality (AR)
Developing head-worn and scalable projected AR for situated visualization
Research Areas
Our main research areas include:
- robot explanations for transparent & trustworthy AI-enabled robotic systems
- persuasive social robots for populations such as older adults, students & children
- augmented reality (AR) for situated visualization and replication
- collaborative mobile manipulation as a real-world evaluation testbed
- robot ethics for contributing to human social and moral ecosystems
- computational cognitive modeling for familiar interactions with humans
Current Projects
Mid-Air Projection for Robot Communication
2023/8 – present
How can a robot communicate by projecting visualizations in the air when no flat surface exists?
Explaining Robots’ Vision Capabilities
2023/8 – present
How can we indicate the differing field of view (FoV) of human-like robots?
Physical vs. AR Virtual Robot
2023/11 – present
Do people perceive virtual robots in augmented reality (AR) differently from physical robots?
LLM-Enabled Story-Telling Robots
2024/1 – present
How well do generative-AI-enabled story-telling robots perform in improving college students’ mental health?
Physical Protectives to Mitigate Robot Abuse
2024/6 – present
How do different physical protective indicators affect people’s perceptions of the likelihood that a female robot will be abused?
Dentists’ and Patients’ Perspectives on Robots
2024/9 – present
What do dentists and their patients think about robots in dental offices?
Making AI-Enabled Robots Explainable
2025/1 – present
How can we enable robots to explain false predictions from policies learned with transformer-based foundation models and LLMs?
Research Platforms
Robots & Equipment

Fetch
The 1.1–1.5 m physically capable wheeled mobile manipulator (photo: mobile manipulation)

Pepper
The 1.2 m wheeled humanoid robot with rich upper-body language, including hand expressions (photo: dance activity)

Misty
The 35 cm wheeled social robot (photo: story-telling with love emotion & arm movement)

SO-ARM 100
The robot arm for embodied AI or robot learning (photo: teleoperated to collect training data)

HoloLens 2
The optical see-through augmented reality display that overlays situated virtual content onto our physical world