Directed by Dr. Zhao Han, the RARE Lab conducts research broadly in human-robot interaction (HRI), artificial intelligence (AI), augmented reality, and robotics. Our mission is to solve the grand challenges in human-centered robotics by designing and studying preferred robot interactions and capable robotic systems powered by AI. For example, we enable robots to communicate efficiently and explain themselves so they can collaborate and interact fluently with humans.
Robots stand to benefit society, offering solutions for industry, our daily lives, and the aging population; our mission is thus essential for robots to gain broader acceptance and sustain these benefits. In particular, we envision an era of interactive robots that provide superior experiences to the humans who interact, collaborate, team, and live with them.

Preferred Human-Robot Interaction (HRI)
Designing, developing and evaluating preferred robot interactions with humans

AI-Enabled Robotics
Developing capable robotic systems powered by AI for human environments with navigation, perception, manipulation, and language capabilities

AI and Cognitive Science
Developing human-centered AI algorithms, machine learning and foundation models (e.g., LLMs), and cognitive models to facilitate HRI

Augmented Reality (AR)
Developing head-worn and scalable projected AR for situated visualization
Research Areas
Some of our main research areas include:
- robot explanations for transparent & trustworthy AI robotics systems
- persuasive social robots for populations such as older adults, students & children
- situated augmented reality (AR) for visualization and replication
- collaborative mobile manipulation as a real-world evaluation testbed
- robot ethics for contributing to human social and moral ecosystems
- computational cognitive modeling for familiar interactions
Application Domains
Representative application domains include:
- Unstructured environments, e.g., search & rescue, construction sites
- Assistive/social settings, e.g., elder care, education/children
- Industrial, e.g., collaborative manufacturing and assembling
- Disaster response and recovery, e.g., wildfire, hurricane
Current Projects
Mid-Air Projection
2023/8 ->
How can a robot communicate by projecting visualizations in the air when no flat surface exists?
Explaining Robots’ Vision Capabilities
2023/8 ->
How can we indicate the much narrower field of view (FoV) of human-like robots?
Physical vs. AR Virtual Robot
2023/11 ->
Would people perceive virtual robots in augmented reality (AR) differently?
LLM-Enabled Story-Telling Robots
2024/1 ->
How well would generative AI-enabled robots perform at improving college students’ mental health?

Preliminary work, with student support funded in part by CRA UR2PhD.
Physical Protectives Against Robot Abuse
2024/6 ->
How would different protective indicators impact people’s perceptions of the likelihood of a female robot being abused?
Dentists’ and Patients’ Perspectives of Robots
2024/9 ->
What are dentists’ and their patients’ opinions of dentistry robots?
Making AI Robots Explainable
2025/1 ->
How can we enable robots to explain false predictions from policies learned with transformer-based foundation models and LLMs?
Social Robots for Older Adult Companionship
2025/5 ->
How should we design robot appearance and companionship behaviors for older adults to improve their quality of life?

Preliminary work funded in part by the USF Multi-User Research Capital Investment Award (MuRCIA), USF-HARC.
Healthy Recipe AI Robot Recommender
2025/5 ->
How can we enable AI robots to persuade kids and parents to adopt healthy recipes?

Preliminary work funded in part by the USF Interdisciplinary Center Preparation Grants (ICPGs) Full Proposal, Bio-FAIRCH.
Research Platforms
Robots & Equipment

Fetch
The 1.1–1.5 m physically capable wheeled mobile manipulator (photo: mobile manipulation)

Pepper
The 1.2 m wheeled humanoid robot with rich upper-body language, including hand expressions (photo: dance activity)

Misty
The 35 cm wheeled social robot (photo: story-telling with a love emotion & arm movement)

SO-ARM 100
The robot arm for embodied AI or robot learning (photo: teleoperated to collect training data)

HoloLens
The optical see-through augmented reality (AR) display that overlays situated virtual content onto our physical world

Meta Quest
The virtual reality (VR) display that enables immersive and engaging experiences