Research

Directed by Dr. Zhao Han, the RARE Lab conducts research broadly in human-robot interaction (HRI), artificial intelligence (AI), augmented reality, and robotics. Our mission is to solve the grand challenges in human-centered robotics by designing and studying preferred robot interactions and capable robotic systems powered by AI. For example, we enable robots to communicate efficiently and explain themselves so that they can collaborate and interact with humans fluently.

Because robots stand to benefit society by providing solutions for industry, our daily lives, and the aging population, our mission is essential for robots to gain broader acceptance and sustain these benefits. In particular, we envision an era of interactive robots that offer superior experiences to the humans who interact, collaborate, team up, and live with them.

Learn about RARE Lab’s research areas, robot platforms, and projects below.


Research Areas

Some main research areas include:

Preferred Human-Robot Interaction (HRI)


Designing, developing, and evaluating preferred robot interactions with humans


AI-Enabled Robotics


Developing capable robotic systems powered by AI for human environments, with navigation, perception, manipulation, and language capabilities


AI and Cognitive Science


Developing human-centered AI algorithms, machine learning and foundation models such as large language models (LLMs), and cognitive models to facilitate HRI


Augmented Reality (AR)


Developing head-worn and scalable projected AR for situated visualization

Current Projects


Mid-Air Projection for Robot Communication

2023/8 ->

How can a robot communicate by projecting visualizations in mid-air when no flat projection surface is available?

Explaining Robots’ Vision Capabilities

2023/8 ->

How can we indicate the differing fields of view (FoV) of human-like robots?


Physical vs. AR Virtual Robot

2023/11 ->

Would people perceive an AR virtual robot differently from a physical robot?

LLM-Enabled Story-Telling Robots

2024/1 ->

How effectively can generative AI-enabled storytelling robots improve college students' mental health?


Physical Protectives to Mitigate Robot Abuse

2024/6 ->

How do different protective indicators affect people's perceptions of the likelihood that a female robot will be abused?

Dentists’ and Patients’ Perspectives of Robots

2024/9 ->

What do dentists and their patients think about robots in dental offices?


Making AI-Enabled Robots Explainable

2025/1 ->

How can we enable robots to explain incorrect predictions from policies learned with transformer-based foundation models and LLMs?


Research Platforms

Robots & Equipment


Fetch

The 1.1–1.5 m physically capable wheeled mobile manipulator (photo: mobile manipulation)


G1 (arriving soon)

The 1.32 m legged humanoid robot designed to work in human environments (via USF HARC)


Pepper

The 1.2 m wheeled humanoid robot with rich upper-body expression, including hand gestures (photo: dance activity)


Go2 (arriving soon)

The 0.7 m zoomorphic robot dog (via USF HARC)


Moxie

The 38 cm social robot with sophisticated multimodal emotional expression (with Dr. Fan Yang)


Misty

The 35 cm wheeled social robot (photo: storytelling with a love emotion and arm movement)


SO-ARM 100

The robot arm for embodied AI or robot learning (photo: teleoperated to collect training data)


HoloLens 2

The optical see-through augmented reality display that overlays situated virtual content onto our physical world