Videos


Evaluating an Assistive-Feeding Robot with Users with Mobility Limitations

In this work, we explore user preferences for different modes of autonomy in robot-assisted feeding given perceived error risks, and we analyze the effect of input modalities on technology acceptance.
Specifically, we tested: Speed (Fast vs. Slow), Interface (Web-based vs. Voice-based), Environment (Social vs. Individual), and Level of Autonomy (Full vs. Partial vs. Low).


Online Learning for Food Manipulation

This video shows how a robot can learn to skewer previously unseen food items by adapting its distribution over candidate actions, using online learning with a contextual bandit formulation.
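For illustration, below is a minimal sketch of a LinUCB-style contextual bandit that chooses among a discrete set of skewering actions given a feature vector describing the food item. The action count, feature dimension, and reward definition are illustrative assumptions, not the exact formulation used in this work.

```python
import numpy as np

class LinUCB:
    """Minimal LinUCB contextual bandit: each arm is a candidate skewering
    action; the context x is a feature vector for the current food item."""

    def __init__(self, n_arms, dim, alpha=1.0):
        self.alpha = alpha
        self.A = [np.eye(dim) for _ in range(n_arms)]    # per-arm design matrices
        self.b = [np.zeros(dim) for _ in range(n_arms)]  # per-arm reward-weighted sums

    def select(self, x):
        """Pick the arm with the highest upper confidence bound."""
        scores = []
        for A, b in zip(self.A, self.b):
            A_inv = np.linalg.inv(A)
            theta = A_inv @ b
            scores.append(theta @ x + self.alpha * np.sqrt(x @ A_inv @ x))
        return int(np.argmax(scores))

    def update(self, arm, x, reward):
        """Update the chosen arm with the observed acquisition outcome."""
        self.A[arm] += np.outer(x, x)
        self.b[arm] += reward * x

# Hypothetical usage: 6 candidate skewering actions, 128-d food features
bandit = LinUCB(n_arms=6, dim=128, alpha=0.5)
x = np.random.rand(128)          # features of the current food item
a = bandit.select(x)             # choose a skewering action
bandit.update(a, x, reward=1.0)  # 1.0 = successful bite acquisition
```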


Generalizing Skewering Strategies across Food Items

This video summarizes our work on the SPANet framework and shows demonstrations of the robot's skewering trials, which generalize across food items.


Transfer depends on Acquisition: Analyzing Manipulation Strategies for Robotic Feeding


Bite Acquisition with Tactile Sensing

The video shows calibration of FingerVision and Fingertip GelSight sensors for a food manipulation application. The sensitivity of these sensors increases as the gripping force decreases. In addition, it shows a control policy with which a robot can adjust the range and sensitivity of these tactile sensors by modulating its gripping force.
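A rough sketch of this idea follows: a lower gripping force yields higher tactile sensitivity but a smaller measurable range, and vice versa, so the robot picks a target gripping force and servoes toward it using the tactile reading. The hardness-to-force mapping, gains, force limits, and the read_force/set_force interfaces are hypothetical placeholders, not the actual policy from the video.

```python
def select_grip_force(expected_hardness, f_min=0.5, f_max=4.0):
    """Map expected food hardness in [0, 1] to a commanded gripping force (N).

    Soft items get a low force (high sensor sensitivity); hard items get a
    higher force (larger usable sensing range)."""
    h = min(max(expected_hardness, 0.0), 1.0)
    return f_min + h * (f_max - f_min)

def servo_grip_force(desired_force, read_force, set_force, kp=0.3, steps=50):
    """Proportional loop: nudge the commanded gripping force until the
    tactile force estimate matches the desired force."""
    cmd = desired_force
    for _ in range(steps):
        set_force(cmd)                       # command the gripper (placeholder interface)
        error = desired_force - read_force() # tactile force estimate (placeholder interface)
        cmd += kp * error
    return cmd
```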


Towards Robotic Feeding: Role of Haptics in Fork-based Food Manipulation