Human AugmentatioN via Dexterity

Revolutionizing Robot Dexterity and Empowering Human Work

Mapping Dexterous Action to Interface Elements

This project will develop a better understanding of how people express their intent in dexterous tasks (such as opening a jar of peanut butter, scooping some out, and spreading it on a slice of bread), model their preferences for how task actions should be performed, and create multimodal interfaces that map high- and low-level task instructions to dexterous task primitives.

Project Lead

Bilge Mutlu

University of Wisconsin, Madison

Project Team

Brenna Argall

Northwestern University and Shirley Ryan AbilityLab

Julie Shah

Massachusetts Institute of Technology

Using a multimodal interface, a user instructs a dexterous robot system to perform a desired task based on the user's preferences.