Human AugmentatioN via Dexterity

Revolutionizing Robot Dexterity and Empowering Human Work

Essential Haptic Cues for Dexterous Tasks

This project investigates the high-resolution skin deformation cues essential for intuitive performance of dexterous tasks, guiding the development of multimodal haptic interfaces that convey contact and slip cues with the necessary temporal and spatial resolution. The outcomes will enhance robot teleoperation and supervision, inform sensor design for robotic hands, and advance haptic display technology, ultimately improving robotic dexterity and usability in real-world applications.

Project Leads

Rebecca Friesen

Texas A&M University

Melisa Orta-Martinez

Carnegie Mellon University

Project Team

Roberta Klatzky

Carnegie Mellon University

Two illustrations showing human hands twisting the top of a jar and fingertip sensing: capture and rendering of essential haptic cues for manipulation of a jar lid.