Human AugmentatioN via Dexterity

Revolutionizing Robot Dexterity and Empowering Human Work

Vision

Our vision is a future in which robotic manipulation is intelligent, dexterous, and versatile; accessible to all people and all firms (small and large); widely used to augment hands-on work; and a strong contributor to a more resilient society and productive economy.

The HAND ERC is working toward an engineered system comprising (1) vision-and-tactile-sensor-enabled robotic end-effectors, integrable with commercially-available robot arms and humanoids, and capable of diverse dexterous behaviors, including working with human tools; (2) high-fidelity simulation (including simulation of soft skins and tactile sensors) for digital twin functions and sim-to-real development of dexterous skills; (3) an open and extensible library of robust, composable, error-correcting dexterous skills; and (4) an intuitive, low-code interface and training materials enabling fast deployment of dexterous robots by non-specialists.

Our goal is to enable highly dexterous robotic manipulation “out of the box” by 2035. A non-specialist will be able to set up a bimanual robot system and, within a day, have it robustly perform tactile-and-vision-based dexterous skills included in the skill library, such as twisting a cap on a jar or using hand tools. By the end of a week, the robot will productively perform entire dexterous tasks for the user’s particular application.
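To make this goal concrete, here is a minimal sketch of how a composable skill library behind a low-code interface might look to an end user. It is purely illustrative, written in Python under assumed names (SkillLibrary, SkillResult, and the skill ids are placeholders invented for this example); it is not HAND's actual software.

# Illustrative sketch only: a hypothetical low-code interface to a
# dexterous skill library. All names are placeholders, not HAND software.
from dataclasses import dataclass


@dataclass
class SkillResult:
    success: bool
    message: str = ""


class SkillLibrary:
    """A minimal registry of named, composable dexterous skills."""

    def __init__(self):
        self._skills = {}

    def register(self, name, fn):
        self._skills[name] = fn

    def run(self, name, **params):
        if name not in self._skills:
            return SkillResult(False, f"unknown skill: {name}")
        return self._skills[name](**params)


# A user sequences library skills into a task without low-level robot programming.
library = SkillLibrary()
library.register("grasp", lambda obj: SkillResult(True, f"grasped {obj}"))
library.register("twist_cap", lambda direction: SkillResult(True, f"twisted cap {direction}"))

task = [("grasp", {"obj": "jar"}), ("twist_cap", {"direction": "counterclockwise"})]
for skill, params in task:
    result = library.run(skill, **params)
    print(skill, "->", result.message)
    if not result.success:
        # An error-correcting skill would retry or switch strategies here.
        break

The design point the sketch illustrates is the one named in the vision: skills are named, composable, and report success or failure, so error-correcting behavior can be layered on top by a non-specialist.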

Strategy

How we are going to get there

To achieve our vision, we are pursuing four interwoven research activities: three basic research thrusts (Thrust 1: Hands, Thrust 2: Intelligent Dexterity, and Thrust 3: Human Interface) and System Integration and Testbeds. A full list of current projects in these four research activities is linked below.

[Graphic: robot hands manipulating a jar lid]

Thrust 1: Hands

Hand designs incorporating new technologies for breakthrough performance, upgraded every two years; soft, durable, multimodal tactile-sensing skin; and advanced actuators, including improved tendons, electrostatic clutches, and liquid crystal elastomer (LCE) fiber bundles.

[Graphic: a cloud surrounded by smaller clouds with icons representing thought and intelligence]

Thrust 2: Intelligent Dexterity

Intelligent Dexterity includes the development of a dexterous skill library; methods for training skills, including behavior cloning, reinforcement learning, and sim-to-real transfer; sensor-based control; and simulation of soft tactile skin and high-degree-of-freedom hands, supporting broad distribution of HAND's results.
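As a concrete illustration of one of the training methods named above, the sketch below shows bare-bones behavior cloning: a policy network is fit by supervised regression to observation-action pairs recorded from demonstrations. It is a generic PyTorch example with made-up dimensions and random stand-in data, not HAND's implementation.

# Minimal behavior cloning sketch (generic illustration, not HAND code).
# A policy maps observation features (e.g., vision + tactile) to actions,
# trained by supervised regression on demonstration data.
import torch
import torch.nn as nn

obs_dim, act_dim = 64, 16  # assumed sizes, for illustration only
policy = nn.Sequential(
    nn.Linear(obs_dim, 256), nn.ReLU(),
    nn.Linear(256, 256), nn.ReLU(),
    nn.Linear(256, act_dim),
)
optimizer = torch.optim.Adam(policy.parameters(), lr=1e-3)

# Stand-in for recorded demonstrations: (observation, expert action) pairs.
demo_obs = torch.randn(1000, obs_dim)
demo_act = torch.randn(1000, act_dim)

for epoch in range(10):
    pred = policy(demo_obs)                        # predicted actions
    loss = nn.functional.mse_loss(pred, demo_act)  # imitate the expert
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
print(f"final imitation loss: {loss.item():.4f}")

This sketch shows only the supervised imitation step; as the thrust description notes, such demonstration-trained skills are paired with reinforcement learning and sim-to-real transfer.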

[Illustration: a human wearing VR goggles and robot gloves]

Thrust 3: Human Interface

Multimodal interfaces, including haptics, virtual reality, and natural language; low-code programming that reduces the cost and time of integration; and social, legal, and industrial studies of the applications and societal impact of dexterous robots.

System Integration and Testbeds

Our research will be embodied and tested in six synergistic testbeds: the research testbed, the Dexterity Nexus (DexNex), and five application testbeds, TB1-TB5. The application testbeds span a wide range of applications to (1) leverage existing partnerships, (2) target high-impact domains, and (3) stretch the requirements of our dexterous hands to promote the development of truly versatile end-effectors.

DexNex Research Testbed

DexNex, based at Northwestern, integrates the latest advances from across HAND, validates technology, and prototypes new applications with ecosystem partners.

[Photo: a person wearing VR goggles teleoperating robot arms that are soldering wires]
[Map: the five application testbed locations across the mid-to-eastern United States]

Application Testbeds

TB1-Mfg: High-mix manufacturing, Berkshire Innovation Center, Pittsfield, MA

TB2-Assembly: Assembly/disassembly, Carnegie Mellon Manufacturing Futures Institute, Pittsburgh, PA

TB3-Assist: Assistance and caregiving, Shirley Ryan AbilityLab, Chicago, IL

TB4-HiCons: High-consequence material handling, Texas A&M University, College Station, TX

TB5-DexEd: Dexterity education testbed, Florida A&M University, Tallahassee, FL

Research Leadership

Kevin Lynch, Northwestern University
HAND Center Research Director

Carmel Majidi, Carnegie Mellon University
Lead, Thrust 1: Hands

Brenna Argall, Northwestern University
Lead, Thrust 2: Intelligent Dexterity

Julie Shah, Massachusetts Institute of Technology
Lead, Thrust 3: Human Interface

Research Team

Rob Ambrose, Texas A&M University

Ben Armstrong, Massachusetts Institute of Technology

Shonda Bernadin, Florida A&M University

Jian Cao, Northwestern University

Jonathan Clark, Florida A&M University (Co-lead, TB5-DexEd)

J. Edward Colgate, Northwestern University

Tarik Dickens, Florida A&M University

Matthew Elwin, Northwestern University

Gary Fedder, Carnegie Mellon University

Rebecca Friesen, Texas A&M University

Elizabeth Gerber, Northwestern University

M. Cynthia Hipwell, Texas A&M University

Roberta Klatzky, Carnegie Mellon University

Brandon Krick, Carnegie Mellon University

Oliver Kroemer, Carnegie Mellon University

Carl Moore, Florida A&M University

Todd Murphey, Northwestern University

Bilge Mutlu, University of Wisconsin, Madison

Melisa Orta-Martinez, Carnegie Mellon University

Nancy Pollard, Carnegie Mellon University

Vincent Sitzmann, Massachusetts Institute of Technology

Russ Tedrake, Massachusetts Institute of Technology

Dan Traficonte, Syracuse University

Ryan Truby, Northwestern University

Taylor Ware, Texas A&M University