Skill Learning by Primitives-Based Demonstration and Imitation


Overview

The Interaction Lab contributes to the MARS-2020 project as a subcontractor to NASA. Our work focuses on the generation and use of primitive motions for humanoid motion control. Within the scope of the MARS-2020 project, our efforts are directed toward the Robonaut humanoid robot.


Team & Contact Info
Principal Investigator (PI):
Postdocs:
Graduate Students:
Other Participants:
  • Nathan Miller

Mail either the PI or Team Leader.


Relevant Prior Work
Projects

Real-Time Human Motion Tracker

The image shows a 3-DOF rotational sensor being developed at the lab as a wireless, low-cost motion-capture solution capable of logging high-DOF human motion in unstructured environments. Each sensor node is based on an 8-bit Atmel microcontroller running at 8 MHz with a 10-bit ADC, and includes filtered 300 deg/s gyroscopes and 2 g accelerometers.
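
A standard way to fuse such gyroscope and accelerometer readings into a drift-corrected orientation estimate is a complementary filter. The single-axis Python sketch below is illustrative only; the gain, sampling rate, and sample values are assumptions, not the firmware running on the sensor nodes.

    import math

    ALPHA = 0.98   # assumed weight on the integrated gyro estimate
    DT = 0.01      # assumed sampling period in seconds (100 Hz)

    def tilt_from_accel(ax, az):
        """Tilt angle (degrees) about one axis from gravity components."""
        return math.degrees(math.atan2(ax, az))

    def complementary_update(angle, gyro_rate, ax, az):
        """Fuse one gyro (deg/s) and accelerometer (g) sample into the
        running angle estimate. The gyro integral tracks fast motion but
        drifts; the accelerometer tilt is drift-free but noisy; blending
        the two cancels both flaws."""
        gyro_angle = angle + gyro_rate * DT    # integrate angular rate
        accel_angle = tilt_from_accel(ax, az)  # absolute but noisy
        return ALPHA * gyro_angle + (1.0 - ALPHA) * accel_angle

    # Example with two made-up samples of (gyro_rate, ax, az):
    angle = 0.0
    for gyro_rate, ax, az in [(12.0, 0.05, 0.99), (11.5, 0.06, 0.99)]:
        angle = complementary_update(angle, gyro_rate, ax, az)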

Humanoid Motion Planning

The image shows the type of roadmap graph we are using for applications such as motion planning and collision-free inverse kinematics (IK). We are currently investigating the use of demonstrated motion data to build efficient roadmaps. Nodes and edges of this 17-dimensional roadmap are drawn in the image in two colors, showing the positions of the right wrist (red) and left wrist (green).
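
For context, collision-free roadmaps of this kind are typically built with a probabilistic-roadmap (PRM) loop: sample configurations, keep the collision-free ones, and connect nearby pairs whose interpolation is also free. The Python sketch below is a minimal illustration; the sampler, collision checker, and connection radius are placeholders, not our 17-DOF Robonaut implementation.

    import numpy as np

    def build_roadmap(n_nodes, dim, is_collision_free, radius, sample=None):
        """Minimal PRM: sample collision-free configurations, then connect
        nearby pairs whose straight-line interpolation is also free."""
        sample = sample or (lambda: np.random.uniform(-np.pi, np.pi, dim))
        nodes, edges = [], []
        while len(nodes) < n_nodes:
            q = sample()
            if is_collision_free(q):
                nodes.append(q)
        for i in range(len(nodes)):
            for j in range(i + 1, len(nodes)):
                if np.linalg.norm(nodes[i] - nodes[j]) < radius and all(
                    is_collision_free(nodes[i] + t * (nodes[j] - nodes[i]))
                    for t in np.linspace(0.0, 1.0, 10)
                ):
                    edges.append((i, j))
        return nodes, edges

    # Example with a trivial checker that treats all space as free:
    nodes, edges = build_roadmap(50, 17, lambda q: True, radius=2.0)

Biasing the sampler toward demonstrated configurations, rather than sampling uniformly, is one way to obtain the smaller, task-relevant roadmaps described above.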

Example videos of collision-free two-arm motions obtained with Robonaut:
bothup (4.1MB), updown (5.8MB), and around (3.2MB).
See also: Learning Reaching Primitives from Demonstrated Motion.

Hierarchical Model for Learning by Imitation

The goal of this project is to develop models of motor learning for articulated agents through imitation. We are interested in methods suited to real-time interaction and capable of learning online.

Example videos:

  • Parametric primitives: Demonstration [left] and imitation [right] of "figure-8" movement [mpg, 982KB] (a toy parametrization of this kind of primitive is sketched below).
  • Sequence learning: Demonstration [left] and imitation [right] of a movement sequence [mpg, 3MB].
See also: Metric for the Evaluation of Imitation.
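
To illustrate what "parametric" means here, a figure-8 can be generated by two phase-locked sinusoids (a Lissajous curve) whose amplitudes and frequency are the primitive's free parameters. The joint assignments in this Python sketch are hypothetical and for illustration only.

    import numpy as np

    def figure_eight(t, A=0.3, B=0.15, omega=1.0):
        """Toy parametric "figure-8" primitive as a Lissajous curve.
        The amplitudes A, B (rad) and frequency omega (rad/s) are the
        free parameters one would fit to demonstrated motion."""
        q1 = A * np.sin(omega * t)         # e.g. a shoulder joint angle
        q2 = B * np.sin(2.0 * omega * t)   # e.g. an elbow at double frequency
        return np.stack([q1, q2], axis=-1)

    t = np.linspace(0.0, 2.0 * np.pi, 200)
    trajectory = figure_eight(t)           # (200, 2) joint-angle samples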

Parametric Motor Primitives and Bayesian Motion Classification

We have developed a movement classifier aimed at improving motion-based communication between humans and humanoids. It uses Bayesian classification to categorize joint-angle data into motor primitives, which are parametric, kinematic models of motion.
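
Conceptually, classification assigns an observed movement to the primitive with the highest posterior probability given its joint-angle features. A minimal Gaussian naive-Bayes sketch in Python might look like the following; the feature representation and independence assumption are simplifications, not the exact model in our publications.

    import numpy as np

    class PrimitiveClassifier:
        """Gaussian naive Bayes over joint-angle feature vectors: each
        primitive class is modelled with per-dimension means and
        variances, and a new movement is assigned to the class with
        the highest log-posterior."""

        def fit(self, X, y):
            X, y = np.asarray(X, dtype=float), np.asarray(y)
            self.classes = np.unique(y)
            self.mean, self.var, self.logprior = {}, {}, {}
            for c in self.classes:
                Xc = X[y == c]
                self.mean[c] = Xc.mean(axis=0)
                self.var[c] = Xc.var(axis=0) + 1e-6  # avoid zero variance
                self.logprior[c] = np.log(len(Xc) / len(X))
            return self

        def predict(self, x):
            x = np.asarray(x, dtype=float)
            def log_posterior(c):
                return self.logprior[c] - 0.5 * np.sum(
                    np.log(2.0 * np.pi * self.var[c])
                    + (x - self.mean[c]) ** 2 / self.var[c])
            return max(self.classes, key=log_posterior)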

Automated Derivation of Motion Primitives

We employ spatio-temporal dimension reduction to analyze motion data. The image shows the path of 5 consecutive Robonaut grasp motions embedded in a 3D space constructed by applying spatio-temporal Isomap to the full joint-angle space of the original captured data. The structure of the motion was successfully recovered, even though each grasp was performed at a different Cartesian location in the workspace (work in collaboration with Alan Peters).
Motion primitives are used to construct vocabularies of behaviors that can be applied to various tasks.
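
To make the idea concrete, the Python sketch below builds a k-nearest-neighbor graph over the joint-angle frames, shortens edges between temporally adjacent frames so the embedding respects the order of the motion, and then applies classical MDS to the graph's geodesic distances. The neighborhood size, temporal weighting, and MDS step are simplifying assumptions, not the published spatio-temporal Isomap algorithm.

    import numpy as np
    from scipy.sparse.csgraph import shortest_path

    def st_isomap_embed(frames, k=8, c_temporal=0.1, dim=3):
        """Rough spatio-temporal Isomap-style embedding of an ordered
        sequence of joint-angle frames, shape (T, J). Assumes the
        resulting neighborhood graph is connected."""
        frames = np.asarray(frames, dtype=float)
        T = len(frames)
        d = np.linalg.norm(frames[:, None] - frames[None, :], axis=-1)
        graph = np.full((T, T), np.inf)            # inf marks "no edge"
        for i in range(T):
            for j in np.argsort(d[i])[1:k + 1]:    # k spatial neighbors
                graph[i, j] = graph[j, i] = d[i, j]
        for i in range(T - 1):                     # shorten temporal edges
            graph[i, i + 1] = graph[i + 1, i] = c_temporal * d[i, i + 1]
        geo = shortest_path(graph, method="D")     # geodesic distances
        # Classical MDS on the geodesic distance matrix.
        H = np.eye(T) - np.ones((T, T)) / T
        B = -0.5 * H @ (geo ** 2) @ H
        w, v = np.linalg.eigh(B)
        top = np.argsort(w)[::-1][:dim]
        return v[:, top] * np.sqrt(np.maximum(w[top], 0.0))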

Here are some example videos:

Presentations & Reports
Publications
Related Links

MARS-2020 links:

Funding Details

This work is supported by the DARPA MARS 2020 Program project "Acquisition of Autonomous Behaviors by Robotic Assistants", via the NASA subcontract grant NAG9-1444 "Skill Learning by Primitives-Based Demonstration & Imitation".

See also: a complete list of sponsors of the lab.