
Primitives-Based Imitation

We are pursuing a research program focused on developing a model of learning by imitation. As one of the most powerful yet poorly understood forms of learning in nature, imitation presents an important research problem in AI and machine learning, as well as in the behavior and neural sciences.

Imitation involves the interaction of perception, memory, and motor control, subsystems that typically use very different representations yet must interact to produce and learn novel behavior patterns. Gaining insight into the mechanisms of imitation is compelling for AI and the behavioral sciences: the propensity for imitation appears to be innate and the mechanism is phylogenetically old, yet its true and complete form, capable of acquiring arbitrary novel skills from observation, is very rare in nature. From a practical standpoint, imitation, even in its simple forms, is a faster and more efficient means of acquiring new behaviors than its traditional counterparts, classical conditioning and reinforcement learning; in humans it is critical during development and remains an important aspect of social interaction and adaptation throughout life.

Our work on imitation is driven by two main motivations:

  • to gain insight into the complex mechanisms underlying imitation
  • to make human-robot interaction, including robot control and learning, more direct and natural, through the use of imitation

    We are addressing both by developing a model of imitation and evaluating it on different test-beds and learning tasks. The development of the model is constrained by cognitive science and neuroscience data, and we perform psychophysical experiments to gain further insight into human perceptual and motor behavior in imitation. The model is founded on three principles: a strong link between perception and action (such as that found in mirror neurons); the use of motor primitives for movement (such as spinal vector fields and central pattern generators, as well as cortical structures), which can also facilitate movement recognition; and the use of imitative abilities as a basis for communication and higher-level cognition.

    We have conducted several psychophysical experiments to study movement perception and reconstruction in imitation. Using those data, together with current theories from neuroscience and cognitive science, we are developing a model of learning by imitation based on the notions of perceptual-motor primitives (related to basis behaviors), mirror neurons, and classification-based learning. We are experimenting with innate primitives, as well as with bottom-up methods for automatically extracting primitives from movement data. We evaluate all models on complex test-beds: a 20-DOF dynamical simulation of a humanoid torso, two humanoid avatars, and a robot dog, all receiving visual input of human movement collected with visual motion capture, 3D Cartesian-space markers (FastTrak, Optotrak), or full-body joint angles (Sarcos SenSuit).
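    As an illustration of the bottom-up primitive extraction mentioned above, the minimal sketch below segments a joint-angle trajectory at near-zero-velocity frames and runs PCA over the time-normalized segments, keeping the top eigenvectors as candidate primitives. The function names, the velocity-threshold heuristic, and the parameter values are illustrative assumptions, not the lab's actual pipeline.

```python
import numpy as np

def segment_movement(angles, vel_threshold=0.05):
    """Split a joint-angle trajectory (T x D) into segments at
    near-zero-velocity frames -- a common heuristic for locating
    movement boundaries. Very short fragments are discarded."""
    vel = np.linalg.norm(np.diff(angles, axis=0), axis=1)
    cuts = np.where(vel < vel_threshold)[0]
    bounds = [0] + list(cuts) + [len(angles) - 1]
    return [angles[a:b] for a, b in zip(bounds[:-1], bounds[1:]) if b - a > 5]

def derive_primitives(segments, n_samples=20, n_components=3):
    """Time-normalize each segment to n_samples frames, flatten, and
    run PCA; the top principal directions serve as candidate
    movement primitives."""
    resampled = []
    for s in segments:
        idx = np.linspace(0, len(s) - 1, n_samples).astype(int)
        resampled.append(s[idx].ravel())
    X = np.array(resampled)
    X -= X.mean(axis=0)
    # Eigen-decomposition of the covariance matrix (eigh returns
    # eigenvalues in ascending order, so reverse the columns).
    _, vecs = np.linalg.eigh(X.T @ X)
    return vecs[:, ::-1][:, :n_components].T
```

    In practice a segment would then be encoded by its projection onto these directions, turning primitive recognition into a classification problem in the reduced space.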

    Our immediate research goals are:

    1) To keep refining our biologically-inspired model of learning by imitation, using neuroscience evidence of phylogenetically old structures, such as mirror neurons and motor primitives, as philosophical keystones and constraints.

    2) To explore different types of primitives, focusing on oscillatory, discrete, and postural primitives.

    3) To gather and use a large corpus of human movement data for automatically deriving primitives.

    4) To validate the model on different implementations for various complex motor tasks, including dance patterns and the manipulation of objects, and test it on and across different test-beds.

    5) To extend the learning system to include hierarchical learning of arbitrarily complex tasks.

    6) To develop methods for evaluation of imitation performance.
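    Goal 6 can be made concrete with a simple candidate measure, sketched below under the assumption that the demonstrated and imitated movements are available as joint-angle trajectories: time-normalize both and compute the RMS joint-angle error. This is only one plausible metric for illustration, not the lab's published one.

```python
import numpy as np

def resample(traj, n=100):
    """Linearly interpolate a (T x D) trajectory onto n uniform time steps."""
    t_old = np.linspace(0.0, 1.0, len(traj))
    t_new = np.linspace(0.0, 1.0, n)
    return np.stack([np.interp(t_new, t_old, traj[:, d])
                     for d in range(traj.shape[1])], axis=1)

def imitation_error(demo, imit, n=100):
    """RMS joint-angle error between time-normalized trajectories.
    0 means a perfect reproduction; larger values mean worse imitation."""
    d, i = resample(demo, n), resample(imit, n)
    return float(np.sqrt(np.mean((d - i) ** 2)))
```

    Time-normalizing first makes the measure insensitive to overall speed differences between demonstrator and imitator; a more refined metric might also allow nonlinear time warping or weight joints by task relevance.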

    This (somewhat old) poster summarizes the project goals.


    Projects
  • Psychophysical experiments
  • Humanoid motor control
  • An implementation of the model on a humanoid
  • Imitation with the Sony dog
  • A metric for evaluating imitation
  • Automated derivation of primitives
  • Representations for primitives
  • Hierarchical model for imitation
  • Human-robot interaction through imitation

  • Demonstrations of results
  • Experimental Test-Beds for Model Validation

    People:

    Maja Mataric (PI)
    Evan Drumwright
    Chad Jenkins
    Amit Ramesh

    Alumni:

    Monica Nicolescu
    Aude Billard
    Ajo Fod
    Stefan Weber


    Publications

    Support

    This work is supported by DARPA Grant DABT63-99-1-0015 under the Mobile Autonomous Robot Software (MARS and MARS-2020) programs. Past projects have been sponsored by the National Science Foundation under Grant No. 9896322.



    Home | Email: agents (at) robotics.usc.edu | Last Update 08/05/2007