We are extending computational models to interpret and understand human activity from multi-modal data, in the context of novel applications in human-robot and human-computer interaction. Using current sensor technologies, features of a user's body movements, speech, and physiological signals can be extracted in real time throughout an interaction and used to classify the user's beliefs, desires, and intentions. Specifically, we are interested in enabling the robot to detect spatially situated social cues, such as transitions into, within, and out of an interaction. We are also developing methods for the robot to provide feedback to a user during exercise and other task-performance sessions, taking into account the user's physiological data, such as heart rate, respiration rate, skin temperature, and accelerometry. Our methods employ a variety of machine learning and signal processing techniques toward robust activity recognition and understanding, supporting improved autonomous decision making by socially situated agents in human-machine interaction contexts.
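As a minimal sketch of the kind of pipeline described above (not the lab's actual implementation), the example below extracts simple features from a physiological signal over sliding windows and classifies the user's state with a nearest-centroid rule. The heart-rate values, window parameters, and the "rest"/"exertion" labels and centroids are all hypothetical, chosen only for illustration.

```python
# Illustrative sketch: sliding-window feature extraction over a
# physiological signal, then nearest-centroid user-state classification.
# All signal values, labels, and centroids below are hypothetical.
from statistics import mean, stdev

def window_features(signal, size, step):
    """Extract (mean, stdev) features from overlapping windows."""
    feats = []
    for start in range(0, len(signal) - size + 1, step):
        w = signal[start:start + size]
        feats.append((mean(w), stdev(w)))
    return feats

def nearest_centroid(feat, centroids):
    """Return the label of the centroid closest to feat in feature space."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist2(feat, centroids[label]))

# Hypothetical heart-rate stream (beats per minute), one sample per second.
hr = [62, 64, 63, 65, 90, 95, 97, 94]
centroids = {"rest": (64.0, 1.5), "exertion": (94.0, 2.5)}

for f in window_features(hr, size=4, step=4):
    print(nearest_centroid(f, centroids))  # → rest, then exertion
```

In a real system these hand-crafted window statistics would typically be replaced by richer learned features and classifiers, but the window-then-classify structure is the same.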
Data-Driven Interaction Methods for Socially Assistive Robotics: Validation With Children With Autism Spectrum Disorders
Human Activity Monitoring for Socially Assistive Interaction with Older Adults
Using Emotion Modeling for User State Identification
Human Perception of Synthetic Character Emotion
Automated Detection and Classification of Positive vs. Negative Robot Interactions with Children with Autism
People-Aware Navigation For Goal-Oriented Behavior Involving a Human Partner
Modeling using Overhead Camera
Natural Methods for Human-Robot Interaction
Learning by Demonstration: A Human-Inspired Approach
Perceptually Motivated Symbol Generation
Using Hands-Off Assistive Robotics for Educational Intervention for Children with Attention Deficit/Hyperactivity Disorder (AD/HD)
Evaluating Arm Imitation
Metric for the Evaluation of Imitation
A Hierarchical Model for Imitation
Psychophysics of Imitation
Action-Embedded Framework for Human-Robot Interaction and Learning from Demonstration
Collecting Statistics from Collective Human Motion
Individual and Collective Spatial Representations in Crowds and Other Social Behaviors
Human Activity Modeling using Laser Range-Finders
On-Line Modeling of Robot Interaction Dynamics
Laser-Based People & Robot Tracking
Modeling Multi-Robot and Human Activities
Parametric Motor Primitives for Facilitating Humanoid Robot Perception and Control
Skill Learning by Primitives-Based Demonstration and Imitation
Derivation of Motion Primitives with Spatio-Temporal Dimension Reduction
Volume-Based Human Motion Capture
Perceptual-Motor Primitives
Extracting Motion Primitives from Movement Data
Learning of Perceptuo-Motor Primitives
Primitives for Motor Control in Martial Arts