Activity Modeling and Understanding
We are extending computational models to interpret and understand human activity using multi-modal data, in the context of novel applications of human-robot and human-computer interaction. Using the latest sensor technologies, features of a user's body movements, speech, and physiological signals can be extracted in real time throughout an interaction and used to classify the user's beliefs, desires, and intentions. Specifically, we are interested in enabling the robot to detect spatially-situated social cues, such as a user's transitions into, through, and out of an interaction. We are developing methods for the robot to provide feedback to a user during exercise or other task-performance sessions that considers the user's physiological data, such as heart rate, respiration rate, skin temperature, and accelerometry. Our methods employ a variety of machine learning and signal processing techniques toward robust activity recognition and understanding, for improved autonomous decision making by socially-situated agents in human-machine interaction contexts.
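As a minimal illustrative sketch (not our actual pipeline), the windowed feature-extraction-and-classification idea can be shown on a single physiological channel: slide a window over a heart-rate stream, compute simple statistics per window, and apply a toy threshold rule. The window size, baseline value, and "active"/"rest" labels are invented for illustration.

```python
from statistics import mean, stdev

def window_features(samples, window=10):
    """Slide a fixed-size, non-overlapping window over a signal and emit simple features."""
    feats = []
    for i in range(0, len(samples) - window + 1, window):
        w = samples[i:i + window]
        feats.append({"mean": mean(w), "std": stdev(w)})
    return feats

def classify_engagement(hr_feats, resting_hr=70.0):
    """Toy rule: flag windows whose mean heart rate exceeds the resting baseline by 20%."""
    return ["active" if f["mean"] > 1.2 * resting_hr else "rest" for f in hr_feats]

hr = [68, 70, 69, 71, 70, 72, 69, 70, 71, 70,   # resting segment
      92, 95, 98, 96, 99, 101, 97, 95, 96, 98]  # exercising segment
labels = classify_engagement(window_features(hr))
print(labels)  # -> ['rest', 'active']
```

In practice the hand-tuned threshold would be replaced by a trained classifier over many such features from multiple modalities.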
Embodied Communication and Interaction
We are exploring the role of the robot's physical embodiment in its ability to engage, influence, and interact. We are examining the difference physical embodiment makes in social interaction, compared to non-physical but otherwise identical interactions (such as via smart phones, PDAs, and computers). We are interested in better understanding and leveraging the growing evidence favoring physically embodied robots over disembodied alternatives, which shows that robots are particularly effective at engaging human users and shaping their behavior. We are seeking to develop principled methods for controlling the robot's embodiment and expressiveness through "body language", including micro-social behaviors.
Influencing Social Dynamics
We are developing a heterogeneous set of computational models for complex mechanisms such as liking, trust, and motivation, drawn from social and cognitive science and known to influence behavior and improve relationships. We are interested in models that enable the robot to leverage its physical embodiment to effectively convey, train, and facilitate the human user's pursuit of recommended activities (e.g., exercise, treatment, training, and rehabilitation), moving toward the internalization of the goals of those activities and increased user autonomy, self-efficacy, and self-determination. These efforts also involve the study of micro-social behaviors (see Embodied Communication and Interaction) as a means of achieving high-level social influence and behavioral change.
Adaptation and Personalization
We are interested in developing and enhancing methods that allow robots to adapt to specific users and personalize their interactions in order to support effective, long-term human-robot interaction. Sustained interaction with people is a challenging learning problem: individuals' moods, states, and abilities are continually changing, and other factors may be hidden, inaccessible, change asynchronously, or exert latent influences on behavior. Additionally, human users typically become fatigued before providing ample training examples, act inconsistently, and require intuitive interfaces. Our work is driven by several motivations.
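The few-examples, noisy-feedback constraint described above is the classic setting of online bandit learning. As a hedged sketch (the feedback styles, simulated user, and epsilon-greedy rule are all invented for illustration), a robot could personalize its feedback style by exploiting the best-rated option while occasionally exploring:

```python
import random

def epsilon_greedy(values, eps=0.1, rng=random):
    """Mostly exploit the best-rated option; with probability eps, explore at random."""
    if rng.random() < eps:
        return rng.randrange(len(values))
    return max(range(len(values)), key=lambda a: values[a])

def update(values, counts, arm, reward):
    """Incremental mean: values[arm] tracks average reward even with few samples."""
    counts[arm] += 1
    values[arm] += (reward - values[arm]) / counts[arm]

styles = ["praise", "neutral", "challenge"]
values, counts = [0.0] * 3, [0] * 3
rng = random.Random(0)
true_pref = [0.3, 0.5, 0.9]  # hypothetical user who responds best to "challenge"
for _ in range(200):
    a = epsilon_greedy(values, rng=rng)
    r = 1.0 if rng.random() < true_pref[a] else 0.0  # noisy, binary user response
    update(values, counts, a, r)
print(styles[max(range(3), key=lambda a: values[a])])
```

Real deployments face the further complications noted above: the user's "preferences" drift over time and the reward signal itself must be inferred from behavior rather than asked for directly.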