This project focuses on developing and evaluating human-machine interaction techniques, especially socially assistive robotics (SAR) techniques, that influence the user to engage in wellness-promoting behaviors involving physical, cognitive, and social activity. Our work focuses on two types of wellness-promoting human-machine interactions: 1) exercise sessions and 2) socializing sessions. In the former, the SAR provides exercise monitoring, coaching, and motivation; in the latter, the SAR provides social contact, friendly reminders, and encouragement. The two types of interaction share the common underlying goal of influencing behavior. We take a twofold approach to reaching that goal: 1) we use steering to learn how to affect the user's behavior in the short term during an interaction; and 2) we use motivation to affect the user's longer-term behavior over the course of an interaction and to help maintain their engagement. This work has developed and validated socially assistive techniques for influencing behavior by steering interaction dynamics and by encouraging performance. The embodiment of the SAR is leveraged to maximize engagement and compliance; therefore, the work for this project also focuses on natural, persuasive, and engaging embodied communication.
This research focuses on the development, evaluation, and user testing of a Socially Assistive Robot (SAR) exercise coach designed to motivate and engage older adults in a seated aerobic exercise task. Our approach incorporates insights from psychology research on intrinsic motivation and contributes clear design principles for SAR-based therapeutic interventions. Through our user studies, we have examined several different aspects of wellness-promoting socially assistive systems, including a comparison of an embodied versus a non-embodied (computer-based) system, the role of praise and relational discourse, and the effect of varying user autonomy.
In this project, we extended our previous work on socially assistive robot-guided "chair exercises" for older adults. The exercise scenario used a socially assistive robot to instruct, evaluate, and encourage users performing simple arm-gesture exercises. The scenario was one-on-one, allowing the robot to focus its attention on a single user in order to provide timely, accurate feedback and to maximize the effectiveness of the exercise session for the user. In the setup, the user was seated in a chair in front of the robot, with the user and robot facing each other. The probabilistic activity monitoring system we developed affords the robot the ability to track the user's arm movements; the use of the Kinect sensor (as opposed to a monocular camera) extends our previous work in that arm motion is no longer restricted to the planes at the sides of the body (i.e., it may be non-planar).

Automated Proxemic Behavior Recognition and Production
This research investigates proxemics in human-robot interaction (HRI). Proxemics is the study of the dynamic process by which people position themselves in face-to-face social encounters. This process is governed by sociocultural norms that, in effect, determine the overall sensory experience of each interacting participant. To facilitate situated and mobile HRI, this research seeks to develop functional and socially appropriate probabilistic computational models of proxemics for the purposes of both autonomous proxemic behavior recognition (of one or many people) and autonomous proxemic behavior production (by a sociable robot).

Active NAO! Combating Childhood Obesity with Robot Companions
In this project, we are developing methods for creating a cohesive character for a Socially Assistive Robot (SAR) exercise buddy, with the goal of motivating increased exercise effort. We are using a SAR in a teammate role, in a peer-level interaction with a user engaged in physical exercise. We are currently developing the SAR exercise buddy system for circuit training with overweight and obese youth ages 11-14.

Spatial Language-Based Human-Robot Interaction
This work presents a novel methodology that allows service robots to interpret and follow spatial language instructions, with and without user-specified natural language constraints and/or unvoiced pragmatic constraints. This work contributes a general computational framework for the representation of dynamic spatial relations, with both local and global properties. The methodology also contributes a probabilistic approach to inferring instruction semantics; a general approach for interpreting object pick-and-place tasks; and a novel probabilistic algorithm for the automatic extraction of contextually and semantically valid instruction sequences from unconstrained spatial language discourse, including those containing anaphoric reference expressions.
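As a rough illustration of how a dynamic spatial relation might be scored computationally, the sketch below ranks candidate object placements against a landmark for a "left of" instruction. The scoring function, its weighting, and all names here are our own illustrative assumptions, not the project's actual framework:

```python
import math

def score_left_of(candidate, landmark):
    """Score how well a candidate 2D position satisfies 'left of' a landmark.

    Combines a directional term (alignment with the landmark's -x axis)
    with a proximity term, so nearby, well-aligned placements score highest.
    """
    dx = candidate[0] - landmark[0]
    dy = candidate[1] - landmark[1]
    dist = math.hypot(dx, dy)
    if dist == 0.0:
        return 0.0  # coincident with the landmark satisfies no direction
    direction = max(0.0, -dx / dist)   # 1.0 when directly to the left
    proximity = math.exp(-dist)        # decays with distance
    return direction * proximity

def best_placement(relation_scorer, landmark, candidates):
    """Pick the candidate position that best satisfies the relation."""
    return max(candidates, key=lambda c: relation_scorer(c, landmark))

# A placement directly left of the landmark beats ones to the right or above.
choice = best_placement(score_left_of, (0.0, 0.0),
                        [(-1.0, 0.0), (1.0, 0.0), (0.0, 1.0)])
```

A full system along the lines described above would define such scorers for many relations, compose them probabilistically with the inferred instruction semantics, and normalize the scores into a distribution over candidate actions.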
This work is supported by the National Science Foundation (NSF) under the grant "Socially Assistive Human-Machine Interaction for Improved Compliance and Health Outcomes" (award number IIS-1117279).
Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.