Robotics is at the forefront of technologies with recognized potential to improve quality of human life. In response to the growing need for personalized, one-on-one care for elderly populations and for individuals with special cognitive and social needs throughout life, great strides must be made in human-robot interaction (HRI) to bring robotics into such everyday application domains. This interdisciplinary project identifies a specific set of HRI research questions in socially assistive robotics: the study of robotic systems that provide help through social rather than physical interaction. We develop a novel assistive robot control architecture based on multi-modal perception, embodied expression and communication, and on-line user modeling, and implement it in three different types of real-world socially assistive systems.
Our work involves creating online algorithms and models that enable a socially assistive robot to choose behaviors likely to elicit desirable responses from children with autism spectrum disorders (ASD). Children with ASD exhibit differences in social communication, and interventions require intensive professional time and can be expensive. Interactive computer technology can elicit social interaction from children with ASD in a consistent and affordable manner. In collaboration with clinical psychologists, our work has concentrated on developing tools that clinicians can use to engage and educate children with special needs.
As part of this grant, we have investigated the roles of embodiment, situatedness, and mobility in socially assistive robotics, and their respective impacts on interactions with children with ASD. If an autonomous robot is to interact effectively with a child, it must be aware of the child's behavior and produce timely, contingent, and socially appropriate behavior in response. If a robot is to be an effective social agent, its actions, including those as basic as maintaining interpersonal distance, must be appropriate for the given social situation. This becomes a greater challenge in playful and unstructured interactions, such as those involving children. To achieve the goals of this project, we have developed and implemented socially assistive robotics algorithms and social frameworks that result in a positive interaction for the child. We have developed new methods for human-centered signal processing and opened new possibilities for behavioral informatics. We continue to develop utilities and drivers for controlling upper-torso humanoid robots, for tracking people and robots from an overhead camera, and for social perception by a robot platform.
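The contingent perception-to-action cycle described above can be sketched in miniature. The following Python fragment is purely illustrative: the state fields, behavior names, and thresholds are assumptions for exposition, not the project's actual implementation. It shows the key ordering constraint, that a socially appropriate controller checks interpersonal distance before selecting an engagement behavior.

```python
from dataclasses import dataclass

# Hypothetical sketch of one contingent behavior-selection step.
# All names and thresholds below are illustrative assumptions.

@dataclass
class ChildState:
    engagement: float  # 0.0 (disengaged) .. 1.0 (fully engaged)
    distance_m: float  # child-robot distance in meters

PERSONAL_SPACE_M = 1.2  # assumed comfort threshold for interpersonal distance

def select_behavior(state: ChildState) -> str:
    """Map the perceived child state to a robot behavior."""
    if state.distance_m < PERSONAL_SPACE_M:
        # Respect interpersonal distance before any other goal.
        return "back_away"
    if state.engagement < 0.3:
        return "attract_attention"  # e.g., wave, play a sound
    if state.engagement < 0.7:
        return "encourage"          # e.g., verbal prompt, gesture
    return "continue_activity"      # child is engaged; sustain the play
```

For example, `select_behavior(ChildState(engagement=0.1, distance_m=2.0))` returns `"attract_attention"`, while the same engagement at 0.5 m returns `"back_away"`, since the proxemics check takes priority.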
We have developed models that help capture affect and human behavior, and have made algorithmic contributions to topics such as emotion recognition and interaction modeling. Our work on multimodal signal processing fuses relevant features from language, speech, and gestures to form better-performing learning schemes. Research in socially assistive robotics is interdisciplinary: work that explores how robots can augment current care for groups in need, including the supported research, involves close collaboration with researchers in psychology, anthropology, and clinical pediatrics. Moreover, because this work applies methods from social science and medical research, it must meet the rigorous standards of those primary disciplines, as well as of any allied fields to which the research may apply, in order to demonstrate its effectiveness.
This work is supported by the US National Science Foundation under its Human-Centered Computing (HCC) Program (IIS-0803565). Additional support for undergraduate mentoring is provided through the Research Experiences for Undergraduates (REU) program. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.