I have always been interested in the intersection of technology and human lives. While my early research was on the biomedical engineering side, my more recent work has been in robotics, specifically human-robot interaction. I have studied human engagement in human-robot interaction, but my primary interest is in the use of social robots as assistive devices.
@ Yale

As an undergraduate I worked in the Social Robotics Lab at Yale under Prof. Brian Scassellati. One project had participants playing rock-paper-scissors with Nico, an upper-torso humanoid robot designed and built in the lab, whose proportions mimic those of an 18-month-old child. This project resulted in a paper that I presented at HRI 2010 and that was nominated for a Best Paper award (see my publications page).
I have also worked with Pleo, a baby dinosaur robot made by Ugobe. I assisted Elizabeth Kim and Dan Leyzberg with a project examining prosody in autistic children engaged in an encouragement task with the robot. This work resulted in an abstract presented at the International Meeting for Autism Research; see my publications page for the citation.
@ USC (summer 2009)
I worked in the Interaction Lab at USC under Professor Maja Mataric, on a project developing robotic systems for the treatment and diagnosis of autism in children. Using Willow Garage's Robot Operating System (ROS), I developed a system for filtered tracking of the robot from an overhead camera. I then generalized the system to track objects moving unpredictably in two dimensions, which we expected to apply to tracking faces in the robot's eye cameras and to tracking human subjects from the overhead camera. See my project page for more details.
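To give a flavor of what "filtered tracking" of a target moving unpredictably in two dimensions involves, here is a minimal sketch of a constant-velocity Kalman filter in Python with NumPy. This is an illustrative example only, not the actual lab code, and the class name, parameters, and noise values are my own assumptions.

```python
import numpy as np

# Minimal constant-velocity Kalman filter for tracking a point moving
# unpredictably in 2D (e.g., a robot seen from an overhead camera).
# State vector: [x, y, vx, vy]; measurements: noisy (x, y) positions.
# All names and noise parameters here are illustrative assumptions.

class Kalman2D:
    def __init__(self, dt=1.0, process_var=1.0, meas_var=4.0):
        self.x = np.zeros(4)                  # state estimate
        self.P = np.eye(4) * 500.0            # covariance: very uncertain start
        self.F = np.array([[1, 0, dt, 0],     # constant-velocity motion model
                           [0, 1, 0, dt],
                           [0, 0, 1,  0],
                           [0, 0, 0,  1]], dtype=float)
        self.H = np.array([[1, 0, 0, 0],      # we observe position only
                           [0, 1, 0, 0]], dtype=float)
        self.Q = np.eye(4) * process_var      # process noise (simplified)
        self.R = np.eye(2) * meas_var         # measurement noise

    def step(self, z):
        # Predict forward one time step.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Update with measurement z = (x, y).
        y = np.asarray(z, dtype=float) - self.H @ self.x   # innovation
        S = self.H @ self.P @ self.H.T + self.R            # innovation covariance
        K = self.P @ self.H.T @ np.linalg.inv(S)           # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]                     # filtered position estimate

# Example: smooth a noisy diagonal trajectory.
rng = np.random.default_rng(0)
kf = Kalman2D()
for t in range(50):
    noisy = (t * 2.0 + rng.normal(0, 2), t * 1.5 + rng.normal(0, 2))
    est = kf.step(noisy)
print(np.round(est, 1))  # estimate should lie near the true final position
```

The same predict/update loop applies whether the measurements come from an overhead camera or from a face detector running on the robot's eye cameras; only the measurement source changes.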
@ Heidelberg (summer 2008)
My primary project at the German Cancer Research Center (DKFZ) was the development of a 4D visualization system for esophageal cancer surgery. I developed a tool for testing the system in an in vivo porcine model. I also worked on the Medical Imaging Interaction Toolkit (MITK), improving its graphical user interface. Finally, I assisted with an experiment using the da Vinci robotic surgical system, which resulted in a paper at the Medical Image Processing Conference. See my publications page for the citation.
@ Penn State (summer 2007)
At Penn State I worked on data processing in the Kinesiology Lab under Professor Neil Sharkey, for a project investigating the difference between bone movement and soft-tissue movement in the human foot. I also had the opportunity to observe experiments performed on cadaver feet.