CSCI 584
Control and Learning in Mobile Robots
and Multi-Robot Systems

Project Topic Recommendations
The following are the topics you should use as the basis for your project proposal. You may adapt or combine these ideas in your proposal; creativity within the realm of what is doable in the course of the semester is encouraged.
Intelligent assistive device:
Develop and integrate a weight sensor for an assistive robot with which it can infer task progress (e.g., progress in stuffing envelopes from the weight of the receptacle containing completed ones). Involves some limited hardware work, writing a Player driver, and testing on the Pioneer robot.
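As a minimal sketch of the inference step, the following estimates completed envelopes from the receptacle's measured weight. All names, units, and calibration values are hypothetical; real values would come from calibrating the sensor:

```python
def estimate_progress(current_weight_g, empty_weight_g, envelope_weight_g, total_envelopes):
    """Estimate task progress from the receptacle's measured weight (grams).

    Returns (envelopes_completed, fraction_done). Parameter names and units
    are illustrative assumptions, not part of the actual driver interface.
    """
    if envelope_weight_g <= 0 or total_envelopes <= 0:
        raise ValueError("envelope weight and total count must be positive")
    completed = round((current_weight_g - empty_weight_g) / envelope_weight_g)
    completed = max(0, min(total_envelopes, completed))  # clamp to a valid count
    return completed, completed / total_envelopes
```

In a real Player driver this logic would run on each new sensor reading, with the calibration constants supplied via the driver's configuration.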
Imitation coach:
Extend existing arm-movement evaluation software into an interactive imitation training program. Involves graphics/animation rendering of movement, use of motion capture equipment, and possibly use of the humanoid torso robot or Bandit simulation.
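One simple way to score an imitation against a reference movement is a root-mean-square error over time-aligned joint-angle trajectories. This is only an illustrative metric, not the one used by the existing evaluation software:

```python
import math

def trajectory_error(reference, imitation):
    """RMS error between two time-aligned joint-angle trajectories (radians).

    Both inputs are equal-length sequences of angles; the metric and the
    alignment assumption are illustrative choices, not from the course code.
    """
    if len(reference) != len(imitation):
        raise ValueError("trajectories must be time-aligned to the same length")
    return math.sqrt(sum((r - i) ** 2 for r, i in zip(reference, imitation)) / len(reference))
```

An interactive coach could threshold this score to decide when to demonstrate the movement again versus praise the imitator.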
Vision-based face tracking:
Adapt Intel OpenCV or other existing face-tracking software to work with the humanoid torso robot. Involves developing tracking and filtering code; we expect a Player driver as the interface for external users, with a possible demonstration on a physical robot.
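The filtering half of the task can be as simple as exponentially smoothing the detector's face-center output so the robot's gaze does not jitter. A minimal sketch, assuming detections arrive as (x, y) pixel coordinates or None on a missed frame (the alpha value is an illustrative tuning parameter):

```python
class CentroidFilter:
    """Exponential smoothing of detected face centers.

    alpha in (0, 1]: higher values trust new detections more. On a missed
    detection (None) the filter holds its last estimate.
    """

    def __init__(self, alpha=0.4):
        self.alpha = alpha
        self.state = None

    def update(self, detection):
        if detection is None:
            return self.state  # hold the last estimate on a dropped frame
        if self.state is None:
            self.state = detection  # initialize from the first detection
        else:
            x = self.alpha * detection[0] + (1 - self.alpha) * self.state[0]
            y = self.alpha * detection[1] + (1 - self.alpha) * self.state[1]
            self.state = (x, y)
        return self.state
```

In the Player driver, the raw OpenCV detections would feed `update()` and the smoothed centroid would be published to external clients.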
Sound-based people tracking:
Interface the microphone array and use sound localization to detect and track people's voices. Involves some device interfacing, writing a Player device driver, and testing.
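A standard starting point for sound localization is estimating the time-difference-of-arrival between two microphones by cross-correlation; the inter-microphone delay then constrains the source bearing. A minimal sketch of the delay estimate (signal layout and lengths are illustrative):

```python
import numpy as np

def estimate_delay(sig_a, sig_b):
    """Estimate the delay, in samples, of sig_b relative to sig_a.

    Uses full cross-correlation; a positive result means the sound reached
    microphone B later than microphone A.
    """
    corr = np.correlate(sig_b, sig_a, mode="full")
    return int(np.argmax(corr)) - (len(sig_a) - 1)
```

With the sample rate, speed of sound, and microphone spacing known, the delay converts to a bearing angle; a full array generalizes this to pairwise delays.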
Speech Recognition in HRI:
One of the more popular speech recognition packages for HRI is sphinx2-0.4. A number of important questions need to be studied experimentally and clearly tabulated. For example, compare recognition quality along the following dimensions:
  1. software parameters and controls (internal sphinx arguments, size of vocabulary, volumes, etc.),
  2. environmental circumstances (background noise from crowds, sonars, etc.),
  3. different human subjects (w/ different accents).
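A sketch of how such comparisons might be tabulated once transcripts are collected; the condition keys and exact-match scoring are purely illustrative (a real study would likely use word error rate and sphinx2's actual output):

```python
def tabulate_accuracy(results):
    """Compute exact-match accuracy per experimental condition.

    results maps a condition key, e.g. (vocab_size, noise_condition), to a
    list of (reference, hypothesis) transcript pairs. Keys are illustrative.
    """
    table = {}
    for condition, pairs in results.items():
        correct = sum(1 for ref, hyp in pairs if ref == hyp)
        table[condition] = correct / len(pairs)
    return table
```

Crossing the three dimensions above yields one condition key per cell, so the same harness covers software parameters, environments, and subjects.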
Real-time stereo vision:
Implement a stereo-vision software library for a humanoid robot and demonstrate recognition on a stereo pair for a limited task.
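The core geometric step in any such library is recovering depth from disparity on a rectified stereo pair, Z = f * B / d. A minimal sketch (parameter names and units are illustrative):

```python
def disparity_to_depth(disparity_px, focal_px, baseline_m):
    """Depth of a point from its disparity on a rectified stereo pair.

    Z = f * B / d, with focal length f in pixels, baseline B in meters,
    and disparity d in pixels. Zero disparity means a point at infinity.
    """
    if disparity_px <= 0:
        return float("inf")
    return focal_px * baseline_m / disparity_px
```

A full library would add the matching step that produces the disparity map; this formula then converts each matched pixel to a metric depth.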
Humanoid Gestural Control:
Provide an interface that permits high-level gestural control for a mobile humanoid robot.
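At its simplest, such an interface maps recognized gestures to velocity commands for the mobile base. The gesture names and speeds below are assumptions for illustration only, not part of any course API:

```python
# Hypothetical mapping from recognized gestures to (linear m/s, angular rad/s).
GESTURE_COMMANDS = {
    "wave_forward": (0.3, 0.0),
    "point_left": (0.0, 0.5),
    "point_right": (0.0, -0.5),
    "stop": (0.0, 0.0),
}

def gesture_to_velocity(gesture):
    """Look up the velocity command for a gesture; unrecognized gestures halt the robot."""
    return GESTURE_COMMANDS.get(gesture, (0.0, 0.0))
```

The interesting design questions lie in the gesture recognizer itself and in how commands are confirmed or overridden for safety.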
Mobile Greeter Robot:
Implement a mobile greeter robot that uses its spatial awareness to present context-specific information, refer to various environmental features (possibly other people sitting at predetermined desks), and direct a number of people through an environment with an a priori map. Alternatively, or in addition, you may focus on studying trade-offs in particular design variables, corralling behaviors, etc.
Facial Expression Control:
Currently Gazebo (CVS version) has a model of the Bandit humanoid, but no control of its facial features (the real robot has 5 DoF: two for the eyebrows, three for the lips). Your project would involve adding this support to the simulator in a way that is consistent with the existing joint control.
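Consistency with the existing joint control largely means treating the five facial DoF like any other joints: named targets clamped to per-joint limits. A minimal sketch (the joint names and limits below are invented placeholders, not Bandit's real specification):

```python
# Hypothetical limits (radians) for the 5 facial DoF: 2 eyebrows, 3 lip joints.
FACE_JOINT_LIMITS = {
    "eyebrow_left": (-0.5, 0.5),
    "eyebrow_right": (-0.5, 0.5),
    "lip_upper": (-0.3, 0.3),
    "lip_corner_left": (-0.4, 0.4),
    "lip_corner_right": (-0.4, 0.4),
}

def clamp_command(joint, target):
    """Clamp a commanded facial-joint position to its limits, as the
    simulator's existing joint controller does for body joints."""
    lo, hi = FACE_JOINT_LIMITS[joint]
    return max(lo, min(hi, target))
```

The real work is wiring such commands into Gazebo's model and message definitions so facial joints are addressed exactly like the torso and arm joints.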