The goal of this project is to develop a people/robot tracking system that uses multiple planar laser range finders (SICK) and tracks multiple moving targets. We experimented with two approaches to target state estimation: one based on Kalman filters and the other on particle filters.
Raw laser range readings are first preprocessed to extract foreground points using a background model (which maintains the mean and standard deviation of the background range at each laser angle). Foreground points that appear to form continuous surfaces are aggregated into "blobs", which are matched to one or more blobs from the previous laser scan. Each blob is associated with at least one object, which is tracked by a Kalman-filter-based object tracker. Matched blobs inherit the object tracker from their parent blobs to continue estimating the object's trajectory. New object trackers are instantiated for blobs that cannot be matched to any blob from the previous scans. An object tracker persists for a certain duration even when no blobs are associated with it, enabling it to keep tracking temporarily occluded objects. The tracker is implemented and runs in real time.
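The blob-to-track loop above can be sketched as follows. This is a minimal illustration, not the project's implementation: a constant-velocity Kalman filter observes blob centroids, blobs are associated to the nearest predicted track within a distance gate, and trackers survive a fixed number of missed scans to ride out occlusions. All noise magnitudes, the gate radius, and the miss limit are placeholder assumptions.

```python
import numpy as np

class KalmanTrack:
    """Constant-velocity Kalman filter over state [x, y, vx, vy].
    The observation is a blob centroid [x, y]. Noise values are
    illustrative placeholders, not the project's tuned parameters."""
    def __init__(self, xy, dt=0.1):
        self.x = np.array([xy[0], xy[1], 0.0, 0.0])
        self.P = np.eye(4)
        self.F = np.eye(4); self.F[0, 2] = self.F[1, 3] = dt
        self.H = np.eye(2, 4)          # observe position only
        self.Q = 0.01 * np.eye(4)      # process noise (assumed)
        self.R = 0.05 * np.eye(2)      # measurement noise (assumed)
        self.misses = 0                # scans since last associated blob

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, z):
        y = z - self.H @ self.x                      # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)     # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        self.misses = 0

def step(tracks, blob_centroids, gate=0.5, max_misses=10):
    """One scan: predict all tracks, associate each blob centroid to
    the nearest predicted track within `gate` metres, spawn new tracks
    for unmatched blobs, and drop tracks occluded for too long."""
    for t in tracks:
        t.predict()
        t.misses += 1
    for z in blob_centroids:
        if tracks:
            t = min(tracks, key=lambda t: np.linalg.norm(t.x[:2] - z))
            if np.linalg.norm(t.x[:2] - z) < gate:
                t.update(z)
                continue
        tracks.append(KalmanTrack(z))
    return [t for t in tracks if t.misses <= max_misses]
```

A real implementation would use a proper assignment step (e.g. gated nearest neighbour over all pairs) rather than the greedy loop shown here.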
Here is a movie showing the output of the tracker.
Raw laser range readings are preprocessed to extract foreground points using an adaptive background model. The background model consists of the mean and standard deviation of the background range reading at each laser angle. Foreground points are grouped into clusters, which correspond to possible boundaries of moving targets in the environment.
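A minimal sketch of this foreground extraction, assuming per-angle Gaussian background statistics, a threshold of k standard deviations, and an exponential adaptation rate alpha (all illustrative assumptions, not the project's values):

```python
import numpy as np

def extract_foreground(scan, bg_mean, bg_std, k=3.0):
    """Flag range readings that deviate from the background model.
    scan, bg_mean, bg_std are arrays indexed by laser beam angle.
    A reading is foreground if it is closer than the background
    by more than k standard deviations."""
    return scan < bg_mean - k * bg_std

def update_background(scan, bg_mean, bg_std, alpha=0.01):
    """Exponentially adapt the per-angle background statistics,
    using only readings currently classified as background."""
    bg = ~extract_foreground(scan, bg_mean, bg_std)
    new_mean = np.where(bg, (1 - alpha) * bg_mean + alpha * scan, bg_mean)
    var = bg_std ** 2
    new_var = np.where(bg, (1 - alpha) * var + alpha * (scan - new_mean) ** 2, var)
    return new_mean, np.sqrt(new_var)
```

Clustering the resulting foreground points then amounts to grouping consecutive foreground angles whose ranges are close enough to form a continuous surface.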
A particle filter is used to track each moving target. Each new foreground cluster is associated with the nearest foreground cluster observed at the previous time step. If there is a match, the existing particle filter is updated using the new foreground cluster as the observation. Otherwise, a new particle filter is instantiated to track the target. Filters that are not associated with any observations for an extended period of time are removed.
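The per-target filter can be sketched as a bootstrap particle filter over target position, updated with the centroid of the matched foreground cluster. The random-walk motion model, particle count, and noise values below are assumptions for illustration, not the project's parameters; cluster association and filter removal would follow the same nearest-neighbour and miss-count logic described above.

```python
import numpy as np

rng = np.random.default_rng(0)

class ParticleTrack:
    """Bootstrap particle filter over target position [x, y].
    Motion and observation noise values are illustrative assumptions."""
    def __init__(self, xy, n=500, motion_std=0.1, obs_std=0.2):
        # Initialise particles around the first cluster centroid.
        self.p = xy + motion_std * rng.standard_normal((n, 2))
        self.motion_std = motion_std
        self.obs_std = obs_std
        self.misses = 0  # scans since last associated cluster

    def predict(self):
        # Random-walk motion model: diffuse each particle.
        self.p += self.motion_std * rng.standard_normal(self.p.shape)

    def update(self, z):
        # Weight particles by Gaussian likelihood of the cluster
        # centroid z, then resample in proportion to the weights.
        d2 = np.sum((self.p - z) ** 2, axis=1)
        w = np.exp(-0.5 * d2 / self.obs_std ** 2)
        w /= w.sum()
        idx = rng.choice(len(self.p), size=len(self.p), p=w)
        self.p = self.p[idx]
        self.misses = 0

    def estimate(self):
        # Posterior mean of the particle cloud.
        return self.p.mean(axis=0)
```

Systematic resampling is usually preferred over the multinomial resampling shown here, since it has lower variance for the same particle count.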
Here is a movie showing the output of the tracker while two people are walking.
This work is supported by DARPA Grant DABT63-99-1-0015 under the Mobile Autonomous Robot Software (MARS) program, and in part by the ONR MURI Grant (with UC Berkeley, Stanford, and Caltech).