Grzegorz Cielniak, Mihajlo Miladinovic, Daniel Hammarin, Linus Göransson, Achim Lilienthal and Tom Duckett
Appearance-based Tracking of Persons with an Omnidirectional Vision Sensor
Proceedings of the IEEE Workshop on Omnidirectional Vision (OMNIVIS 2003)
Abstract
This paper addresses the problem of tracking a moving person
with a single, omnidirectional camera. An appearance-based
tracking system is described which uses a self-acquired appearance
model and a Kalman filter to estimate the position of the person.
Features corresponding to ``depth cues'' are first extracted from
the panoramic images, then an artificial neural network is trained
to estimate the distance of
the person from the camera.
The estimates
are combined using a discrete Kalman filter to track the position
of the person over time. The ground truth information required for
training the neural network and the experimental analysis was
obtained from another vision system, which uses multiple webcams
and triangulation to calculate the true position of the person.
Experimental results show that the tracking system is accurate and
reliable, and that its performance can be further improved by
learning multiple, person-specific appearance models.
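
To illustrate the distance-estimation step described in the abstract, here is a minimal sketch (not the authors' code) that trains a small feed-forward network to map image features to the person's distance from the camera. The particular features used below (blob height, width and area in the panoramic image) and all numeric values are assumptions for illustration only; the paper's actual depth cues may differ.

    # Sketch only: small feed-forward network regressing distance from "depth cue" features.
    # Feature choice (blob height, width, area) and all values are illustrative assumptions.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    # X: one row per frame, columns = extracted depth cues (hypothetical features)
    # y: ground-truth distance in metres, e.g. from an external multi-camera system
    X = np.array([[120, 40, 3100],
                  [ 80, 28, 1500],
                  [ 45, 16,  520]], dtype=float)
    y = np.array([1.0, 2.0, 4.0])   # toy ground-truth distances (m)

    net = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
    net.fit(X, y)

    # Estimated distance for a new feature vector
    print(net.predict([[100.0, 33.0, 2300.0]]))
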
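The tracking step can likewise be sketched with a discrete Kalman filter. The version below assumes a constant-velocity motion model and a position measurement formed from the person's bearing (read directly from the omnidirectional image) and the estimated distance; the frame rate and noise parameters are illustrative assumptions, not values from the paper.

    # Sketch only: discrete Kalman filter with a constant-velocity model for person tracking.
    # Frame interval, noise covariances and initial state are illustrative assumptions.
    import numpy as np

    dt = 0.1                      # assumed frame interval (s)
    F = np.array([[1, 0, dt, 0],  # state transition for state [x, y, vx, vy]
                  [0, 1, 0, dt],
                  [0, 0, 1,  0],
                  [0, 0, 0,  1]], dtype=float)
    H = np.array([[1, 0, 0, 0],   # only position is measured, not velocity
                  [0, 1, 0, 0]], dtype=float)
    Q = 0.05 * np.eye(4)          # process noise (illustrative)
    R = 0.20 * np.eye(2)          # measurement noise (illustrative)

    x = np.zeros(4)               # initial state
    P = np.eye(4)                 # initial covariance

    def kf_step(x, P, z):
        """One predict/update cycle; z = [x_meas, y_meas]."""
        # Predict
        x = F @ x
        P = F @ P @ F.T + Q
        # Update
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (z - H @ x)
        P = (np.eye(4) - K @ H) @ P
        return x, P

    # Convert (bearing, distance) from the vision system into an (x, y) measurement
    bearing, distance = np.deg2rad(30.0), 2.5
    z = np.array([distance * np.cos(bearing), distance * np.sin(bearing)])
    x, P = kf_step(x, P, z)
    print(x[:2])                  # filtered position estimate
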
Download
Paper: [pdf]
Bibtex
@INPROCEEDINGS{Cielniak:2003b,
AUTHOR = "Grzegorz Cielniak and Mihajlo Miladinovic and Daniel Hammarin and Linus Göransson and Achim Lilienthal and Tom Duckett",
TITLE = "Appearance-based Tracking of Persons with an Omnidirectional Vision Sensor",
BOOKTITLE = "Proceedings of the IEEE Workshop on Omnidirectional Vision (OMNIVIS 2003)",
YEAR = "2003",
ADDRESS = "Madison, USA",
DATE = "June, 21",
}