Media type:
E-Article;
Text
Title:
Confidence-aware pedestrian tracking using a stereo camera
Contributor:
Nguyen, U.
[Author];
Rottensteiner, F.
[Author];
Heipke, C.
[Author];
Vosselman, G.
[Author];
Oude Elberink, S.J.
[Author];
Yang, M.Y.
[Author]
Imprint:
Göttingen : Copernicus GmbH, 2019
Published in: ISPRS Geospatial Week 2019 : 10-14 June 2019, Enschede, The Netherlands ; ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences ; IV-2/W5
Footnote:
This data source also contains holdings records that do not lead to a full text.
Description:
Pedestrian tracking is a significant problem in autonomous driving. Most studies carry out tracking in the image domain, which is insufficient for many realistic applications such as path planning, collision avoidance, and autonomous navigation. In this study, we address pedestrian tracking using stereo images and tracking-by-detection. Our framework consists of three primary phases: (1) people are detected in image space by the Mask R-CNN detector, and their positions in 3D space are computed using stereo information; (2) corresponding detections are assigned to each other across consecutive frames based on visual characteristics and 3D geometry; and (3) the current positions of pedestrians are corrected from their previous states using an extended Kalman filter. We use our tracking-to-confirm-detection method, in which detections are treated differently depending on their confidence metrics, to obtain a high recall value while keeping the number of false positives low. While existing methods assume that all target trajectories have equal accuracy, we estimate a confidence value for each trajectory at every epoch. Thus, depending on their confidence values, the targets can contribute differently to the whole tracking system. The performance of our approach is evaluated on the KITTI benchmark dataset and shows promising results comparable to those of other state-of-the-art methods.
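The confidence-aware filtering idea in the abstract can be illustrated with a minimal sketch (not the authors' code, which is not reproduced in this record): a one-dimensional constant-velocity Kalman filter in which the measurement noise is scaled by the detector's confidence, so that low-confidence detections pull the tracked state less strongly. The confidence-to-noise scaling `r_base / conf` is a hypothetical choice for illustration, not the scaling used in the paper.

```python
def kf_step(x, v, P, z, conf, dt=0.1, q=0.01, r_base=0.05):
    """One predict+update cycle for a 1-D constant-velocity model.

    x, v   : current state (position, velocity)
    P      : 2x2 state covariance as [[p00, p01], [p10, p11]]
    z      : measured position (e.g. from stereo triangulation)
    conf   : detection confidence in (0, 1]; lower conf -> larger noise
    """
    # --- predict (constant-velocity motion model) ---
    x_pred = x + v * dt
    v_pred = v
    p00 = P[0][0] + dt * (P[0][1] + P[1][0]) + dt * dt * P[1][1] + q
    p01 = P[0][1] + dt * P[1][1]
    p10 = P[1][0] + dt * P[1][1]
    p11 = P[1][1] + q

    # --- update with confidence-scaled measurement noise ---
    r = r_base / conf          # assumed scaling: low confidence inflates noise
    s = p00 + r                # innovation covariance (H = [1, 0])
    k0, k1 = p00 / s, p10 / s  # Kalman gain
    y = z - x_pred             # innovation (measurement residual)
    x_new = x_pred + k0 * y
    v_new = v_pred + k1 * y
    # covariance update: P' = (I - K H) P
    P_new = [[(1 - k0) * p00, (1 - k0) * p01],
             [p10 - k1 * p00, p11 - k1 * p01]]
    return x_new, v_new, P_new
```

With identical measurements, a high-confidence detection moves the estimate closer to the measurement than a low-confidence one, which is the qualitative behavior the abstract describes for low- versus high-confidence targets.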