

Poster

MVTrajecter: Multi-View Pedestrian Tracking with Trajectory Motion Cost and Trajectory Appearance Cost

Taiga Yamane · Ryo Masumura · Satoshi Suzuki · Shota Orihashi


Abstract:

Multi-View Pedestrian Tracking (MVPT) aims to track pedestrians in the form of a bird's eye view occupancy map from multi-view videos. End-to-end methods that detect and associate pedestrians within one model have shown great progress in MVPT. The motion and appearance information of pedestrians is important for the association, but previous end-to-end MVPT methods rely only on the current timestamp and the single adjacent past timestamp, discarding the past trajectories before that. This paper proposes a novel end-to-end MVPT method called Multi-View Trajectory Tracker (MVTrajecter) that utilizes information from multiple timestamps in past trajectories for robust association. MVTrajecter introduces trajectory motion cost and trajectory appearance cost to effectively incorporate motion and appearance information, respectively. These costs quantify how likely pedestrians at the current timestamp and at each past timestamp are to be identical, based on the information between those timestamps. Even if a current pedestrian could be mistakenly matched to a wrong pedestrian at some past timestamp, these costs enable the model to associate that current pedestrian with the correct past trajectory based on the other past timestamps. In addition, MVTrajecter effectively captures the relationships between multiple timestamps by leveraging the attention mechanism. Extensive experiments demonstrate the effectiveness of each component in MVTrajecter and show that it outperforms the previous state-of-the-art methods.
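The sketch below is a minimal, hypothetical illustration (not the authors' implementation) of the core idea described above: accumulating a motion cost and an appearance cost between current detections and every past timestamp of each trajectory, so that a single misleading timestamp does not dominate the association. All function names, array shapes, the plain Euclidean and cosine costs, and the Hungarian matching step are illustrative assumptions; MVTrajecter itself learns these costs within an attention-based end-to-end model.

```python
# Hypothetical sketch of trajectory-wise cost aggregation for association.
# Assumptions: Euclidean distance for the motion cost, 1 - cosine similarity
# for the appearance cost, a simple mean over past timestamps, and Hungarian
# matching. None of this is taken from the paper's actual formulation.
import numpy as np
from scipy.optimize import linear_sum_assignment


def trajectory_association(
    cur_pos,        # (N, 2)    current detections on the occupancy map
    cur_app,        # (N, D)    current appearance embeddings
    traj_pos,       # (M, T, 2) past positions of M trajectories over T timestamps
    traj_app,       # (M, T, D) past appearance embeddings
    motion_weight=1.0,
    appearance_weight=1.0,
):
    """Associate current detections with past trajectories by summing
    motion and appearance costs over every past timestamp."""
    # Motion cost: distance from each current detection to each trajectory
    # at every past timestamp, averaged over timestamps.
    diff = cur_pos[:, None, None, :] - traj_pos[None, :, :, :]   # (N, M, T, 2)
    motion_cost = np.linalg.norm(diff, axis=-1).mean(axis=-1)    # (N, M)

    # Appearance cost: 1 - cosine similarity, again averaged over timestamps.
    cur_n = cur_app / np.linalg.norm(cur_app, axis=-1, keepdims=True)
    traj_n = traj_app / np.linalg.norm(traj_app, axis=-1, keepdims=True)
    cos = np.einsum("nd,mtd->nmt", cur_n, traj_n)                # (N, M, T)
    appearance_cost = (1.0 - cos).mean(axis=-1)                  # (N, M)

    cost = motion_weight * motion_cost + appearance_weight * appearance_cost
    rows, cols = linear_sum_assignment(cost)                     # Hungarian matching
    return list(zip(rows.tolist(), cols.tolist()))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    matches = trajectory_association(
        cur_pos=rng.normal(size=(3, 2)),
        cur_app=rng.normal(size=(3, 16)),
        traj_pos=rng.normal(size=(3, 5, 2)),
        traj_app=rng.normal(size=(3, 5, 16)),
    )
    print(matches)  # e.g. [(0, 0), (1, 1), (2, 2)]
```

Because the cost for each detection-trajectory pair is averaged over all T past timestamps, an outlier at one timestamp is diluted by the others, which is the intuition behind using full past trajectories rather than only the single adjacent timestamp.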
