Integrating tracking with fine object segmentation



Brief description

We present a novel method for on-line, joint object tracking and segmentation in a monocular video captured by a possibly moving camera. Our goal is to integrate tracking and fine segmentation of a single, previously unseen, potentially non-rigid object of unconstrained appearance, given its segmentation in the first frame of an image sequence as the only prior information. To this end, we tightly couple an existing kernel-based object tracking method with Random Walker-based image segmentation. Bayesian inference mediates between tracking and segmentation, enabling effective fusion of pixel-wise spatial and color visual cues. The fine segmentation of the object in a given frame provides tracking with a reliable initialization for the next frame, closing the loop between the two building blocks of the proposed framework. The effectiveness of the proposed methodology is evaluated experimentally by comparing it to a large collection of state-of-the-art tracking and video-based object segmentation methods, on a dataset of several challenging image sequences for which ground truth is available.
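To make the closed loop concrete, the sketch below mimics the per-frame flow using substituted, off-the-shelf components: OpenCV's mean-shift tracker stands in for the kernel-based tracker, scikit-image's random_walker for the Random Walker segmentation, and a single back-projected color histogram approximates the Bayesian fusion of spatial and color cues. It is not the authors' implementation; the function name track_and_segment and the seeding scheme are illustrative assumptions.

```python
# Minimal sketch of the closed tracking/segmentation loop, assuming OpenCV's
# mean-shift as a stand-in for the kernel-based tracker and scikit-image's
# random_walker for the Random Walker segmentation; the paper's Bayesian cue
# fusion is approximated by a single back-projected color cue.
import cv2
import numpy as np
from skimage.segmentation import random_walker


def track_and_segment(frames, init_mask):
    """Yield a fine object mask per frame.

    frames    : iterable of BGR images, all of the same size
    init_mask : boolean mask of the object in the first frame (the only prior)
    """
    mask = init_mask.astype(bool)
    x, y, w, h = cv2.boundingRect(mask.astype(np.uint8))
    term = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1.0)

    for frame in frames:
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)

        # Color model built from the previous frame's fine segmentation.
        hist = cv2.calcHist([hsv], [0], mask.astype(np.uint8) * 255, [32], [0, 180])
        cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)
        backproj = cv2.calcBackProject([hsv], [0], hist, [0, 180], 1)

        # Tracking step: mean-shift relocates the object window in the new frame.
        _, (x, y, w, h) = cv2.meanShift(backproj, (x, y, w, h), term)

        # Seeds for the Random Walker: the window core as object seeds, the
        # image border as background (assumes the object never touches it).
        labels = np.zeros(frame.shape[:2], dtype=np.int32)
        labels[y + h // 4:y + 3 * h // 4, x + w // 4:x + 3 * w // 4] = 1
        labels[0, :] = labels[-1, :] = labels[:, 0] = labels[:, -1] = 2

        # Fine segmentation on the color cue; label 1 is the object region.
        mask = random_walker(backproj.astype(float), labels, beta=90) == 1

        # Close the loop: the fine mask re-initializes tracking for the next frame.
        x, y, w, h = cv2.boundingRect(mask.astype(np.uint8))
        yield mask
```

A usage pass would look like masks = list(track_and_segment(frames, first_mask)), with frames coming from any video reader. Swapping in the paper's kernel-based tracker and its Bayesian fusion of spatial and color priors recovers the structure described above.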

The outline of the method


The outline of the method with sample intermediate results



Sample results

A video with tracking experiments.


Contributors

  • K. Papoutsakis
  • A.A. Argyros

Relevant publications

  • K. Papoutsakis, A.A. Argyros, "Integrating tracking with fine object segmentation", Image and Vision Computing, Volume 31, Issue 10, pp. 771-785, Oct. 2013.
  • K. Papoutsakis, A.A. Argyros, "Object tracking and segmentation in a closed loop", in Proceedings of the International Symposium on Visual Computing, ISVC'2010, Advances in Visual Computing, Lecture Notes in Computer Science, Volume 6453, pp. 405-416, Las Vegas, USA, Nov 29-Dec 1, 2010.

The electronic versions of the above publications can be downloaded from my publications page.