Brief description

We present a computer vision system that supports non-instrumented, location-based interaction of multiple users with digital representations of large-scale artifacts. The proposed system is based on a camera network that observes multiple humans in front of a very large display. The acquired views are used to volumetrically reconstruct and track the humans robustly and in real time, even in crowded scenes and challenging human configurations. Given accurate monitoring of humans in space and time, a dynamic and personalized textual/graphical annotation of the display can be achieved, based on the location and the walk-through trajectory of each visitor. The proposed system has been successfully deployed in an archaeological museum, offering visitors the capability to interact with and explore a digital representation of an ancient wall painting. This installation has permitted an extensive evaluation of the proposed system in terms of tracking robustness, computational performance and usability. Furthermore, it demonstrates that computer vision technology can effectively support non-instrumented interaction of humans with their environments in realistic settings.
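The core idea behind the volumetric reconstruction step can be illustrated with a visual-hull test: a 3D point is considered occupied only if it projects inside the foreground silhouette of every camera view. The sketch below is a minimal, illustrative version of this principle; the function name, the toy affine camera matrices, and the synthetic silhouettes are assumptions for demonstration, not the actual (GPU-accelerated) implementation described in the publications below.

```python
import numpy as np

def visual_hull(masks, projections, points):
    """Mark a 3D point as occupied if it projects into the
    foreground silhouette (True pixels) of every camera view.

    masks       -- list of boolean HxW foreground masks, one per camera
    projections -- list of 3x4 camera projection matrices
    points      -- Nx3 array of 3D points to test
    """
    occupied = np.ones(len(points), dtype=bool)
    homog = np.hstack([points, np.ones((len(points), 1))])  # homogeneous coords
    for mask, P in zip(masks, projections):
        uvw = homog @ P.T                                    # project to image plane
        uv = (uvw[:, :2] / uvw[:, 2:3]).round().astype(int)  # pixel coordinates
        h, w = mask.shape
        inside = (uv[:, 0] >= 0) & (uv[:, 0] < w) & \
                 (uv[:, 1] >= 0) & (uv[:, 1] < h)
        fg = np.zeros(len(points), dtype=bool)
        fg[inside] = mask[uv[inside, 1], uv[inside, 0]]      # silhouette lookup
        occupied &= fg                                       # AND across all views
    return occupied

# Toy setup: two affine cameras looking along the z and x axes,
# each seeing a 5x5-pixel foreground square centered in an 11x11 image.
mask = np.zeros((11, 11), dtype=bool)
mask[3:8, 3:8] = True
P_front = np.array([[1., 0, 0, 5], [0, 1, 0, 5], [0, 0, 0, 1]])  # (x, y) view
P_side  = np.array([[0., 0, 1, 5], [0, 1, 0, 5], [0, 0, 0, 1]])  # (z, y) view

result = visual_hull([mask, mask], [P_front, P_side],
                     np.array([[0., 0, 0], [4., 0, 0]]))
print(result)  # the origin lies inside both silhouettes; (4, 0, 0) does not
```

Running this test over a dense voxel grid, and connecting the occupied voxels into per-person blobs over time, is the essence of the reconstruction-and-tracking pipeline sketched in the abstract above.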

Sample results

Interactive exhibition "Macedonia: from fragments to pixels": Visit the web site of the exhibition Macedonia: from fragments to pixels, which features several interactive exhibits developed in the context of the Ambient Intelligence Programme of FORTH-ICS, including Macrographia, an interactive exhibit based on this work.


Contributors

  • Xenophon Zabulis, Dimitris Grammenos, Thomas Sarmis, Konstantinos Tzevanidis, Pashalis Padeleris, Panagiotis Koutlemanis, Antonis A. Argyros, Constantine Stephanidis.
  • This work has been supported by the FORTH-ICS internal RTD Programme “Ambient Intelligence and Smart Environments” and the IST-FP7-IP-215821 project GRASP.

Relevant publications

  • X. Zabulis, D. Grammenos, T. Sarmis, K. Tzevanidis, P. Padeleris, P. Koutlemanis, A.A. Argyros, “Multicamera human detection and tracking supporting natural interaction with large scale displays”, in Machine Vision and Applications journal, published online Feb. 2012.
  • X. Zabulis, T. Sarmis, K. Tzevanidis, P. Koutlemanis, D. Grammenos, A.A. Argyros, “A platform for monitoring aspects of human presence in real-time”, in Proceedings of the International Symposium on Visual Computing, ISVC’2010, Advances in Visual Computing, Lecture Notes in Computer Science, vol. 6454, pp. 584-595, Las Vegas, USA, Nov. 29-Dec. 1, 2010.
  • X. Zabulis, D. Grammenos, T. Sarmis, K. Tzevanidis, A.A. Argyros, “Exploration of large-scale museum artifacts through non-instrumented, location-based, multi-user interaction” in Proceedings of the 11th VAST International Symposium on Virtual Reality, Archaeology and Cultural Heritage, VAST’2010, Palais du Louvre, Paris, France, Sep. 21-24, 2010.
  • K. Tzevanidis, X. Zabulis, T. Sarmis, P. Koutlemanis, N. Kyriazis, A.A. Argyros, “From multiple views to textured 3D meshes: a GPU-powered approach”, in Proceedings of the Computer Vision on GPUs Workshop, CVGPU’2010, In conjunction with ECCV’2010, Heraklion, Crete, Greece, Sep. 10, 2010.
  • X. Zabulis, T. Sarmis, D. Grammenos, A.A. Argyros, “A multicamera vision system supporting the development of wide-area exertainment applications”, in Proceedings of the IAPR Conference on Machine Vision and Applications (MVA’09), pp. 269-272, Hiyoshi Campus, Keio University, Japan, May 20-22, 2009.

The electronic versions of the above publications can be downloaded from my publications page.
