Seminar given by Csaba Benedek on January 23, 2014 (2pm-3pm, Byron Beige)
Abstract: In this talk, we introduce a system for generating 4D video flows of large dynamic scenes by integrating two different types of data: outdoor 4D point cloud sequences measured by a rotating LIDAR sensor, and 4D models of moving actors obtained in a 4D studio. The aim of the solution is to create spatio-temporal virtual city models containing walking pedestrians, building facades, moving and parked vehicles, and other street objects. The LIDAR monitors the scene from the top of a moving vehicle or from a fixed position, and provides a dynamic point cloud. The data processing modules include foreground modeling, object detection, multiple-person tracking, and online re-identification. The system then geometrically reconstructs the ground, walls, and further objects of the background scene, and textures the obtained models with photos taken of the scene. Finally, we insert into the scene textured 4D models of moving pedestrians, which were created beforehand in a special 4D reconstruction studio. The output is a virtual scene with avatars that follow the trajectories of the real pedestrians in real time. The obtained model may offer a significantly improved visual experience for the observer compared to conventional video streams, since a reconstructed 4D scene can be viewed and analyzed from an arbitrary viewpoint and virtually modified by the user. Further demos of the system can be found at the following URL: http://web.eee.sztaki.hu/i4d/.