Paper published in IEEE Transactions on PAMI

The paper Variational Bayesian Inference for Audio-Visual Tracking of Multiple Speakers has been published in the IEEE Transactions on Pattern Analysis and Machine Intelligence (a journal with one of the highest impact factors in the computational intelligence category). This work is part of the Ph.D. thesis of Yutong Ban, now with…

H2020 Project SPRING awarded!

The Perception team is happy to announce that a new project has been awarded by the European Union under the H2020-ICT program. The main objective of SPRING (Socially Pertinent Robots in Gerontological Healthcare) is the development of socially assistive robots capable of multimodal, multiple-person interaction and open-domain dialogue…

Sparse representation, dictionary learning, and deep neural networks: their connections and new algorithms

Seminar by Mostafa Sadeghi, Sharif University of Technology, Tehran. Tuesday 19 June 2018, 14:30 – 15:30, room F107, INRIA Montbonnot Saint-Martin. Abstract: Over the last decade, sparse representation, dictionary learning, and deep artificial neural networks have dramatically impacted the signal processing and machine learning areas by yielding state-of-the-art results…

Deep Regression Models and Computer Vision Applications for Multiperson Human-Robot Interaction

PhD defense by Stéphane Lathuilière. Tuesday 22nd May 2018, 11:00, Grand Amphithéâtre, INRIA Grenoble Rhône-Alpes, Montbonnot Saint-Martin. Abstract: In order to interact with humans, robots need to perform basic perception tasks such as face detection, human pose estimation or speech recognition. However, in order to have a natural interaction with humans,…

Plane Extraction from Depth Data

The following journal paper has just been published: Richard Marriott, Alexander Pashevich, and Radu Horaud. Plane Extraction from Depth Data Using a Gaussian Mixture Regression Model. Pattern Recognition Letters, vol. 110, pages 44-50, 2018. The paper is available for free download from our publication page or directly from Elsevier.
