Projects

Current projects (2020-):

  • H2020 project SPRING (2020-)

Completed projects:

  • Marie-Curie Research Training Network VISIONTRAIN (2005-2009)
  • ICT-FP6 project POP (2006-2009)
  • ICT-FP7 project HUMAVIPS (2010-2013)
  • Reconstruction with depth and color cameras for 3D stereoscopic consumer displays. Project funded by Samsung Electronics (2010-2013).
  • ANR-BLANC MIXCAM project (2014-2016)
  • ICT-FP7 project EARS (2014-2017)
  • Multi-modal speaker localization and tracking. Project funded by Samsung Electronics (2016-2017)
  • ERC Advanced Grant VHIA (2014-2019)
  • ERC Proof of Concept VHIALab (2018-2019)

Academic partners:

  • The Czech Technical University in Prague (2005-2013 and 2020-2023)
  • Heriot-Watt University, Edinburgh, UK (2020-2023)
  • Hôpital Broca, Paris, France (2020-2023)
  • University of Sheffield (2006-2009)
  • University of Coimbra (2006-2009)
  • Bielefeld University (2010-2013)
  • IDIAP, Martigny (2010-2013)
  • The Technion (2005-2015)
  • Queen Mary University London (2012-2016)
  • Bar-Ilan University (2013-)
  • University of Trento (2014-)
  • University of Cordoba (2014-)
  • University of Patras (2013-2016)
  • Ben-Gurion University (2014-2016)
  • Imperial College (2014-)
  • Erlangen-Nuremberg University (2014-2016)
  • Humboldt University (2014-2016)
  • Polytechnic University of Bucharest (2014)

Industrial partners:

  • ERM Automatismes, Carpentras, France (2020-2023)
  • PAL Robotics, Barcelona, Spain (2020-2023)
  • Samsung Digital Media and Communications R&D Center, Seoul, Korea (2016-2017)
  • Samsung Advanced Institute of Technology, Seoul, Korea (2010-2013)
  • 4D View Solutions, Grenoble, France (2007-)
  • Aldebaran Robotics, Paris, France (2010-2013)
  • SoftBank Robotics Europe (2014-2016)

HUMAVIPS

Humanoids with Auditory and Visual Abilities In Populated Spaces. HUMAVIPS was a three-year European project (1 February 2010 – 31 January 2013). Project website: http://humavips.inrialpes.fr/. Humanoids expected to collaborate with people should be able to interact with them in the most natural way. This involves significant perceptual and interactive skills, operating in a coordinated fashion. Consider …

VISIONTRAIN

Scientific coordinator and contact: Radu Horaud, INRIA Grenoble Rhône-Alpes, France. The VISIONTRAIN proposal (19 November 2003) | The VISIONTRAIN technical annex (16 December 2004) | EU funding: 3.46M€. VISIONTRAIN (Computational and Cognitive Vision Systems) was a European Marie-Curie Research Training Network (MRTN-CT-2004-005439) granted for the period from 1 May 2005 to 30 April 2009. Summary: The VISIONTRAIN research training …

EARS

Embodied Audition for Robots (EARS). EARS explores new algorithms for enhancing the auditory capabilities of humanoid robots. A main focus is to develop the fundamentals for a natural spoken dialogue between humans and robots in adverse acoustic environments. EARS publications by the PERCEPTION team. The European FP7 STREP project EARS started on 1 January 2014 …

ERC VHIA

Vision and Hearing In Action. ERC Advanced Grant #340113. VHIA studies the fundamentals of audio-visual perception for human-robot interaction. VHIA’s list of publications | Research | Recently Submitted Papers | News. Principal Investigator: Radu Horaud | Duration: 1/2/2014 – 31/1/2019 (five years) | ERC funding: 2,497,000€. Research pages (please click here for a complete list of our research …

H2020 SPRING

Socially Pertinent Robots in Gerontological Healthcare. SPRING is an EU H2020-ICT research and innovation action (RIA) whose main objective is the development of socially assistive robots with the capacity of performing multimodal multiple-person interaction and open-domain dialogue. SPRING explores new methods at the crossroads of machine learning, computer vision, audio signal processing, spoken dialogue and robotics …

MIXCAM

Real-Time Visual Reconstruction by Mixing Multiple Depth and Color Cameras. Summary | Datasets | People | Publications | Videos | MIXCAM Laboratory | MIXCAM Software. The MIXCAM project was funded by France’s Agence Nationale de la Recherche (ANR), programme BLANC. The project started on February 1, 2014 …

PoC VHIALab

Vision and Hearing In Action Laboratory. ERC Proof of Concept #767064. VHIALab develops audio-visual machine perception software for human-robot interaction. Principal Investigator: Radu Horaud | Duration: 1/2/2018 – 31/1/2019 (12 months) | ERC funding: 150,000€. Summary: The objective of VHIALab is the development and commercialization of software packages enabling a robot companion to easily and naturally interact …

POP

EU STREP: Perception on Purpose. Scientific coordinator and contact: Radu Horaud, INRIA Grenoble Rhône-Alpes, France. Read a short report on POP in ICT-results: Robotic perception, on purpose. The POP proposal (21 March 2005) | The POP technical annex (30 September 2005) | The POP final report (20 March 2009) | EU funding: 2.6M€. Perception on Purpose (POP) was a …

Popeye

The POPEYE robotic head was originally developed by the Robotics group of the University of Coimbra in the framework of the European project POP (Perception on Purpose, FP6-IST-027268, January 2006 – December 2008). Its main components are a four-degrees-of-freedom motor system, a stereoscopic camera pair, and an acoustic dummy head with two microphones plugged into …

Popeye+

The Popeye+ audiovisual head is a follow-up to Popeye. Popeye+ is built around an acoustic dummy head mounted on a tripod and supporting binaural hearing (two microphones plugged into the ears) and binocular vision (a stereoscopic camera pair with a wide field of view). Popeye+ has the following hardware specifications: The acoustic dummy head is …
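
A binaural setup like this supports classical sound-source localization from the interaural time difference (ITD): a source off to one side reaches the nearer ear first. As a minimal illustration only (not the platform's actual software; the function and signal names are hypothetical), the ITD can be estimated from the peak of the cross-correlation between the two microphone signals:

```python
import numpy as np

def estimate_itd(left, right, fs):
    """Estimate the interaural time difference (in seconds): the number of
    samples by which the right signal lags the left, found at the peak of
    their cross-correlation, divided by the sampling rate."""
    corr = np.correlate(right, left, mode="full")
    lag = np.argmax(corr) - (len(left) - 1)  # samples by which right lags left
    return lag / fs

# Synthetic check: the same pulse arrives 5 samples later at the right ear,
# as it would for a source located on the left side of the head.
fs = 16000
pulse = np.zeros(256)
pulse[100] = 1.0
left = pulse
right = np.roll(pulse, 5)
itd = estimate_itd(left, right, fs)
```

A positive ITD here means the right microphone receives the signal later, i.e. the source lies on the left side of the head; real binaural systems refine this with sub-sample interpolation and generalized cross-correlation weighting.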

The MIXCAM Laboratory

BEWARE: obsolete platform. The MIXCAM laboratory is a multiple-camera, multiple-PC hardware/software platform that combines high-resolution color (RGB) cameras with low-resolution time-of-flight (TOF) cameras. The cameras are arranged in “units”, where each unit is composed of two RGB cameras and one TOF camera (left image). Currently the system is composed of four such units (right …
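
Fusing the two sensor types requires mapping each low-resolution TOF depth pixel into the high-resolution RGB images. As a rough sketch of the standard geometry (assuming calibrated pinhole intrinsics K_tof and K_rgb and a known rigid transform (R, t) between the cameras; all names are hypothetical and this is not the platform's actual software):

```python
import numpy as np

def tof_depth_to_rgb_pixels(depth, K_tof, K_rgb, R, t):
    """Back-project a TOF depth map to 3D points, move them into the RGB
    camera frame with the rigid transform (R, t), and project them onto
    the RGB image plane. Returns (u, v) pixel coordinates and the depth
    of each point in the RGB frame."""
    h, w = depth.shape
    us, vs = np.meshgrid(np.arange(w), np.arange(h))
    z = depth.ravel()
    # Back-project each TOF pixel to 3D camera coordinates.
    x = (us.ravel() - K_tof[0, 2]) * z / K_tof[0, 0]
    y = (vs.ravel() - K_tof[1, 2]) * z / K_tof[1, 1]
    pts = np.stack([x, y, z], axis=0)            # 3 x N points, TOF frame
    # Rigid transform into the RGB camera frame.
    pts_rgb = R @ pts + t.reshape(3, 1)
    # Perspective projection with the RGB intrinsics.
    u = K_rgb[0, 0] * pts_rgb[0] / pts_rgb[2] + K_rgb[0, 2]
    v = K_rgb[1, 1] * pts_rgb[1] / pts_rgb[2] + K_rgb[1, 2]
    return u, v, pts_rgb[2]

# Toy check: identical cameras and identity extrinsics map pixels onto themselves.
K = np.array([[100.0, 0.0, 1.0], [0.0, 100.0, 1.0], [0.0, 0.0, 1.0]])
u, v, z = tof_depth_to_rgb_pixels(np.ones((2, 2)), K, K, np.eye(3), np.zeros(3))
```

In practice the projected sparse depth samples are then interpolated (or jointly upsampled with the RGB data) to obtain a dense high-resolution depth map, which is the kind of fusion such a platform is built for.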