Antoine Deleforge, Multispeech team, Inria Nancy Grand-Est
Wednesday, 15 June 2022, 15:30, room F107, Inria Montbonnot Saint-Martin
Attend online: https://inria.webex.com/inria/j.php?MTID=m30df5cc25af1cc7f052683154f4f7638
Abstract: Close your eyes, clap your hands. Can you hear the shape of the room? Is there carpet on the floor? Answering these peculiar questions may have applications in acoustic diagnosis, audio augmented reality and hearing aids. In this talk, we will see how machine learning, physics and signal processing can be jointly leveraged to tackle these difficult inverse problems. In particular, we will introduce the unifying methodological framework of “virtual acoustic space learning”, review some of its recent promising results and discuss some of its current bottlenecks.