Acoustic Tracking of Moving Speakers, by Christine Evers

Monday, March 23, 2015, 3:00pm to 4:00pm, room C207, INRIA Montbonnot

Seminar by Christine Evers, Imperial College London


Localisation and bearing-only acoustic tracking of moving speakers for robot audition


Abstract. Robot audition for Human-Robot Interaction (HRI) is a complex and largely unsolved problem. Audition is particularly useful for HRI applications in situations where visual sensors suffer from a limited field of view or object occlusions, and audio-visual fusion therefore combines the advantages of both vision and audition. Robust audio processing systems are required for this purpose. Spherical microphone arrays integrated in a robot head provide a natural placement of acoustic sensors for interaction with humans; the first part of this talk therefore introduces an approach to speaker localisation using pseudo-spherical microphone arrays. The second part of the talk focuses on acoustic tracking of speakers. Acoustic speaker tracking in enclosed spaces is subject to missed detections and spurious clutter measurements caused by speech inactivity, reverberation and interference. Furthermore, many acoustic localisation approaches estimate directions of arrival, providing bearing-only measurements without range information. An approach to bearing-only acoustic tracking in the presence of clutter and missed detections is presented, based on the probability hypothesis density (PHD) filter.
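
As a rough illustration of the measurement conditions the abstract describes (not code from the talk), the Python sketch below simulates bearing-only observations of a moving speaker: directions of arrival without range, a detection probability below one to model speech pauses, and Poisson-distributed clutter bearings to model reverberation and interference. All parameter values and the straight-line trajectory are illustrative assumptions.

# Minimal sketch, under assumed parameters, of the bearing-only measurement
# model a PHD-filter tracker must handle: per frame, an unordered set of
# bearings with possible missed detections and spurious clutter.
import numpy as np

rng = np.random.default_rng(0)

p_detect = 0.9                # probability the speaker is detected (assumed)
clutter_rate = 2.0            # mean spurious bearings per frame (assumed)
noise_std = np.deg2rad(5.0)   # bearing noise in radians (assumed)

# Speaker moves on a straight line past a microphone array at the origin.
dt = 0.5
positions = np.array([[-3.0 + 0.4 * k * dt, 2.0] for k in range(40)])

frames = []
for x, y in positions:
    measurements = []
    if rng.random() < p_detect:  # missed detections during speech inactivity
        bearing = np.arctan2(y, x) + noise_std * rng.normal()
        measurements.append(bearing)
    # Clutter: Poisson-distributed number of uniformly random false bearings.
    n_clutter = rng.poisson(clutter_rate)
    measurements.extend(rng.uniform(-np.pi, np.pi, n_clutter))
    frames.append(np.array(measurements))

print(f"frame 0 bearings (rad): {np.round(frames[0], 2)}")

Each frame is an unordered, unlabelled set of bearings whose size varies from frame to frame, which is why a random-finite-set method such as the PHD filter, which avoids explicit measurement-to-speaker association, is a natural fit for this problem.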