(Closed) MSc Project on Gazeable Objects

MSc project on “Gazeable Objects”

Duration: about 6 months

Short description: Gaze is the direction in which a person is looking. The automatic estimation of gaze from a single image or from video has been a hot research topic in recent years [1-4]. Researchers have often studied gaze from a human-centered perspective, trying to answer the questions “where are people looking?” and “what are people looking at?”. In this Master’s thesis we propose to investigate an orthogonal direction: the automatic recognition of gazeable objects. In practice, this means estimating the areas of an image from which a particular object can be gazed at. Combining this research with state-of-the-art methods for “standard” gaze estimation should boost performance in many real-world applications, such as human-robot interaction or end-to-end active recognition.
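To make the task formulation more concrete, the minimal sketch below (in PyTorch) illustrates one possible way to pose gazeable-object recognition as dense heatmap prediction: given an image and a binary mask of the query object, a network outputs a per-pixel score indicating from which areas of the scene that object could be gazed at. The network name, architecture and input encoding are illustrative assumptions, not a prescribed design for the thesis.

# Illustrative sketch only: a hypothetical formulation of gazeable-object
# recognition as dense heatmap regression. GazeabilityNet and its layers
# are assumptions for illustration, not the project's method.
import torch
import torch.nn as nn

class GazeabilityNet(nn.Module):
    """Predicts, for each pixel, a score for whether the query object
    (given as a binary mask) can be gazed at from that location."""
    def __init__(self):
        super().__init__()
        # Input: 3 RGB channels + 1 channel encoding the query object mask.
        self.encoder = nn.Sequential(
            nn.Conv2d(4, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
        )
        # Output: one "gazeability" score per pixel.
        self.head = nn.Conv2d(64, 1, kernel_size=1)

    def forward(self, image, object_mask):
        x = torch.cat([image, object_mask], dim=1)          # (B, 4, H, W)
        return torch.sigmoid(self.head(self.encoder(x)))    # (B, 1, H, W)

# Usage on a dummy batch: one 128x128 image and a mask marking the object.
model = GazeabilityNet()
image = torch.rand(1, 3, 128, 128)
mask = torch.zeros(1, 1, 128, 128)
mask[:, :, 40:60, 40:60] = 1.0   # location of the query object
heatmap = model(image, mask)     # areas from which the object can be gazed at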

Environment: This project will be carried out in the Perception Team at Inria Grenoble Rhône-Alpes, in collaboration with the University of Granada. The research progress will be closely supervised by Dr. Xavier Alameda-Pineda, Dr. Pablo Mesejo-Santiago and Dr. Radu Horaud, head of the Perception Team. The Perception Team has the necessary computational resources (GPU & CPU) to carry out the proposed research.

References:
[1] Recasens, A., Khosla, A., Vondrick, C., & Torralba, A. (2015). Where are they looking? In NIPS.
[2] Zhang, X., Sugano, Y., Fritz, M., & Bulling, A. (2015). Appearance-based gaze estimation in the wild. In CVPR.
[3] Recasens, A., Vondrick, C., Khosla, A., & Torralba, A. (2017). Following gaze in video. In ICCV.
[4] Chong, E., Ruiz, N., Wang, Y., Zhang, Y., Rozga, A., & Rehg, J. (2018). Connecting Gaze, Scene, and Attention: Generalized Attention Estimation via Joint Modeling of Gaze and Scene Saliency. arXiv preprint.