Modeling Kinesthetic and Tactile Properties of Virtual Environments for Immersive Experiences

Joint Laboratory between Inria and InterDigital

The PhD candidate will join NEMO.AI, a joint laboratory between Inria and the company InterDigital, dedicated to research and applications in Virtual Reality in the context of the Metaverse.

Context

The democratization of virtual and augmented reality has strongly re-invigorated research on haptic devices for enriching immersive experiences. However, the wide range of available haptic devices raises the need for standardized methods to encode and decode haptic signals and to author haptic experiences [Danieau 2012, Danieau 2018, Li 2021]. While kinesthetic feedback is strongly coupled with the underlying physical simulation, tactile rendering is mainly defined by ad-hoc authoring processes. For example, as of today, there is no obvious, generalized way to assign haptic properties to a virtual object, and most haptic rendering setups rely on custom, device-specific data formats. Even "holistic" systems [Kammermeier 2004][Yang 2005][Drif 2008], which aim at an exhaustive combination of haptic actuators, did not clearly address the question of holistic haptic data. This lack of a standard representation impedes the whole computer haptics pipeline, from acquisition to rendering. A common, standardized way of defining haptic data would help unify the approaches, simplifying authoring and ensuring interoperability.
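To make the missing representation concrete, below is a minimal sketch of what a device-agnostic encoding of per-object haptic properties could look like, written in C# since the position calls for Unity/C# experience. The HapticMaterial type, its fields, and the IHapticRenderer interface are illustrative assumptions, not an existing standard:

```csharp
using System;

// Hypothetical device-agnostic description of an object's haptic properties.
// Field names and units are illustrative; a real standard would pin them down.
[Serializable]
public class HapticMaterial
{
    // Kinesthetic parameters, consumed by force-feedback simulation.
    public float stiffness;        // N/m, resistance on contact
    public float staticFriction;   // dimensionless coefficient
    public float dynamicFriction;  // dimensionless coefficient

    // Tactile parameters, consumed by vibrotactile/electrotactile renderers.
    public float roughness;            // 0..1 perceptual scale
    public float[] vibrationSpectrum;  // amplitude per frequency band

    // Thermal parameter for devices that support it.
    public float surfaceTemperature;   // degrees Celsius
}

// Each device-specific renderer decodes the same material into its own signals,
// which is what makes the description interoperable across devices.
public interface IHapticRenderer
{
    // True if this device can express at least part of the material.
    bool CanRender(HapticMaterial material);

    // Maps the abstract properties onto device-specific actuator commands.
    void Render(HapticMaterial material, float contactPressure, float slidingSpeed);
}
```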

Furthermore, in the realm of immersive virtual reality, the user's virtual representation (i.e., their avatar) is now ubiquitous: it has become the medium through which users interact with the virtual environment, as well as the main source of tactile and kinesthetic sensations. It can also be used to modulate the perceived haptic sensations [Jauregui 2014], enabling a wider range of sensations than those provided by haptic actuators alone. In this context, haptic rendering is strongly coupled with both the user's actions and capabilities, a coupling that has rarely been addressed in the haptics literature.

Main Goal

The goal of this PhD is to propose a complete haptics pipeline dedicated to augmented and virtual reality experiences. The pipeline should be agnostic to the haptic device used and will integrate both the definition of the haptic properties of the virtual environment and the interaction characteristics and preferences of the user. The rendering of haptic properties should be driven not only by the properties of the virtual object, but also by the user's actions in the virtual environment [Vizcay 2022]; thus, the interaction capabilities of the user should be taken into account [Dewez 2021]. Finally, haptic sensations should be congruent with the user's actual actions in order to avoid an uncanny-valley effect [Berger 2018], which is strongly linked with the notions of presence [Skarbez 2017] and virtual embodiment [Kilteni 2012].
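As a rough illustration of this action-dependent rendering (a sketch only, not the method to be developed in the thesis; the InteractionState type, its fields, and the scaling rule are assumptions):

```csharp
using UnityEngine;

// Hypothetical per-frame interaction state sampled from the user's avatar.
public struct InteractionState
{
    public Vector3 handVelocity;  // tracked hand velocity in world space
    public float gripStrength;    // normalized grip input, 0..1
    public float maxComfortForce; // per-user capability/preference, in newtons
}

public static class ActionAwareHaptics
{
    // Scales an object's nominal stiffness response by how the user is acting
    // on it, and clamps the result to the user's comfort limit, so that the
    // rendered sensation stays congruent with the actual action.
    public static float ContactForce(float stiffness, float penetrationDepth,
                                     InteractionState state)
    {
        // Firmer grasps get a stronger response (illustrative rule only).
        float actionGain = Mathf.Lerp(0.5f, 1.5f, state.gripStrength);
        float force = stiffness * penetrationDepth * actionGain;
        return Mathf.Min(force, state.maxComfortForce);
    }
}
```

The point of the sketch is the signature: the same object property (stiffness) produces different sensations depending on the user's state, which is exactly the coupling the pipeline should formalize.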

Objectives

The main tasks conducted in this PhD will be:

  1. Study the state of the art of haptic authoring and virtual environment definition.
  2. Propose novel methods to encode and decode haptic properties of the virtual environment, taking into account not only the authoring of the haptic properties but also the interaction capabilities of the user.
  3. Develop proofs of concept to showcase the proposed methods.
  4. Conduct user evaluations to ensure the viability of the proposed methods.

Requirements for Candidacy

  • Experience in Virtual Reality, Computer Graphics, or Haptics
  • Experience with Unity/C#
  • Fluent in English (reading, writing, and speaking)

Contacts

We are looking for motivated candidates. Please send a CV, a motivation letter, reference letters, and any relevant material to: ferran.argelaguet@inria.fr, anatole.lecuyer@inria.fr, quentin.galvane@interdigital.com, and philippe.guillotel@interdigital.com.

References

Dewez, D., Hoyet, L., Lécuyer, A., & Argelaguet Sanz, F. (2021). Towards "avatar-friendly" 3D manipulation techniques: Bridging the gap between sense of embodiment and interaction in virtual reality. ACM CHI Conference on Human Factors in Computing Systems, pp. 1-14.

Kammermeier, P., Kron, A., Hoogen, J., & Schmidt, G. (2004). Display of holistic haptic sensations by combined tactile and kinesthetic feedback. Presence: Teleoperators and Virtual Environments, 13(1), 1-15.

Yang, G. H., Kyung, K. U., Jeong, Y. J., & Kwon, D. S. (2005). Novel haptic mouse system for holistic haptic display and potential of vibrotactile stimulation. IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 1980-1985.

Drif, A., Le Mercier, B., & Kheddar, A. (2008). Design of a multilevel haptic display. The Sense of Touch and its Rendering, pp. 207-224.

Costes, A., Danieau, F., Argelaguet, F., Lécuyer, A., & Guillotel, P. (2018). Haptic material: A holistic approach for haptic texture mapping. International Conference on Human Haptic Sensing and Touch Enabled Computer Applications, pp. 37-45.

Okamoto, S., Nagano, H., & Yamada, Y. (2012). Psychophysical dimensions of tactile perception of textures. IEEE Transactions on Haptics, 6(1), 81-93.

Vizcay, S., Kourtesis, P., Argelaguet, F., Pacchierotti, C., & Marchal, M. (2022). Design and evaluation of electrotactile rendering effects for finger-based interactions in virtual reality. ACM Symposium on Virtual Reality Software and Technology (pp. 1-11).

Kadlecek, P. (2011). Overview of current developments in haptic APIs. Proceedings of CESCG 2011.

Sourin, A., & Wei, L. (2008). Visual immersive haptic rendering on the web. Proceedings of the 7th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and Its Applications in Industry.

Danieau, F., Lécuyer, A., Guillotel, P., Fleureau, J., Mollet, N., & Christie, M. (2012). Enhancing audiovisual experience with haptic feedback: A survey on HAV. IEEE Transactions on Haptics, 6(2), 193-205.

Danieau, F., Guillotel, P., Dumas, O., Lopez, T., Leroy, B., & Mollet, N. (2018). HFX Studio: Haptic editor for full-body immersive experiences. Proceedings of the 24th ACM Symposium on Virtual Reality Software and Technology, pp. 1-9.

Li, Y., Yoo, Y., Weill-Duflos, A., & Cooperstock, J. (2021). Towards context-aware automatic haptic effect generation for home theatre environments. Proceedings of the 27th ACM Symposium on Virtual Reality Software and Technology, pp. 1-11.

Jauregui, D. A. G., Argelaguet, F., Olivier, A. H., Marchal, M., Multon, F., & Lécuyer, A. (2014). Toward "pseudo-haptic avatars": Modifying the visual animation of self-avatar can simulate the perception of weight lifting. IEEE Transactions on Visualization and Computer Graphics, 20(4), 654-661.

Berger, C. C., Gonzalez-Franco, M., Ofek, E., & Hinckley, K. (2018). The uncanny valley of haptics. Science Robotics, 3(17), eaar7010.

Kilteni, K., Groten, R., & Slater, M. (2012). The sense of embodiment in virtual reality. Presence: Teleoperators and Virtual Environments, 21(4), 373-387.

Skarbez, R., Brooks Jr., F. P., & Whitton, M. C. (2017). A survey of presence and related concepts. ACM Computing Surveys (CSUR), 50(6), 1-39.


Reminder of the Ys.AI project: specification of the physical capabilities of objects and characters for interaction engines. Today, interaction engines rely on specifically designed rules and actions based on the targeted rendering device. It would be more efficient to associate the interaction properties with the object, scene, or character; the rendering engine would then adapt/convert these properties to the capabilities of the device, as sketched below. Such interaction properties are already considered in existing formats such as glTF or MPEG, but these remain simple interaction models based on collision.
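A minimal sketch of that adaptation step, assuming hypothetical AuthoredHaptics and DeviceCapabilities types (these are illustrative, not actual glTF or MPEG constructs):

```csharp
using System;

// Hypothetical capability profile reported by a connected haptic device.
public class DeviceCapabilities
{
    public bool hasForceFeedback;
    public bool hasVibrotactile;
    public float maxForceNewtons;
}

// Hypothetical authored properties attached to a scene object,
// e.g. as a glTF-style extension on the node.
public class AuthoredHaptics
{
    public float stiffness;          // N/m
    public float vibrationAmplitude; // 0..1
}

public static class PropertyAdapter
{
    // The engine, not the author, reconciles properties with the device:
    // kinesthetic cues are clamped to the device's force range, and when no
    // force feedback exists, stiffness is approximated by a vibrotactile
    // pulse on contact (a common, if lossy, fallback).
    public static (float force, float vibration) Adapt(
        AuthoredHaptics authored, DeviceCapabilities device, float penetrationDepth)
    {
        float force = 0f, vibration = 0f;
        if (device.hasForceFeedback)
            force = Math.Min(authored.stiffness * penetrationDepth,
                             device.maxForceNewtons);
        else if (device.hasVibrotactile)
            vibration = Math.Min(1f, authored.vibrationAmplitude
                                     + 0.5f * penetrationDepth);
        return (force, vibration);
    }
}
```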
