Tactile User Interfaces for Virtual Reality using High-Density Electrotactile Feedback

Location: Inria Rennes / IRISA
Starting Date: From October 2020
Duration: up to six months


  • Ferran Argelaguet (Hybrid), ferran.argelaguet@inria.fr
  • Maud Marchal (Rainbow), maud.marchal@irisa.fr
  • Claudio Pacchierotti (Rainbow), claudio.pacchierotti@irisa.fr


Research in haptics has increased notably in recent years, with the objective of bringing wearable haptic systems to the public; providing high-fidelity feedback and natural-like sensations has the potential to revolutionize the VR industry. For example, Microsoft and Facebook have undertaken notable research efforts in this direction in recent years, with numerous works on wearable systems and perceptual studies.

Among existing haptic devices, tactile displays are the best-placed candidates to achieve this goal. Tactile displays provide feedback to the user by stimulating the skin mechanically (e.g., vibration motors) or electrically in order to simulate physical properties or to convey information. For example, they are commonly used as feedback interfaces in virtual reality applications and teleoperation, as well as in prosthetics, to provide sensory information from the missing limb. Although a wide range of tactile display systems exists, such as ultrasound [1], air streams [2], and pin-matrix displays [3], they are either too complex to scale to large workspaces, or can render only basic information (e.g., notifications) [4].

In contrast, electrotactile displays can provide tactile feedback with high resolution and/or communicate multiple variables simultaneously [5]. Electrotactile displays deliver low-intensity electrical current to the skin in order to activate cutaneous nerve fibers and elicit tactile sensations. Electrotactile interfaces have a simple structure, low power consumption, and low cost, since they contain no moving mechanical elements. Furthermore, they can integrate a large number of tactile electrodes and allow independent modulation of the stimulation parameters (e.g., location, intensity, and frequency).
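To give an idea of what independent per-electrode modulation means in practice, here is a minimal sketch. The electrode indexing, parameter names, and safety limits are illustrative assumptions, not the API of any real device:

```python
# Hypothetical sketch of per-electrode stimulation control.
# The limits and layout below are assumptions for illustration only.

MAX_INTENSITY_MA = 5.0     # assumed safety limit, milliamps
MAX_FREQUENCY_HZ = 400.0   # assumed safety limit, pulses per second

def make_stimulus_frame(electrodes):
    """Build one stimulation frame from per-electrode settings.

    `electrodes` maps an electrode index (a location on the palm) to an
    (intensity_mA, frequency_Hz) pair, so each parameter mentioned above
    (location, intensity, frequency) is set independently per electrode.
    """
    frame = {}
    for index, (intensity, frequency) in electrodes.items():
        # Clamp to the assumed safety limits before sending to hardware.
        frame[index] = (
            min(max(intensity, 0.0), MAX_INTENSITY_MA),
            min(max(frequency, 0.0), MAX_FREQUENCY_HZ),
        )
    return frame

# Example: stronger, faster stimulation under the index finger (electrode 3)
# than under the heel of the palm (electrode 14).
frame = make_stimulus_frame({3: (2.5, 200.0), 14: (0.8, 50.0)})
```

This independence per electrode is what allows a high-density array to render spatial patterns across the palm rather than a single uniform sensation.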


The objective of the internship is to lay the foundations of a new generation of “Tactile” User Interfaces (TUIs) leveraging high-density electrotactile feedback in the user’s palm. While tactile feedback is mainly used to enhance dexterous interaction (e.g., grasping virtual objects), other promising usages are envisioned for the enhancement of other virtual reality tasks, such as object selection [6,7], manipulation [8], virtual navigation [9], or application control [10]. Current interaction techniques rely mainly on visual information to drive the perception-action loop. By introducing tactile feedback, we aim to enrich the information exchange between the virtual environment and the user in order to increase the user’s awareness of the interaction state. Furthermore, the ability to render tactile feedback will enable “blind” interactions, as users will be able to interact without looking directly at the interface.

Candidate Requirements

The candidate should be comfortable with as many of the following items as possible:

  • Experience in the development of 3D/VR applications (e.g. Unity3D).
  • Knowledge of haptics or human-computer interaction.
  • Good spoken and written English.
  • Good communication skills.


[1] T. Carter, S. Seah, B. Long, B. Drinkwater, and S. Subramanian, “UltraHaptics: multi-point mid-air haptic feedback for touch surfaces,” in Proc. 26th Annual ACM Symposium on User Interface Software and Technology (UIST), pp. 505–514, 2013. https://doi.org/10.1145/2501988.2502018
[2] R. Sodhi, I. Poupyrev, M. Glisson, and A. Israr, “AIREAL: interactive tactile experiences in free air,” ACM Transactions on Graphics, vol. 32, no. 4, article 134, 2013. https://doi.org/10.1145/2461912.2462007
[3] M. Shimojo, M. Shinohara, and Y. Fukui, “Human shape recognition performance for 3D tactile display,” IEEE Transactions on Systems, Man, and Cybernetics-Part A: Systems and Humans, vol. 29, no. 6, pp. 637–644, 1999. https://doi.org/10.1109/3468.798067
[4] L. A. Jones and N. B. Sarter, “Tactile displays: guidance for their design and application,” Human Factors, vol. 50, no. 1, pp. 90–111, 2008. https://doi.org/10.1518/001872008X250638
[5] J. Hummel, J. Dodiya, L. Eckardt, R. Wolff, A. Gerndt, and T. W. Kuhlen, “A lightweight electrotactile feedback device for grasp improvement in immersive virtual environments,” in Proc. IEEE Virtual Reality, pp. 39–48, 2016. https://doi.org/10.1109/VR.2016.7504686
[6] F. Argelaguet and C. Andujar, “A survey of 3D object selection techniques for virtual environments,” Computers & Graphics, 2013. https://doi.org/10.1016/j.cag.2012.12.003
[7] E. Charoenchaimonkon, P. Janecek, M. N. Dailey, and A. Suchato, “A comparison of audio and tactile displays for non-visual target selection tasks,” in Proc. 2010 International Conference on User Science and Engineering (i-USEr), pp. 238–243, 2010. https://doi.org/10.1109/IUSER.2010.5716759
[8] A. Girard, M. Marchal, F. Gosselin, A. Chabrier, F. Louveau, and A. Lécuyer, “HapTip: displaying haptic shear forces at the fingertips for multi-finger interaction in virtual environments,” Frontiers in ICT, vol. 3, article 6, 2016. https://doi.org/10.3389/fict.2016.00006
[9] A. Cassinelli, C. Reynolds, and M. Ishikawa, “Augmenting spatial awareness with Haptic Radar,” in Proc. 10th IEEE International Symposium on Wearable Computers, pp. 61–64, 2006. https://doi.org/10.1109/ISWC.2006.286344
[10] T. N. Smyth and A. E. Kirkpatrick, “A new approach to haptic augmentation of the GUI,” in Proc. 8th International Conference on Multimodal Interfaces, pp. 372–379, 2006. https://doi.org/10.1145/1180995.1181064
