
Keywords


Partners | Funders

Ultraleap

Publications

Neate, T., Alvares Rodrigues De Souza Maffra, S., Frier, W., You, Z., & Wilson, S. (Accepted/In press). Using Mid-Air Haptics to Guide Mid-Air Interactions. In 19th International Conference of Technical Committee 13 (Human-Computer Interaction) (INTERACT). Springer. https://kclpure.kcl.ac.uk/ws/portalfiles/portal/225129157/Mid_Air_Haptics_Guidance_INTERACT_2023_CAMERA_READY.pdf

Project Summary

Mid-air interfaces are becoming increasingly prevalent in everyday life, from controlling car radios with gestures to immersive interactions in virtual reality (VR). However, because these interfaces lack the physical cues of traditional input devices, users often struggle to understand how to interact with them effectively. This project explores how mid-air haptics – ultrasound-based tactile feedback projected directly onto the skin – can guide gestural input.


Through iterative design and development, we created mid-air haptic stimuli that convey specific gestures without the need for accompanying visual or verbal instructions. These haptic cues aim to guide users, opening possibilities for interaction in environments where visual feedback is unavailable or to complement existing visual prompts. Our findings have implications for the design of interactive systems, especially VR, and automotive interfaces.


This project has laid the foundation for future work, in which we aim to build upon existing City research (the GReAT project) to consider how we might support remote communication contexts (e.g. videoconferencing) for users with aphasia, who might benefit from gestural interactions when communicating.

Project Members at City


Guiding Mid-Air Interactions with Mid-Air Haptics