Empowering Accessible Gesture Input Design with Gesture-A11Y

Abstract: Understanding end-user performance with gesture input is essential for designing intuitive and effective interactions. Unfortunately, open gesture datasets are scarce, particularly those addressing users with impairments, which hinders advancements in accessible and inclusive user interface design for devices and applications featuring gesture interactions. To address this, we introduce Gesture-A11Y, a web-based tool with a large database intended to help identify gestures that align with users’ abilities and preferences. Gesture-A11Y offers access to over 22,000 records of touchscreen, mid-air, on-body, and on-wheelchair gestures performed by users with various visual and/or motor abilities, along with their preferences and perceptions of gesture input across different mobile and wearable devices.

Authors: Mihail Terenti, Laura-Bianca Bilius, Ovidiu-Ciprian Ungurean, Radu-Daniel Vatavu

Conference: W4A ’25, the 22nd International Web for All Conference

Publication: ACM, New York, NY, USA

Recognition: Nominated for the Best Communication Paper Award.

Link: https://dx.doi.org/10.1145/3744257.3744267