Project Status

Good Accessibility, Handcuffed Creativity: AI-Generated UIs Between Accessibility Guidelines and Practitioners’ Expectations

This study examines the capacity of AI-powered UI tools to generate accessible designs, offering insight into their limitations, identifying key barriers, and proposing ways to better integrate AI into creative design workflows.

Gesture-A11Y: A Large-Scale Hub for Accessible Gesture Input

This article presents Gesture-A11Y, a web-based, open-source tool and database with over 22,000 gesture records from users with visual or motor disabilities, supporting inclusive and accessible gesture interface design. The paper received the Accessibility Challenge Judges Award.

Empowering Accessible Gesture Input Design with Gesture-A11Y

This article presents Gesture-A11Y, a tool that collects gesture data from users with disabilities to support inclusive design; the paper was shortlisted for the Best Communication Paper Award.

When LLM-Generated Code Perpetuates User Interface Accessibility Barriers, How Can We Break the Cycle?

The paper evaluates how well LLMs such as ChatGPT and Claude generate accessible web interfaces. Accessibility-focused prompts improve results, but challenges remain, especially in semantic structure, highlighting the need for LLMs with a deeper understanding of accessibility and greater context awareness.

Insights and Implications of Evaluating Accessibility Compliance in AI-Generated Web Interfaces

The study evaluates how well AI design tools meet accessibility standards, revealing moderate compliance issues (mainly with text contrast and target size) and unexpectedly finding that accessibility-focused prompts may reduce, rather than improve, compliance unless refined through iteration.

Breaking Bad (Design): Challenging AI User Interface Accessibility Guardrails

The study investigates how AI design tools handle requests to create intentionally inaccessible interfaces, revealing that these tools are constrained by their usability-focused training and lack the flexibility to deviate thoughtfully based on context.

Distal-Haptic Touchscreens: Understanding the User Experience of Vibrotactile Feedback Decoupled from the Touch Point

The study explores how haptic feedback delivered to different body areas affects the user experience with touchscreen interaction, highlighting the potential of using distal locations such as the wrist or abdomen.

Intermanual Deictics: Uncovering Users' Gesture Preferences for Opposite-Arm Referential Input, from Fingers to Shoulder

This study investigates intermanual deictic gestures, in which one hand points at or interacts with the opposite arm, to understand user preferences for referential input. Findings highlight a strong inclination toward physical-contact gestures performed with the index finger, informing the design of intuitive interactive systems.