How EZ Access Came To Be
Introduction
Understanding the basis of EZ Access is critical if you work with self-service, kiosks, or ATMs. Too often, teams treat accessibility as a checklist or bolt-on, without appreciating the decades of research, policy work, and real-world deployments that shaped this particular approach. EZ Access is not just a keypad or an audio jack; it is an entire interaction model and UI ecosystem that affects how you design, buy, deploy, and certify devices. If you do not understand its foundations, you are flying blind on true usability. Once only available via license, EZ Access has been free for all to use for some time. Thanks to Gregg Vanderheiden for providing the source information.
The Trace Center and EZ Access
The Trace Center’s EZ Access work grew out of decades of research on how to make technology usable for people with a wide range of disabilities, culminating in a practical, repeatable approach to accessible kiosks and information/transaction machines (ITMs). For self-service practitioners, it represents a bridge between high-level universal design principles and concrete, field-tested interaction patterns that can be implemented at scale. (Vanderheiden et al., 2022)
From communication boards to public machines
The Trace Center began in 1971, when a group of students gathered to build a custom communication system for a 12-year-old boy with severe cerebral palsy who could only “talk” by slowly pointing to letters on a wooden board. That work forced the team to solve problems around noisy input, limited motor control, fatigue, and real-time feedback—issues that show up again decades later in kiosk design.
The group evolved into the Trace Research & Development Center, becoming a key player in augmentative and alternative communication (AAC), assistive technology, and early computer access. Along the way, the Center helped shape concepts like universal/inclusive design and “electronic curb cuts,” emphasizing that solutions for people with disabilities often improve usability for everyone.
Why kiosks became a focus
By the late 1990s, self-service devices—ATMs, ticket machines, government service kiosks, voting systems—were becoming critical access points for everyday services, but they were largely designed for sighted, hearing, and able-bodied users. For people with disabilities, these devices could be effectively locked doors: no staff in sight, timeouts on screens, and no way to override poor physical placement or confusing interfaces.
Trace recognized that as services moved from staffed counters to unattended machines, accessibility failures would directly block civic participation, travel, banking, and healthcare. The Center’s earlier work on consumer electronics and web guidelines gave it a foundation for translating accessibility principles into the more constrained, time-limited context of kiosks and ITMs.
The birth of EZ Access
EZ Access emerged around 1999 as a coherent interaction model for making a variety of public information and transaction machines usable by people with different sensory, physical, and cognitive abilities. Rather than reinventing the interface for each disability, Trace created a small set of consistent interface techniques that could be learned once and then reused across machines and vendors.
Key ideas behind EZ Access included:
- A dedicated set of physical controls (typically a few tactilely distinct buttons) located in a consistent, reachable place across devices.
- Audio output, with private listening via a standard headphone jack and speech that walks users through the screen content and actions.
- A logical “focus” and navigation model that lets users move through on-screen elements step by step without needing to see or accurately touch the screen.
- Mode-less operation, where standard and special interfaces can be used interchangeably and in parallel.
This model allowed a blind user, for example, to plug in headphones and use audio guidance plus a small set of buttons to move through options and confirm selections across different kiosk types. The EZ Access user interface could be used by people who were blind, had low vision, had reading problems, used prosthetics that did not work with touchscreens, or had limited reach, and it allowed users to operate the tactile buttons and touchscreen interchangeably. [2]
Practical patterns for kiosk designers
EZ Access turned abstract accessibility goals into practical design patterns that could be applied to real-world kiosks. [2]
Consistent control cluster
- Use a small group of tactilely identifiable buttons (e.g., up, down, select, help, back) placed in the same relative location on every machine.
- Ensure high contrast, tactile labels, and clear affordances so users can operate them without vision or fine motor control.
Audio guidance and discoverability
- Provide a standard audio start action (e.g., inserting headphones) that consistently initiates spoken instructions.
- Structure the audio so it reflects the screen hierarchy, letting users step through headings, choices, and confirmations in short segments that serve both blind and sighted users without overwhelming anyone with long monologues.
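Where the kiosk UI runs in a browser, a minimal sketch of this chunked announcement style might look like the following, using the standard Web Speech API; the KioskScreen shape and announceScreen function are illustrative inventions, not part of any EZ Access specification.

```typescript
// Minimal sketch, assuming a browser-based kiosk UI with the Web Speech API.
// KioskScreen and announceScreen are invented names for illustration.

interface KioskScreen {
  heading: string;   // what this screen is for
  choices: string[]; // the actions a user can take
}

function speak(text: string): void {
  window.speechSynthesis.speak(new SpeechSynthesisUtterance(text));
}

// Speak the heading first, then each choice as its own short utterance.
// Short, separate utterances let the user interrupt at any point rather
// than sit through one long monologue.
function announceScreen(screen: KioskScreen): void {
  window.speechSynthesis.cancel(); // stop anything still playing
  speak(screen.heading);
  screen.choices.forEach((choice, i) => speak(`Option ${i + 1}: ${choice}`));
}
```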
Focused navigation instead of free pointing
- Instead of requiring accurate touch, let users move focus from one actionable item to the next, with audio feedback on where they are and what will happen if they activate it.
- This benefits users who are blind or who have low vision, tremors, or limited range of motion, and it reduces error rates for everyone. (Vanderheiden et al., 2022)
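A minimal sketch of such a focus model follows, assuming a flat list of actionable items and a pluggable speech function; FocusNavigator and KioskItem are hypothetical names, not an EZ Access API.

```typescript
// Minimal sketch of focus-based navigation in the spirit of EZ Access.
// All names here are invented for illustration.

interface KioskItem {
  label: string;        // text spoken when the item gains focus
  activate: () => void; // what happens on "select"
}

type Speak = (text: string) => void; // TTS or recorded prompts

class FocusNavigator {
  private index = 0;

  // Assumes at least one item on the screen.
  constructor(private items: KioskItem[], private speak: Speak) {}

  // Step focus to the next or previous item and announce it, so the user
  // never has to see or accurately touch the screen.
  next(): void {
    this.index = (this.index + 1) % this.items.length;
    this.announce();
  }

  prev(): void {
    this.index = (this.index - 1 + this.items.length) % this.items.length;
    this.announce();
  }

  // Announce before acting: the user always hears what "select" will do.
  announce(): void {
    this.speak(this.items[this.index].label);
  }

  select(): void {
    this.items[this.index].activate();
  }
}
```

Because touch taps and tactile buttons can both drive the same navigator, this structure also supports the mode-less operation described earlier, where standard and special interfaces work interchangeably.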
Cross-disability support
- Design the interaction so the same control set helps users who are blind, have low vision, have limited dexterity, or have some cognitive impairments, rather than requiring separate modes per disability.
- Provide timeouts and error handling that are forgiving, with simple ways to go back, repeat instructions, or cancel without losing all progress.
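One hedged way to implement the forgiving-timeout idea is to reset an inactivity timer on every input and fire a warning (spoken and displayed) instead of silently ending the session; InactivityGuard below is a hypothetical name, not an EZ Access component.

```typescript
// Minimal sketch of a forgiving inactivity timeout. Invented names throughout.

class InactivityGuard {
  private timer?: ReturnType<typeof setTimeout>;

  constructor(
    private warnAfterMs: number,
    private onWarn: () => void // e.g. speak "Are you still there? Press any key."
  ) {}

  // Call on every key press or touch; the session is never cut off
  // while the user is actively working, however slowly.
  reset(): void {
    if (this.timer) clearTimeout(this.timer);
    this.timer = setTimeout(this.onWarn, this.warnAfterMs);
  }

  stop(): void {
    if (this.timer) clearTimeout(this.timer);
  }
}
```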
Pilots, voting prototypes, and field experience
Trace’s EZ Access concepts were prototyped and tested in multiple kiosk domains, including public information kiosks, ticketing systems, and voting machines. Cross-disability voting prototypes demonstrated that a single machine could serve voters with a wide range of needs, reducing the need for segregated equipment or separate processes. [2]
These prototypes explored issues like privacy for blind voters, error recovery when using audio navigation, and ways to confirm selections both visually and audibly. The work also highlighted infrastructure concerns such as standardizing headphone jacks, ensuring volume control, and dealing with ambient noise in public locations. Experience across deployments fed back into refined button layouts, improved audio scripting practices, and better strategies for training staff and end users.
Standards, policy, and the broader impact
EZ Access did not live only as a set of prototypes; it contributed to the thinking behind later standards and regulations covering kiosks and ITMs. Trace’s work intersected with U.S. accessibility law (Section 255 of the Telecommunications Act and Section 508 of the Rehabilitation Act), web standards like WCAG, and emerging guidance on accessible electronic consumer products. [2]
A core insight was that being technology-agnostic at the policy level (focusing on outcomes rather than specific devices) allowed models like EZ Access to influence a broad set of platforms. Trace also emphasized tools—checklists, guidelines, and test methods—that helped manufacturers and integrators evaluate self-service systems against concrete accessibility requirements. As a result, many modern kiosks incorporate descendants of EZ Access concepts: dedicated tactile controls, headphone jacks, audio navigation, and focus-based interaction. Even when not branded as EZ Access, the underlying patterns have influenced how major vendors and agencies approach accessible self-service.
Takeaways for today’s self-service teams
For organizations deploying or specifying kiosks and ITMs today, the EZ Access story offers several lessons.
- Design once, benefit many: A unified control and navigation model simplifies training, support, and maintenance while helping multiple disability groups.
- Consistency across contexts: Users should be able to learn a handful of interaction patterns and reuse them on ticketing, check-in, payment, and government kiosks.
- Treat accessibility as core infrastructure: Plan for audio, tactile controls, and focus-based navigation as part of the base design, not as afterthoughts or “special modes.”
- Use standards, but design beyond compliance: Regulations set the floor; EZ Access shows how to turn compliance into a coherent, efficient user experience.
For self-service professionals, EZ Access is less a single product and more a mature design language for accessible kiosks—rooted in decades of research, refined in real deployments, and still highly relevant as new self-service form factors and AI-powered interfaces emerge. [2]
References
Vanderheiden, G., Lazar, J., Lazar, A., Kacorri, H., & Jordan, J. B. (2022). Technology and disability: 50 years of Trace R&D Center contributions and lessons learned. Springer Nature. PDF: Vanderheiden-et-al_20220322-Smaller-a.pdf
[2](https://kioskindustry.org/accessibility-history-trace/)
Next Generation?
Vanderheiden has been busy. Below is a link to a recent paper, along with a summary.
The paper argues that today’s product-by-product accessibility model will never scale to cover all people and all technologies, and proposes a complementary “Info‑bot + individualized UI generators” approach to reach near‑universal access over the next few decades.
Problem the paper identifies
- Accessibility has always been playing catch‑up to rapidly changing tech, so most products and sites remain inaccessible, especially for people with complex, multiple, or cognitive disabilities.
- The current strategy relies on each manufacturer both building “born‑accessible” features and exposing APIs so assistive tech can plug in, which fails when products are closed, vendors under‑invest, or AT does not exist for a given need.
Future of Interface Workshop and R&D agenda
- A 2023 “Future of Interface Workshop” (co‑chaired by Vint Cerf and Gregg Vanderheiden) mapped where interfaces (AI, XR, brain–computer interfaces, etc.) might be in ~20 years and what new barriers and opportunities they create.
- The resulting R&D agenda calls for embedding accessibility into the “DNA” of emerging technologies, investing heavily in AI‑driven intelligent agents, and studying societal, policy, privacy, and bias issues so future systems are both inclusive and safe.
Proposed new approach: Info‑bot + IUIGs
- The paper proposes shifting the primary focus from “get every company to build and expose accessibility” to building an open‑source “information robot” (Info‑bot) that can see, understand, and operate any standard user interface as a typical median user would—without needing cooperation from the manufacturer.
- On top of that, “Individual User Interface Generators” (IUIGs) would transform the Info‑bot’s understanding into a personalized interface for each person, tuned to their specific abilities, preferences, culture, and knowledge, and running locally for privacy.
How this compares to today’s model
- In the comparison table, the current approach might reach 5–10% of products (maybe 10–20% with AI help) and only a small fraction of disability combinations; it also offers no safety net for closed or non‑API products.
- With a mature Info‑bot plus a rich ecosystem of IUIGs, the authors argue that ~99% of products that a median user can operate could become accessible, with coverage limited mainly by how many IUIGs exist and how well they match different disabilities.
Implications, risks, and next steps
- This new model would not replace today’s accessibility work; instead, it serves as a safety net for all products where built‑in accessibility or AT compatibility is absent or inadequate.
- It requires substantial R&D in AI, interface understanding, and personalized UI design, plus policy changes toward outcome‑based regulation and strong safeguards around privacy, bias, and security for people with disabilities.
Touchscreens
- Touchscreens are treated as a major example of “closed” or hard‑to‑adapt interfaces where traditional assistive technologies often cannot hook in, which motivates the Info‑bot/IUIG idea to work directly from the standard UI instead of APIs.
- The authors describe future interfaces (including XR and other rich UIs) as at least as complex as today’s touch‑based systems, and argue that a median‑user‑operable visual interface (like a touchscreen UI) is enough for an Info‑bot to drive and then re-present through a personalized IUIG.
Voice
- Voice is mentioned as one of several input/output modalities that intelligent agents and future interfaces can use, but the paper’s focus is not on “voice control” as a standalone accessibility solution; it is one component within multi‑modal, AI‑driven interaction.
- The authors note that relying solely on one modality (such as voice) will not cover all types, degrees, and combinations of disability, which is part of why they advocate Info‑bot + IUIGs to flexibly support many modalities rather than just screen readers or voice systems.
HCII – Rethinking Our Approach to Accessibility in the Era of Rapidly Emerging Technologies
Related Resource
And Accessibility Matters, e.g., Home Depot
Accessibility Lawsuits Put Self-Checkout Back in the Spotlight
Large retailers are once again being reminded that self-checkout accessibility is not optional.
The Home Depot recently faced legal action tied to the accessibility of its in-store payment and self-checkout systems. The case alleged that certain payment terminals were not independently usable by blind or visually impaired customers, citing the absence of accessible audio output, tactile controls, or consistent assistance processes—issues that fall squarely under the Americans with Disabilities Act (ADA).
As part of a class-action settlement, Home Depot agreed to ensure that at least one accessible payment terminal is available in each U.S. store, along with software updates and staff training to better support customers with disabilities. While the settlement did not require an admission of wrongdoing, it reinforces a growing legal consensus: if self-service technology replaces staffed checkout, it must be accessible by default. Separately, and often confused with the accessibility claims, Home Depot was also named in a biometric privacy lawsuit related to alleged facial-recognition use at self-checkout. That case was later dismissed and focused on privacy compliance rather than ADA accessibility.
Why this matters
For retailers, kiosk operators, and POS deployers, these cases underscore a critical takeaway:
- Accessibility gaps in self-checkout are no longer theoretical risks.
- Courts and regulators increasingly view inaccessible self-service as a barrier to equal access—particularly when alternative staffed options are limited or removed.
- For kiosk designers and deployers, accessibility must be treated as a core system requirement, not a retrofit.
What About The Modern AudioNav and NavPad?
The most common devices used today are the Storm Interface AudioNav and NavPad.
Storm Interface’s AudioNav and NavPad accessibility keypads are produced under a license to NCR’s design rights for the keypad enclosure and appearance, including NCR U.S. Design Patent D687,783 and related European design registrations.
The earlier 5-key and 8-key keypads, as well as the linear keypads developed by Trace and built by Storm, are also available, and the design patents on these key layouts have now expired (though the design of the key actuation and related mechanisms is subject to Storm patents that may still be in effect).
What that means:
The Storm devices implement an audible-tactile navigation keypad (with 3.5 mm audio jack, tactile keys, USB HID/audio), and Storm’s datasheets explicitly state the products are “licensed under NCR’s design rights,” citing the NCR design patent and EU registration.
This license pertains to NCR’s design IP for the keypad housing/look-and-feel.
The Trace Center’s EZ Access intellectual property, along with the Trace 5-key, 8-key, and linear key layouts, is now patent- and royalty-free.
Storm’s implementations of the 5-key, 8-key, and linear keypads, as well as the NCR-design-based keypads, can be and often are used to implement EZ Access-style interfaces (a sketch of wiring such a keypad into a kiosk UI follows the examples below).
Examples you can verify:
Storm Interface product pages and brochures noting “licensed under NCR’s design rights … NCR U.S. Design Patent D687,783 and European Design Registration 001887290.”
Storm’s NavPad listings referencing use in ADA audible menu navigation and EZ Access implementations.
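Because these keypads present to the host over USB HID, a browser-based kiosk can treat them as ordinary keyboard input. The sketch below shows one way to route such input into EZ Access-style actions; the specific key codes are assumptions (they are typically configurable per device), not Storm’s documented defaults.

```typescript
// Minimal sketch, assuming the keypad enumerates as a standard USB HID
// keyboard and has been configured to send these key codes. The mapping
// and handler names are invented for illustration.

type NavAction = "prev" | "next" | "select" | "help" | "back";

const keyToAction: Record<string, NavAction> = {
  ArrowUp: "prev",
  ArrowDown: "next",
  Enter: "select",
  F1: "help",
  Escape: "back",
};

function handleNavAction(action: NavAction): void {
  // Route into the kiosk's focus/navigation model, such as the
  // FocusNavigator sketched earlier in this article.
  console.log(`EZ Access action: ${action}`);
}

document.addEventListener("keydown", (event: KeyboardEvent) => {
  const action = keyToAction[event.key];
  if (action) {
    event.preventDefault(); // keep the keys from reaching the underlying page
    handleNavAction(action);
  }
});
```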
EZ Access is now license free.
Although EZ Access was previously covered by patents, those patents have expired, and EZ Access is now license-free. It has been adopted by United Airlines, Amtrak, the U.S. Postal Service, the Department of Homeland Security, the U.S. Park System, the Phoenix Sky Harbor paging and information system, the University of Wisconsin-Madison, the Smithsonian, the WWII Memorial, and more.
More Accessibility Impact by Vanderheiden
“Rethinking Our Approach to Accessibility in the Era of Rapidly Emerging Technologies” (HCII 2024).
The paper argues that current accessibility methods won’t scale for new tech, and proposes a complementary, AI-driven “Info-bot + individualized UI generators” approach to deliver near-ubiquitous access without relying on manufacturers to build accessibility in.
Why Change is Needed
Digital access is now essential for healthcare, education, work, and daily life, but only a small share of products and websites are accessible, and progress is slowing.
Even when guidelines like WCAG are met, many users—especially those with cognitive, language, or learning disabilities, or with multiple disabilities—remain excluded; mobile apps and XR present additional barriers.
Limits of today’s approach
The status quo depends on manufacturers to build accessibility and/or support assistive tech via APIs, which many do not or cannot do well; “closed” products block assistive tech entirely.
Built-in features typically address some disabilities and require higher technical skill, leaving many users unserved.
Proposed new approach
Info-bot: An open-source, privacy-preserving AI agent that perceives and operates any standard user interface as a typical user would, requiring no cooperation from manufacturers.
Individual User Interface Generators (IUIGs): Per-user interface layers that transform the Info-bot’s understanding into a tailored interface suited to each person’s abilities and preferences.
Together, they aim to provide near-universal compatibility across devices, stable and consistent experiences across brands, adaptability over time, and independence from vendor APIs.
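To make the proposed division of labor concrete, here is a purely illustrative sketch of the two layers as TypeScript interfaces; every name and shape is invented, since no Info-bot or IUIG implementation exists yet.

```typescript
// Purely illustrative sketch of the paper's proposed layering.
// None of these types come from a real implementation.

// What the Info-bot extracts from a standard UI it can see and operate.
interface InterfaceModel {
  purpose: string;                               // what the screen is for
  elements: { label: string; action: string }[]; // operable items it found
}

// The Info-bot: perceives any standard UI and operates it on the user's
// behalf, with no cooperation needed from the manufacturer.
interface InfoBot {
  perceive(screenPixels: ImageData): Promise<InterfaceModel>;
  operate(action: string): Promise<void>;
}

// A person's abilities and preferences, kept local for privacy.
interface UserProfile {
  modalities: ("speech" | "largeText" | "switch" | "simplifiedLanguage")[];
}

// An Individual User Interface Generator: turns the Info-bot's model into
// an interface tuned to one person.
interface IUIG {
  present(model: InterfaceModel, profile: UserProfile): void;
}
```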
Benefits highlighted
For users: Consistent interfaces across devices, less learning curve, adaptability for changing needs, and better support for cognitive/neurodiverse users.
For industry: Reduced burden and litigation risk, a safety net for closed products, and broader market reach without deep in-house accessibility expertise.
For government/society: Simpler regulatory posture focused on outcomes, fewer lawsuits, and expanded participation by people with disabilities.
Open questions and risks
Technical feasibility and timeline: Core capabilities exist in part, but full “interface understanding” and IUIG breadth will take time.
Privacy: Early cloud implementations risk data leakage; local, on-device solutions are needed.
Equity: IUIG availability and affordability must be ensured to avoid replicating current assistive tech gaps.
Policy and market disruption: Standards may need outcome-based updates; existing accessibility businesses may resist major shifts.
How to proceed
Use the Info-bot concept throughout the lifecycle: design reviews, pre-release repairs, browser-level delivery, API population, and a runtime “socket” for IUIGs.
Pursue an incremental path that augments, not replaces, current accessibility until the new approach is mature, with continued community acceptance and safeguards.
HCII – Rethinking Our Approach to Accessibility in the Era of Rapidly Emerging Technologies
Addendum: Home Depot
Home Depot reached a class-action settlement over claims that its in-store payment terminals (including self-checkout and cash-back functions) were not accessible to blind or visually impaired customers, as required under the Americans with Disabilities Act (ADA).
The lawsuit alleged the terminals lacked audio output and tactile interfaces needed for independent access.
Home Depot agreed to update or replace software on at least one accessible payment terminal in each U.S. store to provide audio readouts and tactile support, plus manager training to improve accessibility. The settlement did not require claim filing; improvements will be made as part of the resolution.
Top Class Actions
Facial Recognition/Biometric Privacy Lawsuit
Separately, Home Depot was sued by a customer in Illinois over alleged unauthorized use of facial recognition technology at self-checkout kiosks:
The plaintiff claimed Home Depot’s kiosks were scanning and collecting facial geometry without consent, violating the Illinois Biometric Information Privacy Act (BIPA).
BIPA requires businesses to notify customers and obtain written consent before collecting biometric data — which Home Depot was accused of failing to do.
That case was voluntarily dismissed by the plaintiff in late 2025, without prejudice, meaning it could potentially be refiled or amended. (Bloomberg Law)
Distinctions
✔ The ADA accessibility issue is real and active, involving accessibility of POS & self-checkout interfaces for individuals with visual impairments.
✖ The facial recognition lawsuit was about privacy/biometric data, not ADA accessibility per se, and that specific suit has since been dismissed. (Bloomberg Law)
