Overview
Today's human-computer interfaces are typically designed based on several assumptions: 1) that they are going to be used by an able-bodied individual, 2) who is using a typical set of input and output devices, and 3) who is sitting in a stable, warm environment. Any deviation from these assumptions, be it a hand tremor due to aging, use of an eye tracker, or the jostling caused by riding on a bus, may drastically hamper the person's effectiveness---not due to any inherent barrier to interaction, but because of a mismatch between the user's effective abilities and the assumptions underlying the interface design.
In contrast to current design practice, we advance our vision of ability-based user interfaces: we believe that user interfaces should be designed with each person's unique abilities, devices, and environment in mind. Because there are myriad distinct individuals, each with his or her own abilities, devices, and environments, many such personalized interfaces are needed.
To address the scalability challenge, we have developed SUPPLE, a system for automatically generating user interfaces adapted to a person's abilities and preferences. The results of our study with 11 participants with motor impairments demonstrate that such automatically generated ability-based interfaces significantly improve both performance and satisfaction compared to the default user interfaces shipped with today's software.
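To make the idea concrete, here is a minimal sketch of the optimization view behind this kind of automatic generation: choose a concrete widget for each abstract interface element so that the total estimated manipulation cost for a particular user is minimized within a screen-space budget. The candidate widgets, their costs and sizes, and the brute-force search are all illustrative assumptions, not SUPPLE's actual model or algorithm.

```python
# Sketch: cost-minimizing widget selection under a space budget.
# All widget options, cost estimates, and sizes below are hypothetical.

from itertools import product

# Each abstract element maps to candidate widgets: (name, est_cost, height_px)
CANDIDATES = {
    "volume": [("slider", 1.2, 30), ("spin_buttons", 2.8, 20)],
    "channel": [("radio_list", 1.5, 90), ("dropdown", 2.4, 25)],
}

def generate(candidates, max_height):
    """Brute-force search over widget assignments (a real generator
    would use a smarter search); returns the cheapest layout that fits."""
    best, best_cost = None, float("inf")
    for combo in product(*candidates.values()):
        height = sum(w[2] for w in combo)
        cost = sum(w[1] for w in combo)
        if height <= max_height and cost < best_cost:
            best, best_cost = combo, cost
    if best is None:
        raise ValueError("no layout fits the space budget")
    return dict(zip(candidates, (w[0] for w in best))), best_cost

layout, cost = generate(CANDIDATES, max_height=120)
print(layout, cost)  # {'volume': 'slider', 'channel': 'radio_list'} 2.7
```

In a full generator, the per-widget cost estimates would be derived from the individual user's measured abilities, which is what ties interface generation to the ability modeling described below.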
Projects
Several specific projects are part of our effort to develop personalized ability-based user interfaces.
- SUPPLE is a system for automatically generating user interfaces adapted to people's abilities, tasks, preferences, and devices. In this project, we use SUPPLE to overcome the scalability challenge of delivering personalized interfaces to a large number of individuals with unique abilities and devices.
- In order to generate personalized ability-based interfaces, SUPPLE needs a model of each person's actual abilities. Until now, we have built such models by engaging the user in an explicit diagnostic task. While the task needs to be performed only once (unless the user's abilities change), it can take a substantial amount of time and effort to complete. We are currently working on unobtrusively modeling users' motor abilities by observing their natural interactions with the computer (see the first sketch after this list). We believe that such unobtrusive modeling will make personalized ability-based interfaces more practical and will also enable re-adaptation as a person's abilities change over time.
- Controlling Complex Applications with a Brain-Computer Interface: In search of interaction methods for severely paralyzed users, we are starting a project to explore the properties and limitations of one particularly promising brain-computer interaction paradigm as an input modality, and to develop methods and tools for designing user interfaces for complex brain-controlled applications. Most of the prior effort in brain-computer interface research has been directed at developing better sensors and better ways of extracting useful information from the brain signal, while little work has systematically examined the unique strengths and limitations of this input modality and their implications for interaction design. We believe that these properties need to be well understood for brain-computer interface technology to significantly impact the quality of life of users with severely limited mobility.
- Crossing-based User Interfaces: We have contributed to a study of area pointing and goal crossing for people with and without motor impairments. In goal crossing, users do not acquire a confined area but instead pass over a target line. Although goal crossing has been studied for able-bodied users, its suitability for people with motor impairments was unknown. In our study, participants with motor impairments were faster with goal crossing than with area pointing, and preferred it (see the second sketch after this list). This work provides the empirical foundation from which to pursue the design of crossing-based interfaces as accessible alternatives to pointing-based interfaces.
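The first sketch below illustrates one way the unobtrusive ability modeling described above could work: fitting a user's personal Fitts' law parameters (MT = a + b·log2(D/W + 1)) by least squares over pointing movements logged during ordinary computer use. The logged samples are made up, and the use of Fitts' law as the ability model is an assumption for illustration; a deployed system would also need to segment and clean real interaction traces.

```python
# Sketch: fit personalized Fitts' law parameters from observed pointing.
# The observations below are fabricated illustrations, not real logs.

import math

# (distance_px, target_width_px, movement_time_s) from observed clicks
observations = [(420, 24, 1.9), (150, 48, 0.9), (600, 16, 2.6), (300, 32, 1.4)]

# Build (index_of_difficulty, time) pairs and fit a line analytically.
ids = [math.log2(d / w + 1) for d, w, _ in observations]
ts = [t for _, _, t in observations]
n = len(ids)
mean_id, mean_t = sum(ids) / n, sum(ts) / n
b = (sum((x - mean_id) * (y - mean_t) for x, y in zip(ids, ts))
     / sum((x - mean_id) ** 2 for x in ids))
a = mean_t - b * mean_id
print(f"a={a:.2f}s, b={b:.2f}s/bit")  # this user's fitted parameters
```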
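The second sketch concerns the pointing-versus-crossing comparison: both techniques are commonly modeled with Fitts-style laws, with coefficients fit per technique and per user. The coefficients below are hypothetical stand-ins for values one would estimate from a participant's trials of each technique.

```python
# Sketch: comparing predicted times for area pointing vs. goal crossing
# under Fitts-style models. The (a, b) coefficients are hypothetical.

import math

def predicted_time(a, b, distance, width):
    """Fitts-style prediction, T = a + b * log2(D/W + 1)."""
    return a + b * math.log2(distance / width + 1)

POINTING = (0.4, 0.55)   # hypothetical (a, b) fit for area pointing
CROSSING = (0.3, 0.40)   # hypothetical (a, b) fit for goal crossing

for d, w in [(200, 20), (500, 10)]:
    tp = predicted_time(*POINTING, d, w)
    tc = predicted_time(*CROSSING, d, w)
    print(f"D={d} W={w}: pointing {tp:.2f}s vs crossing {tc:.2f}s")
```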