Abstract

This document aims to outline user needs and requirements for people with disabilities and users of Assistive Technologies when using Immersive, Augmented and Mixed Reality environments.

Introduction

This document is developed as part of a discovery into accessibility related user needs and requirements for XR. This document does not represent a formal working group position, nor does it currently represent a set of technical requirements that a developer or designer must strictly follow. It aims to outline for the reader some of the diversity of current accessibility related user needs in XR and what potential requirements to meet those needs may be.

What does the term 'XR' mean?

As with the WebXR API spec, this document uses the acronym XR to refer to the spectrum of hardware, applications, and techniques used for Virtual Reality, Augmented Reality, and other related technologies. Examples include, but are not limited to:

The important commonality between them is that they all offer some degree of spatial tracking with which to simulate a view of virtual content, as well as navigation and interaction with the objects within these environments.
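For example, on the web platform the WebXR Device API allows an application to discover which of these modes the current device supports before starting a session. The following is a minimal sketch only, assuming WebXR type definitions are available; the optional features requested are illustrative and an application would request only those it actually needs.

```typescript
// Minimal sketch: discover which immersive modes this device supports via
// the WebXR Device API, then request the best available session.
// "inline" is always available as a non-immersive fallback.
async function startBestAvailableSession(): Promise<XRSession | null> {
  if (!navigator.xr) {
    return null; // No WebXR support: present a conventional 2D view instead.
  }
  for (const mode of ["immersive-ar", "immersive-vr"] as XRSessionMode[]) {
    if (await navigator.xr.isSessionSupported(mode)) {
      // Immersive sessions must be requested from a user activation
      // (for example a button press). The features listed are illustrative.
      return navigator.xr.requestSession(mode, {
        optionalFeatures: ["local-floor", "hand-tracking"],
      });
    }
  }
  return navigator.xr.requestSession("inline");
}
```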

Terms like "XR Device", "XR Application", etc. are generally understood to apply to any of the above. Portions of this document that only apply to a subset of these devices will be indicated as appropriate.

Definitions of Virtual Reality

Virtual Reality definitions vary but converge on the notion of immersive computer mediated experiences. They involve interaction with objects, people and environments using a range of controls. These experiences are often multi-sensory and may be used for educational, therapeutic or entertainment purposes.

Definitions of Augmented Reality

Augmented Reality definitions vary but converge on the notion of computer mediated interactions involving overlays on the real world. These may be informational, or interactive depending on the application.

What is XR used for?

XR has an extensive range of purposes, from education, gaming, multimedia and immersive communication to many others. It is currently evolving at a very fast rate and is not yet mainstream. This will change: as computing power increases and hardware and the quality of the user experience improve, XR will more commonly be used for the performance of everyday practical tasks, for therapeutic uses, for education and for entertainment.

Understanding XR and accessibility challenges

Understanding XR, Mixed Reality and so on presents various challenges that are very technical. They include issues with hardware, software, interaction design, design principles, semantics and more. These 'basic' technical complexities are substantial in themselves. To add to this, many designers and authors may neither know nor have access to people with disabilities, or to a way of understanding user needs from which they can build a solid set of requirements. In short, they may simply not understand what user needs they are trying to meet when making XR accessible.

Some of the issues in XR, for example in gaming, for people with disabilities include:

There are also a range of other disabilities that will need to be considered in making XR accessible. It is beyond the scope of this document to describe them all in detail. General categories or types of disabilities are:

A person may have one of these disabilities or a combination of several. Each of these 'types' will be presented as a user need that should be met, and understanding these needs is crucial to rising to the range of interesting challenges XR designers and authors will face when supporting accessibility and multimodality in XR environments.

These may be:

Immersive Environment challenges

Some of the many challenges with immersive environment accessibility (and also gaming) include: the use of extremely complex input devices; control schemes that require a high degree of precision, timing and simultaneous action; the ability to distinguish subtle differences in busy visual and audio information; and having to juggle multiple complex goals and objectives [[web-adapt]].

There are also currently very useful accessibility guidelines available that are specific to gaming [[game-a11y]].

XR and Supporting Multimodality

Modality relates to modes of sense perception such as sight, hearing, touch and so on. Accessibility can be thought of as supporting multi-modal requirements: the transformation of content, or aspects of a user interface, from one mode to another in ways that support various user needs.

Considering various modality requirements in the foundation of XR means these platforms will be better able to support accessibility related user needs. There will be many modality aspects for the developer and/or content author to consider:

NOTE: XR authors and content designers will also need access to tools that support the multimodal requirements listed below.

The following Inputs and Outputs can be considered modalities that should be supported in XR environments.

Various Input Modalities

The following are examples of some of the diverse input methods used by people with disabilities. NOTE: In many real world applications these input methods may be combined.
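As an informal illustration, a web-based XR application can feature-detect several alternative input channels and, within an active WebXR session, observe which input sources (controllers, hands, gaze) are currently connected. This is a hedged sketch only; the shape of the returned object is an assumption, not part of any specification.

```typescript
// Sketch: detect some alternative input channels so the application can
// offer them alongside, or instead of, tracked XR controllers.
function detectInputModalities() {
  const hasSpeech =
    "SpeechRecognition" in window || "webkitSpeechRecognition" in window;
  const hasGamepad = "getGamepads" in navigator; // includes many switch devices
  const hasFinePointer = window.matchMedia("(pointer: fine)").matches;
  const hasKeyboard = true; // assume keyboard or keyboard-emulating switch access
  return { hasSpeech, hasGamepad, hasFinePointer, hasKeyboard };
}

// Within an active XR session, input sources (controllers, hands, gaze)
// can be enumerated and monitored as they are added or removed.
function watchXRInputs(session: XRSession) {
  session.addEventListener("inputsourceschange", () => {
    for (const source of session.inputSources) {
      console.log(source.targetRayMode, source.handedness, source.profiles);
    }
  });
}
```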

Various Output Modalities

The following is a list of outputs that can be available to a user to help them understand, interact with and 'sense' feedback from an XR application. Some of these are in common use on the Web, while others are exploratory (such as olfactory and gustatory).
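As a rough illustration of delivering the same message across more than one output modality on the web platform, the following sketch combines a visual caption, synthesised speech and controller haptics. The caption element id is illustrative, and haptic actuator support varies between browsers.

```typescript
// Sketch: present one message through several output modalities.
function announce(message: string, gamepad?: Gamepad) {
  // Visual: update a caption/subtitle overlay (element id is illustrative).
  const captions = document.getElementById("captions");
  if (captions) captions.textContent = message;

  // Auditory: speak the message using the Web Speech API.
  if ("speechSynthesis" in window) {
    window.speechSynthesis.speak(new SpeechSynthesisUtterance(message));
  }

  // Haptic: a short pulse on a controller that exposes an actuator
  // (not supported in all browsers).
  gamepad?.vibrationActuator?.playEffect("dual-rumble", {
    duration: 200,
    strongMagnitude: 0.5,
    weakMagnitude: 0.5,
  });
}
```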

XR controller challenges

As mentioned, there is a range of input devices that may be used. Supporting these controllers requires an understanding of what they are and how they work. There are a variety of alternative gaming controls that may be very useful in XR environments and applications, for example the Xbox Adaptive Controller.

While XR is the experience, the controller is king: it plays a critical part in overcoming some of this complexity, as well as mediating other challenges around usability and helping the user understand sensory substitution devices.

Controllers such as the Xbox Adaptive Controller and other switch-type inputs allow the user to remap keyboard inputs to control virtual environments. These powerful customisations may allow the user to "do that thing that is difficult for them" with ease. In conjunction with such a controller, for example, users with limited mobility can also simulate actions in the XR environment that they would not be able to perform physically. WalkinVRDriver is a good example of this, where motion range, position and orientation can be set to the user's ability.
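Because devices such as the Xbox Adaptive Controller present themselves to the browser as standard gamepads, a web-based XR application can read them through the Gamepad API. The sketch below is illustrative only; the action names and binding scheme are assumptions, not part of any specification.

```typescript
// Sketch: poll connected gamepads (including adaptive or switch devices
// that appear as standard gamepads) and forward presses to named actions.
type Action = "select" | "move-forward" | "open-menu"; // illustrative names

function pollGamepads(
  bindings: Map<number, Action>,      // button index -> action
  onAction: (action: Action) => void
) {
  for (const pad of navigator.getGamepads()) {
    if (!pad) continue;
    pad.buttons.forEach((button, index) => {
      const action = bindings.get(index);
      if (button.pressed && action) onAction(action);
    });
  }
  requestAnimationFrame(() => pollGamepads(bindings, onAction));
}
```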

Customisation of control inputs

The user should be given the ability to modify their input preferences or use a variety of input devices. The remapping of keys used to control movement or interaction in virtual environments is not currently required by WCAG; it is nevertheless noted in the literature as desirable.
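As a sketch of what such customisation might look like in practice, the following stores user-editable bindings so they persist between sessions. The default bindings and storage key are purely illustrative.

```typescript
// Sketch: user-editable control bindings that can be saved and restored.
interface ControlBindings {
  [action: string]: string; // action name -> key or button identifier
}

const defaultBindings: ControlBindings = {
  "move-forward": "KeyW",
  "select": "Enter",
  "open-menu": "Escape",
};

function loadBindings(): ControlBindings {
  const saved = localStorage.getItem("control-bindings"); // illustrative key
  return { ...defaultBindings, ...(saved ? JSON.parse(saved) : {}) };
}

function remap(action: string, newKey: string): void {
  const bindings = loadBindings();
  bindings[action] = newKey;
  localStorage.setItem("control-bindings", JSON.stringify(bindings));
}
```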

Using multiple diverse inputs simultaneously

A user with a disability may have several input devices. A user may switch their 'mode' of interaction, or the tools used, and should be able to do so without the experience degrading to the point where they lose focus on a task and cannot return to it, or make unforced errors, accidental inputs and so on.

Consistent tracking with multiple inputs

There may be tracking issues when switching input devices. A tracking issue is where the user's focus may be lost, or modified in unpredictable or unwanted ways; this can cause loss of focus and potentially push the user into making unwanted inputs or choices.

Outputs sent to multiple devices will need to be synchronised.
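One way to approach consistent tracking, sketched informally below, is to hold the user's current focus in a single place that is independent of whichever input device last produced an event, so that a device connecting, disconnecting or glitching does not silently move or reset it. The class and method names are illustrative, not a proposed API.

```typescript
// Sketch: device-agnostic focus state. Only explicit focus moves change the
// current target; device changes or tracking glitches do not.
class FocusManager {
  private currentTarget: string | null = null; // illustrative: an object id

  setFocus(targetId: string, source: "controller" | "voice" | "keyboard") {
    this.currentTarget = targetId;
    console.log(`Focus moved to ${targetId} via ${source}`);
  }

  getFocus(): string | null {
    return this.currentTarget;
  }
}
```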

Usability and affordances in XR

An XR application should have a high level of usability for someone with a disability who is using AT. Some challenges in translating interaction models may be:

XR User Needs and Requirements

User Needs definition

This document outlines various accessibility related 'user needs' for XR. These 'user needs' should drive accessibility requirements for XR and its related architecture. These come from people with disabilities who use Assistive Technology (AT) and wish to see the features described available within XR-enabled applications.

User needs and requirements are often dependent on context of use. The following outlines some accessibility user needs and requirements that may be applicable in immersive environments, augmented reality and 360° applications.

The following are neither exhaustive nor definitive, but are presented to help orient the reader towards understanding some broad user needs and how to meet them.

Immersive Semantics and customisation

Motion agnostic interactions

Immersive Personalisation

Interaction and Target customisation

Voice Commands

Colour changes

Magnification Context and resetting

Critical Messaging and Alerts

Gestural interfaces and interactions

Text description transformation

Safe harbour controls

Immersive Time limits

Reset Focus and orientation

Routing to Second Screens

Interaction Speed

Avoiding sickness triggers

Binaural Audio track alternatives

Subtitling customisation

Related Documents

Other documents that relate to this and represent current work in the RQTF/APA are:

  • XR Semantics Module - this document outlines proposed accessibility requirements that may be used in a modular way in Immersive, Augmented or Mixed Reality (XR). A modular approach may help us to define clear accessibility requirements that support XR accessibility user needs, as they relate to the immersive environment, objects, movement, and interaction accessibility. Such a modular approach may help the development of clear semantics, designed to describe specific parts of the immersive ecosystem. In immersive environments it is imperative that the user can understand what objects are and understand their purpose, as well as other qualities and properties including interaction affordance, size, form, shape, and other inherent properties or attributes.
  • WebXR Standards and Accessibility Architecture Issues - this document is informative and aims to outline some of the challenges in understanding the complex technical architecture and processes behind how XR (Virtual, Augmented and Mixed Reality) environments are currently rendered. To make these environments accessible and provide a quality user experience, it is important to also understand the nuances and complexity of accessible user interface design and development for the 2D web. Any attempt to make XR accessible also needs to be based on meeting the practical user needs of people with disabilities (outlined in this document).

Acknowledgements

  • Joshue O Connor - W3C
  • Judy Brewer - W3C
  • Michael Cooper - W3C
  • Janina Sajka - Invited Expert
  • Léonie Watson - Tetralogical
  • Charles LaPierre - Benetech
  • Scott Hollier - Edith Cowan University & Centre For Accessibility
  • Jason White - Associate Research Scientist at Educational Testing Service (ETS)
  • Markku Hakkinen - Digital Accessibility at Educational Testing Service (ETS)
  • Charles Hall - Senior UX Architect MRM McCann
  • Matthew Tylee Atkinson - The Paciello Group
  • Ian Hamilton - Independent
  • Melina Möhlne - IRT
  • This work is supported by the EC-funded WAI-Guide Project