This document provides guidance on the design, selection and optimization of non-contacting hand and arm gestures for human-computer interaction. It addresses the assessment of usability and fatigue associated with different gesture set designs and provides recommendations for approaches to evaluating the design and selection of gestures. This document also provides guidance on the documentation of the process for selecting gesture sets.
This document applies to gestures expressed by humans. It does not consider the technology for detecting gestures or the system response when interpreting a gesture. Non-contacting hand gestures can be used for input in a variety of settings, including workplaces and public settings, and when using fixed screens, mobile devices, or virtual reality, augmented reality or mixed reality devices.
Some limitations of this document are:
— The scope is limited to non-contacting gestures and does not include other forms of input. For example, combining gesture with speech, gaze or head position can reduce input error, but these combinations are not considered here.
— The scope is limited to non-contacting arm, hand and finger gestures, either unilateral (one-handed) or bilateral (two-handed).
— The scope assumes that all technological constraints are surmountable. Therefore, there is no consideration of technological limitations in interpreting ultra-rapid gestures, or gestures performed by people of different skin tones or by people wearing different colours or patterns of clothing.
— The scope is limited to UI-based command-and-control human-computer interaction (HCI) tasks and does not include gaming scenarios, although the traversal of in-game menus and navigation of UI elements is within scope.
— The scope does not include HCI tasks for which a clearly better input method exists. For example, speech input is superior to gesture input for entering text.
— The scope includes virtual reality (VR), augmented reality (AR) and mixed reality (MR) and the use of head-mounted displays (HMDs).
— The scope does not include the discoverability of gestures but does include the learnability and memorability of gestures. It is assumed that product documentation and tutorials will adequately educate end users about which gestures are possible. Therefore, assessing gesture discoverability is not a primary goal of the recommendations in this document.
ISO/TS 9241-430:2021 — Status: Published (stage 60.60, Standard published), 2021-12-06