HGaze Typing: Head-Gesture Assisted Gaze Typing
Date
2021-05-25
Authors
Feng, Wenxin
Zou, Jiangnan
Kurauchi, Andrew
Morimoto, Carlos H.
Betke, Margrit
Version
Published version
Citation
W. Feng, J. Zou, A. Kurauchi, C. H. Morimoto, and M. Betke. 2021. "HGaze Typing: Head-Gesture Assisted Gaze Typing." In ETRA '21: 2021 ACM Symposium on Eye Tracking Research and Applications. https://doi.org/10.1145/3448017.3457379
Abstract
This paper introduces a bi-modal typing interface, HGaze Typing, which combines the simplicity of head gestures with the speed of gaze inputs to provide efficient and comfortable dwell-free text entry. HGaze Typing uses gaze path information to compute candidate words and allows explicit activation of common text entry commands, such as selection, deletion, and revision, via head gestures (nodding, shaking, and tilting). By adding a head-based input channel, HGaze Typing reduces the size of the screen regions for cancel/deletion buttons and the word candidate list, which are required by most eye-typing interfaces. A user study found that HGaze Typing outperforms a dwell-time-based keyboard in efficacy and user satisfaction. The results demonstrate that the proposed method of integrating gaze and head-movement inputs can serve as an effective interface for text entry and is robust to unintended selections.
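To make the head-gesture channel concrete, below is a minimal sketch of how the three gestures named in the abstract (nodding, shaking, tilting) might be distinguished from a stream of head-pose angles. It assumes head pose is available as (yaw, pitch, roll) in degrees; the thresholds and the oscillation-counting heuristic are illustrative assumptions, not the detection method used in the paper.

# Illustrative sketch only: a naive classifier for the three head gestures
# named in the abstract. Thresholds and the oscillation heuristic are
# assumptions for illustration, not the paper's detection method.

from typing import List, Tuple

Pose = Tuple[float, float, float]  # (yaw, pitch, roll) in degrees

def _oscillations(values: List[float], amplitude: float) -> int:
    """Count sign flips among deviations from the mean that exceed
    `amplitude` -- a crude proxy for back-and-forth head motion."""
    if not values:
        return 0
    mean = sum(values) / len(values)
    big = [v - mean for v in values if abs(v - mean) > amplitude]
    return sum(1 for a, b in zip(big, big[1:]) if a * b < 0)

def classify_gesture(window: List[Pose],
                     osc_amp: float = 8.0,
                     tilt_amp: float = 15.0) -> str:
    """Label a short window of head poses as 'nod' (pitch oscillation),
    'shake' (yaw oscillation), 'tilt' (sustained roll excursion), or 'none'."""
    yaw = [p[0] for p in window]
    pitch = [p[1] for p in window]
    roll = [p[2] for p in window]
    if _oscillations(pitch, osc_amp) >= 2:
        return "nod"
    if _oscillations(yaw, osc_amp) >= 2:
        return "shake"
    if roll and max(roll) - min(roll) > tilt_amp:
        return "tilt"
    return "none"

# Example: a nod -- pitch swings down/up/down while yaw and roll stay flat.
nod = [(0.0, p, 0.0) for p in (0, -12, 0, 12, 0, -12, 0)]
print(classify_gesture(nod))  # -> 'nod'

In a typing interface like the one described, the classifier's output would be mapped to commands (e.g., nod to select a candidate word, shake to cancel or delete), which is what lets the on-screen cancel/deletion buttons and candidate list shrink.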
License
© 2021 Copyright held by the owner/author(s). This work is licensed under a Creative Commons Attribution 4.0 International License.