Title: HGaze Typing: Head-Gesture Assisted Gaze Typing
Authors: Feng, Wenxin; Zou, Jiangnan; Kurauchi, Andrew Toshiaki Nakayama; Morimoto, Carlos; Betke, Margrit
Date available: 2024-11-18
Date issued: 2021
URI: https://repositorio.insper.edu.br/handle/11224/7222
Venue: ETRA '21 Full Papers | ACM Symposium on Eye Tracking Research and Applications
DOI: 10.1145/3448017.345737
Format: Digital
Pages: p. 1 - 11
Language: English
Keywords: Text entry; Dwell-free typing; Multi-modal text entry; Eye tracking; Head gestures

Abstract: This paper introduces a bi-modal typing interface, HGaze Typing, which combines the simplicity of head gestures with the speed of gaze input to provide efficient and comfortable dwell-free text entry. HGaze Typing uses gaze path information to compute candidate words and allows explicit activation of common text entry commands, such as selection, deletion, and revision, through head gestures (nodding, shaking, and tilting). By adding a head-based input channel, HGaze Typing reduces the size of the screen regions devoted to cancel/deletion buttons and the word candidate list, which most eye-typing interfaces require. A user study finds that HGaze Typing outperforms a dwell-time-based keyboard in efficacy and user satisfaction. The results demonstrate that the proposed method of integrating gaze and head-movement inputs can serve as an effective interface for text entry and is robust to unintended selections.