ANDREW TOSHIAKI NAKAYAMA KURAUCHI
Search Results
Now showing 1 - 2 of 2
- HGaze Typing: Head-Gesture Assisted Gaze Typing (2021)
  Feng, Wenxin; Zou, Jiangnan; Kurauchi, Andrew Toshiaki Nakayama; Morimoto, Carlos; Betke, Margrit
  This paper introduces a bi-modal typing interface, HGaze Typing, which combines the simplicity of head gestures with the speed of gaze inputs to provide efficient and comfortable dwell-free text entry. HGaze Typing uses gaze path information to compute candidate words and allows explicit activation of common text entry commands, such as selection, deletion, and revision, by using head gestures (nodding, shaking, and tilting). By adding a head-based input channel, HGaze Typing reduces the size of the screen regions for cancel/deletion buttons and the word candidate list, which are required by most eye-typing interfaces. A user study finds HGaze Typing outperforms a dwell-time-based keyboard in efficacy and user satisfaction. The results demonstrate that the proposed method of integrating gaze and head-movement inputs can serve as an effective interface for text entry and is robust to unintended selections.
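  For illustration, the sketch below shows one way gaze path information could be turned into candidate words on a virtual keyboard: the recorded gaze trace and each word's ideal path through its key centers are resampled and compared point by point. The keyboard grid, lexicon, trace, and scoring are illustrative assumptions, not the algorithm described in the paper.

```python
# Minimal sketch of gaze-path-to-word matching for a dwell-free keyboard.
# All names, the key layout, and the scoring rule are illustrative assumptions.
import math

# Hypothetical key centers on a QWERTY-like grid; (x, y) = (column, row).
KEY_CENTERS = {
    ch: (col, row)
    for row, keys in enumerate(["qwertyuiop", "asdfghjkl", "zxcvbnm"])
    for col, ch in enumerate(keys)
}

def resample(points, n=32):
    """Resample a polyline to n points spaced evenly along its length."""
    if len(points) == 1:
        return points * n
    dists = [0.0]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dists.append(dists[-1] + math.hypot(x1 - x0, y1 - y0))
    total = dists[-1] or 1.0
    out, j = [], 0
    for i in range(n):
        target = total * i / (n - 1)
        while j < len(dists) - 2 and dists[j + 1] < target:
            j += 1
        seg = dists[j + 1] - dists[j] or 1.0
        t = (target - dists[j]) / seg
        (x0, y0), (x1, y1) = points[j], points[j + 1]
        out.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    return out

def ideal_path(word):
    """Polyline through the centers of the word's keys."""
    return [KEY_CENTERS[c] for c in word]

def score(gaze_path, word, n=32):
    """Mean point-wise distance between resampled gaze and ideal paths (lower is better)."""
    g = resample(gaze_path, n)
    w = resample(ideal_path(word), n)
    return sum(math.hypot(gx - wx, gy - wy) for (gx, gy), (wx, wy) in zip(g, w)) / n

def candidates(gaze_path, lexicon, k=5):
    """Return the k lexicon words whose ideal paths best match the gaze path."""
    return sorted(lexicon, key=lambda w: score(gaze_path, w))[:k]

if __name__ == "__main__":
    lexicon = ["hello", "help", "gaze", "head", "hazel"]
    # A noisy gaze trace roughly passing over h-e-l-l-o on the grid above.
    trace = [(5.2, 1.1), (2.1, 0.2), (8.1, 1.0), (7.9, 1.1), (8.1, 0.2)]
    print(candidates(trace, lexicon))
```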
- Conference Paper: EyeSwipe: Dwell-free Text Entry Using Gaze Paths (2016)
  Kurauchi, Andrew Toshiaki Nakayama; Feng, Wenxin; Joshi, Ajjen; Morimoto, Carlos; Betke, Margrit
  Text entry using gaze-based interaction is a vital communication tool for people with motor impairments. Most solutions require the user to fixate on a key for a given dwell time to select it, thus limiting the typing speed. In this paper we introduce EyeSwipe, a dwell-time-free gaze-typing method. With EyeSwipe, the user gaze-types the first and last characters of a word using the novel selection mechanism “reverse crossing.” To gaze-type the characters in the middle of the word, the user only needs to glance at the vicinity of the respective keys. We compared the performance of EyeSwipe with that of a dwell-time-based virtual keyboard. EyeSwipe afforded statistically significantly higher typing rates and more comfortable interaction in experiments with ten participants, who reached 11.7 words per minute (wpm) after 30 minutes of typing with EyeSwipe.
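  Along the same lines, the sketch below illustrates the general idea of anchoring a word by explicitly selected first and last letters and inferring the middle letters from how close their keys lie to the gaze path. The layout, lexicon, and nearest-point cost are assumptions for illustration, not EyeSwipe's actual ranking.

```python
# Minimal sketch: filter by explicitly selected first/last letters, then rank
# remaining candidates by gaze proximity to their middle keys.
# Layout, lexicon, and cost function are illustrative assumptions.
import math

# Hypothetical key centers on a QWERTY-like grid; (x, y) = (column, row).
KEY_CENTERS = {
    ch: (col, row)
    for row, keys in enumerate(["qwertyuiop", "asdfghjkl", "zxcvbnm"])
    for col, ch in enumerate(keys)
}

def min_dist_to_path(key, gaze_path):
    """Smallest distance from a key's center to any sampled gaze point."""
    kx, ky = KEY_CENTERS[key]
    return min(math.hypot(kx - x, ky - y) for x, y in gaze_path)

def rank_candidates(first, last, gaze_path, lexicon, k=5):
    """Keep words that start and end with the explicitly selected letters,
    then rank them by how close their middle keys lie to the gaze path."""
    matches = [w for w in lexicon if w[0] == first and w[-1] == last]

    def cost(word):
        middle = word[1:-1]
        if not middle:
            return 0.0
        return sum(min_dist_to_path(c, gaze_path) for c in middle) / len(middle)

    return sorted(matches, key=cost)[:k]

if __name__ == "__main__":
    lexicon = ["great", "giant", "grant", "gamut", "go"]
    # Gaze samples that pass near the r, e, and a keys on the grid above.
    trace = [(3.9, 0.1), (3.1, 0.4), (0.2, 1.1)]
    print(rank_candidates("g", "t", trace, lexicon))
```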