Collection of Works Presented at Events
Browsing the Collection of Works Presented at Events by Author "ANDREW TOSHIAKI NAKAYAMA KURAUCHI"
Now showing 1 - 3 of 3
Conference Paper
Active learning approaches applied in teaching agile methodologies (2023)
GRAZIELA SIMONE TONIN; FABIO ROBERTO DE MIRANDA; ANDREW TOSHIAKI NAKAYAMA KURAUCHI; Montagner, Igor; Agena, Barbara; Barth, Fabrício J.
We need to modernize education to form adaptable leaders who can tackle evolving challenges in our dynamic world. Insper's computer science program is designed to reflect this need, with an innovative infrastructure, curriculum, and industry partnerships. We use active learning approaches to teach agile methodologies and to develop the soft skills needed to solve real-world problems. Our focus is on non-violent communication, feedback techniques, and teamwork, along with constant interaction with industry professionals who share their experiences with students. Our goal is to provide students with a well-rounded education that equips them for success in the digital age. This work-in-progress research project describes our teaching approach and our objective of preparing students for the future, in the context of an innovative first-semester experience in a CS program.

Conference Paper
EyeSwipe: Dwell-free Text Entry Using Gaze Paths (2016)
ANDREW TOSHIAKI NAKAYAMA KURAUCHI; Feng, Wenxin; Joshi, Ajjen; Morimoto, Carlos; Betke, Margrit
Text entry using gaze-based interaction is a vital communication tool for people with motor impairments. Most solutions require the user to fixate on a key for a given dwell time to select it, thus limiting the typing speed. In this paper we introduce EyeSwipe, a dwell-time-free gaze-typing method. With EyeSwipe, the user gaze-types the first and last characters of a word using the novel selection mechanism “reverse crossing.” To gaze-type the characters in the middle of the word, the user only needs to glance at the vicinity of the respective keys. We compared the performance of EyeSwipe with that of a dwell-time-based virtual keyboard. In experiments with ten participants, EyeSwipe afforded statistically significantly higher typing rates and more comfortable interaction; participants reached 11.7 words per minute (wpm) after 30 minutes of typing with EyeSwipe.

Conference Paper
HGaze Typing: Head-Gesture Assisted Gaze Typing (2021)
Feng, Wenxin; Zou, Jiangnan; ANDREW TOSHIAKI NAKAYAMA KURAUCHI; Morimoto, Carlos; Betke, Margrit
This paper introduces a bi-modal typing interface, HGaze Typing, which combines the simplicity of head gestures with the speed of gaze inputs to provide efficient and comfortable dwell-free text entry. HGaze Typing uses gaze path information to compute candidate words and allows explicit activation of common text entry commands, such as selection, deletion, and revision, by using head gestures (nodding, shaking, and tilting). By adding a head-based input channel, HGaze Typing reduces the size of the screen regions needed for cancel/deletion buttons and the word candidate list, which are required by most eye-typing interfaces. A user study found that HGaze Typing outperforms a dwell-time-based keyboard in efficacy and user satisfaction. The results demonstrate that the proposed method of integrating gaze and head-movement inputs can serve as an effective interface for text entry and is robust to unintended selections.