We propose a classification technique for gaze-written numbers as a hands-free interface. Since gaze-writing is less accurate than virtual keyboard typing, we apply a Convolutional Neural Network (CNN) to recognize the gaze-writing and improve classification accuracy. In addition, we create a new gaze-writing training dataset, gaze MNIST (gMNIST), by modifying the MNIST data with features of gaze movement patterns. For the evaluation, we compare our approach with basic CNN structures trained on the original MNIST dataset. Our study adds an option for input interfaces and expands the choices available in hands-free environments. A minimal classifier sketch follows the citations below.
Paper: Yoo, S., Jeong, D. K., & Jang, Y. (2019). The Study of a Classification Technique for Numeric Gaze-Writing Entry in Hands-Free Interface. IEEE Access, 7, 49125-49134.
Patent: Jang, Y., Yoo, S., & Jeong, D. K., "Apparatus for analyzing user input data based on gaze tracking and method thereof", Korean Patent 10-1987227, June 3, 2019.
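For illustration, here is a minimal sketch of how such a digit classifier could be set up, assuming a LeNet-style CNN in PyTorch and gMNIST batches stored in the standard MNIST tensor format (28x28 grayscale images). The architecture, hyperparameters, and the `train_step` helper are illustrative assumptions, not the configuration reported in the paper.

```python
# Minimal sketch of a LeNet-style CNN for 28x28 gaze-written digits.
# The exact gMNIST format and the paper's architecture are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GazeDigitCNN(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 32, kernel_size=3, padding=1)   # 28x28 -> 28x28
        self.conv2 = nn.Conv2d(32, 64, kernel_size=3, padding=1)  # 14x14 -> 14x14
        self.fc1 = nn.Linear(64 * 7 * 7, 128)
        self.fc2 = nn.Linear(128, num_classes)

    def forward(self, x):
        x = F.max_pool2d(F.relu(self.conv1(x)), 2)  # -> 32 x 14 x 14
        x = F.max_pool2d(F.relu(self.conv2(x)), 2)  # -> 64 x 7 x 7
        x = torch.flatten(x, 1)
        x = F.relu(self.fc1(x))
        return self.fc2(x)  # class logits for digits 0-9

model = GazeDigitCNN()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

def train_step(images: torch.Tensor, labels: torch.Tensor) -> float:
    """One hypothetical training step on batches shaped (N, 1, 28, 28)."""
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```

The same loop could train one model on gMNIST and a baseline on the original MNIST, mirroring the comparison described above.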
We propose a novel visualization that applies the smudge technique to the attention map. The proposed visualization intuitively shows an observer's gaze flow and Areas of Interest (AoIs). In addition, it conveys fixation, saccade, and micro-movement information, which allows a single visualization to address various analytical goals. Finally, we present two case studies that demonstrate the effectiveness of our technique. A simplified rendering sketch follows the citations below.
Paper: Yoo, S., Jeong, S., Kim, S., & Jang, Y. (2019). Gaze Attention and Flow Visualization using the Smudge Effect. The 27th International Conference on Computer Graphics and Applications (Pacific Graphics 2019), Short paper.
Patent: Jang, Y., Yoo, S., Kim, S. Y., & Jeong, D. K., "Method and apparatus for analyzing saliency-based visual stimulus and gaze data", Korean Patent 10-1987229, June 3, 2019.
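As a rough illustration of the idea, the sketch below splats Gaussian attention along interpolated gaze segments with fading weights, so AoIs appear as bright blobs and the gaze flow as directional trails. The `smudge_map` function, kernel size, step count, and sample data are hypothetical simplifications, not the paper's smudge algorithm.

```python
# Simple sketch of a smudge-style gaze map: Gaussian attention splats
# at each gaze sample, smeared along the direction of motion so the
# map hints at both AoIs (bright blobs) and gaze flow (trails).
import numpy as np
import matplotlib.pyplot as plt

def smudge_map(gaze_xy: np.ndarray, shape=(480, 640),
               sigma: float = 12.0, steps: int = 20) -> np.ndarray:
    h, w = shape
    acc = np.zeros(shape)
    ys, xs = np.mgrid[0:h, 0:w]
    for (x0, y0), (x1, y1) in zip(gaze_xy[:-1], gaze_xy[1:]):
        # Splat small Gaussians along each segment between consecutive
        # samples; weights fading toward the older end leave a
        # directional "smudge" trail that encodes the gaze direction.
        for t, wgt in zip(np.linspace(0.0, 1.0, steps),
                          np.linspace(0.2, 1.0, steps)):
            cx, cy = x0 + t * (x1 - x0), y0 + t * (y1 - y0)
            acc += wgt * np.exp(-((xs - cx) ** 2 + (ys - cy) ** 2)
                                / (2.0 * sigma ** 2))
    return acc / acc.max()

# Hypothetical gaze samples in pixel coordinates (x, y).
gaze = np.array([[100, 100], [180, 120], [320, 240], [500, 260]], float)
plt.imshow(smudge_map(gaze), cmap="hot")
plt.title("Smudge-style gaze attention and flow (sketch)")
plt.show()
```

In this toy version, slow movements and fixations accumulate more overlapping splats and thus brighter regions, while fast saccades leave fainter trails, loosely echoing the fixation/saccade distinction described above.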