Application of Eye Tracking for People with Disabilities
Abstract
In this work, we propose an eye tracking system that emulates an operating system in which
eye gaze is the primary mode of user interaction. Such a system, which once calibrated
requires no limb function for input, can serve as an assistive technology for people
with limb function disabilities. Not only does the application use eye movement as the primary
source of input, but it can also trigger certain events based on the facial expressions of the
user. The proof-of-concept software is built as a web application in HTML, CSS, and JavaScript
due to the wide availability and ease of use of these technologies. In the future, however, such a system could be fully built
as operating-system-specific software for native use. The core of the software uses
WebGazer.js and clmtrackr.js for detecting and predicting the gaze of the user. Once such
a system is built, it can be used not only on desktops but also in Virtual Reality (VR) and
Augmented Reality (AR) applications.
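As a minimal sketch of the gaze-as-input idea described above, predicted gaze coordinates can be mapped to on-screen regions that act as interaction targets. The quadrant layout, the `gazeToQuadrant` helper, and the `highlightRegion` callback below are illustrative assumptions, not part of the described system; the commented hookup assumes WebGazer.js's documented `setGazeListener` API.

```javascript
// Map a predicted gaze point (x, y) to one of four screen quadrants.
// Kept as a pure function so the mapping logic can run outside the browser.
function gazeToQuadrant(x, y, width, height) {
  const col = x < width / 2 ? "left" : "right";
  const row = y < height / 2 ? "top" : "bottom";
  return `${row}-${col}`;
}

// In the browser, this helper would be wired to WebGazer.js roughly as:
//
//   webgazer.setGazeListener((data, elapsedMs) => {
//     if (data === null) return; // no gaze prediction for this frame
//     const region = gazeToQuadrant(data.x, data.y,
//                                   window.innerWidth, window.innerHeight);
//     highlightRegion(region);   // hypothetical UI callback
//   }).begin();
```

A real assistive interface would likely add dwell-time detection (holding gaze on a region for some duration to confirm selection), since raw gaze predictions are noisy.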