Investigating and Mitigating Privacy Risks of Eye Tracking on Handheld Mobile Devices

dc.contributor.advisor: Khamis, Mohamed
dc.contributor.author: Alsakar, Noora Sami
dc.date.accessioned: 2025-12-24T11:23:02Z
dc.date.issued: 2025
dc.description.abstract: Eye-tracking technology offers significant benefits for mobile device users by enabling hands-free interaction and providing valuable insights into user behavior. However, it also raises serious privacy concerns, as gaze data can reveal sensitive information about individuals, such as gender, age, and geographical origin. This thesis investigates and addresses the privacy risks of eye tracking on handheld mobile devices under two threat models, a context that presents distinct challenges and remains underexplored compared to other eye-tracking platforms. While extensive research has examined eye-tracking privacy risks in controlled settings such as desktop or virtual reality (VR), handheld mobile devices have received little attention. Yet handheld mobile devices, such as smartphones, are widely used in everyday life, making the privacy implications of eye tracking on these platforms critical. Furthermore, the literature lacks evaluations of how effective existing privacy mitigation methods are when applied to gaze data collected via front-facing cameras. In the dataset-attack threat model, an attacker gains unauthorized access to the gaze data, whether raw or aggregated, and, given some basic assumptions, uses a pre-trained privacy-attack AI model to infer sensitive attributes such as the user's age, gender, and nationality. In the model-attack threat model, the adversary gains unauthorized access to the utility AI model (the model developed to perform a useful task, such as predicting a user's IQ performance from eye-tracking data) and performs inference attacks by probing it, potentially reconstructing the original training data and leaking private information.
Leveraging machine learning techniques, this thesis a) explores privacy leakage via handheld mobile eye tracking and b) examines differential privacy (DP), a rigorous, formal mathematical approach that protects private data by adding calibrated noise, as a means of mitigating these risks. The overarching goal is to protect user privacy in mobile eye tracking through rigorous empirical investigation, progressing through three main phases: understanding users' privacy perceptions and subjective acceptance of eye tracking on handheld mobile devices, measuring privacy leakage, and developing privacy-preserving solutions tailored to mobile eye tracking. The thesis begins by exploring user perceptions of privacy in the context of eye tracking on handheld mobile devices. The study reveals a key finding: most users are unaware of the privacy risks associated with these technologies, underscoring the urgent need to integrate privacy-preserving techniques. It also identifies critical factors influencing users' subjective acceptance of mobile eye tracking, such as algorithmic transparency and the credibility of developers. Building on these findings, the second study confirms that sensitive attributes, such as gender, age, nationality, and educational background, can be inferred from processed gaze data collected via handheld mobile devices. This is demonstrated through a novel contribution: the SmartEyePhone dataset, collected using the front-facing camera of an iPhone 14 Pro and the first to associate mobile gaze data with sensitive demographic information.
To address the privacy risks of processed gaze data under the two threat models, the third study evaluates two differential privacy (DP) approaches: dataset perturbation, where noise is injected into the data before training so that an attacker with unauthorized access to the data cannot infer private information, and model perturbation, where noise is added to the model's gradients during training so that an attacker with unauthorized access to the model cannot reconstruct the original gaze data it was trained on. Both methods are shown to effectively reduce privacy risks, though at a cost to utility, such as decreased classification accuracy in tasks like IQ prediction. Additionally, another study introduces a novel privacy-utility tradeoff metric that quantifies both the privacy protection achieved (the reduction in privacy leakage after applying DP mechanisms) and the utility performance loss, guiding the selection of optimal DP parameters in mobile scenarios. A subsequent study focuses on raw gaze data, particularly x,y coordinate streams, to examine their privacy implications. Unlike processed gaze data, which are aggregated into higher-level features such as fixations or saccades, raw gaze data represent the most granular form of eye-tracking input and are directly collected and transmitted by mobile eye-tracking applications. Investigating the privacy risks of raw gaze data is essential to mitigating them at the earliest stage of the gaze data collection pipeline, where sensitive information may first be exposed. The findings of this chapter confirm that even raw gaze data can reveal sensitive information; however, applying appropriate DP mechanisms can mitigate these risks. In the final phase of the thesis, a SelectiveGazeDP approach is proposed, inspired by insights from the earlier studies.
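The dataset-perturbation idea described above can be sketched with the standard Laplace mechanism, which draws noise with scale sensitivity/ε and adds it to each released value. The feature values, ε, and sensitivity below are purely illustrative placeholders, not parameters from the thesis.

```python
import numpy as np

def laplace_perturb(features, epsilon, sensitivity):
    """Standard Laplace mechanism: add noise with scale sensitivity/epsilon
    to every feature value. Smaller epsilon means stronger privacy but
    noisier (less useful) data."""
    scale = sensitivity / epsilon
    noise = np.random.laplace(loc=0.0, scale=scale, size=features.shape)
    return features + noise

# Illustrative aggregated gaze features (e.g., mean fixation durations in ms)
rng = np.random.default_rng(0)
gaze_features = rng.uniform(100, 400, size=(5, 3))
private_features = laplace_perturb(gaze_features, epsilon=1.0, sensitivity=1.0)
```

Releasing `private_features` instead of `gaze_features` is the dataset-perturbation setting: an attacker who obtains the released data sees only the noised values.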
Chapters 4 and 5 showed that while applying DP mechanisms can effectively reduce privacy risks in handheld mobile settings, it often leads to a considerable drop in utility performance. Based on this insight, SelectiveGazeDP employs feature selection techniques to identify the most privacy-revealing processed gaze features and selectively applies DP noise to those features. This targeted application was shown to enhance utility in mobile settings without compromising privacy protection. Overall, this thesis provides a novel and comprehensive investigation of privacy leakage risks in gaze data, both raw and processed, collected via handheld mobile eye trackers. One key contribution is the introduction of the SmartEyePhone dataset, which demonstrates that mobile eye tracking can indeed expose sensitive personal information. Building on these findings, the thesis explores effective privacy-preserving techniques tailored to mobile settings, emphasizing the importance of selecting appropriate differential privacy parameters. Finally, it introduces a novel privacy-utility tradeoff metric that quantifies the tradeoff between privacy protection and utility task performance, offering practical guidance for designing privacy-aware gaze-based applications.
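The selective-noising idea behind SelectiveGazeDP can be sketched as follows. The thesis does not specify the feature-selection method here, so this sketch uses absolute Pearson correlation with the sensitive attribute as a stand-in ranking; the data, ε, sensitivity, and k are all illustrative.

```python
import numpy as np

def selective_gaze_dp(X, sensitive, epsilon, sensitivity, k):
    """Rank the columns of X by |Pearson correlation| with the sensitive
    attribute (an illustrative proxy for 'most privacy-revealing') and add
    Laplace noise only to the top-k columns; the rest pass through unchanged."""
    corrs = np.array([abs(np.corrcoef(X[:, j], sensitive)[0, 1])
                      for j in range(X.shape[1])])
    top_k = np.argsort(corrs)[::-1][:k]   # indices of the most revealing features
    X_priv = X.copy()
    X_priv[:, top_k] += np.random.laplace(
        0.0, sensitivity / epsilon, size=(X.shape[0], k))
    return X_priv, top_k

# Illustrative data: 50 samples, 6 processed gaze features, binary attribute
rng = np.random.default_rng(1)
X = rng.normal(size=(50, 6))
sensitive = rng.integers(0, 2, size=50)
X_priv, noisy_cols = selective_gaze_dp(X, sensitive, epsilon=1.0,
                                       sensitivity=1.0, k=2)
```

Because only the k most revealing columns are perturbed, the remaining features keep their original values, which is the mechanism by which this targeted application preserves utility.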
dc.format.extent: 166
dc.identifier.uri: https://hdl.handle.net/20.500.14154/77697
dc.language.iso: en
dc.publisher: Saudi Digital Library
dc.subject: Eye Tracking
dc.subject: Privacy
dc.subject: Differential privacy
dc.subject: Mobile devices
dc.subject: Smartphones
dc.title: Investigating and Mitigating Privacy Risks of Eye Tracking on Handheld Mobile Devices
dc.type: Thesis
sdl.degree.department: College of Science and Engineering
sdl.degree.discipline: Computing Science
sdl.degree.grantor: University of Glasgow
sdl.degree.name: Doctor of Philosophy

Files

Original bundle
Name: SACM-Dissertation.pdf
Size: 25.8 MB
Format: Adobe Portable Document Format

License bundle
Name: license.txt
Size: 1.61 KB
Format: Item-specific license agreed to upon submission
Copyright owned by the Saudi Digital Library (SDL) © 2026