Is Consent a Sufficient Means for Protecting Facial Biometric Data?
Date
2023-09
Authors
Osailan, HW
Publisher
The University of Sheffield
Abstract
Deleuze and Guattari prophesied in 1987, through their theory of faciality, that an individual is not born with a face but instead slides into one. This prophecy is paralleled in the growing integration of facial recognition technologies (FRT) across various private sector domains such as workplaces, schools, retail, and social media platforms. FRTs aim to enhance efficiency in employee management, boost student focus, and provide personalized experiences for consumers and users. However, the extraction and processing of facial biometric data (FBD) expose individuals to hacking, data scraping, and identity theft. Moreover, facial data extraction erodes the concept of obscurity. Obscurity is the barrier between an individual’s inside and outside worlds; it shields individuals from recognition in their daily lives.
As FRT integration grows, opting out becomes increasingly challenging. Although the General Data Protection Regulation (GDPR) prohibits FBD processing in the private sector, consent is an exception to this rule. Despite its prevalent use as a legitimate legal instrument, the porous and feeble nature of consent is largely overlooked by the literature. The GDPR stipulates that, for consent to be valid, it must be “freely given, specific, informed, and an unambiguous indication of the data subject’s wishes...” Yet the results of this study indicate that consent forms often overwhelm users with information and fail to convey risks adequately. As a result, users do not fully grasp the consequences of agreeing to these forms. Meanwhile, imbalances of power between the data subject and the data controller undermine the ability of consent to be freely given. Consequently, consent becomes a mere procedural requirement rather than a protective measure for FBD processing. This study aims to illustrate the limitations of consent in safeguarding facial biometric data, emphasizing user vulnerability. It further underscores the potential dangers of FRTs, including algorithmic bias and corporate misuse of personal data. Moreover, the algorithm supplies corporate entities with behavioral profiles of individuals without involving them or allowing them to determine their own identity. This research ultimately calls for a critical reassessment of current practices, emphasizing the need for stricter regulations and greater user empowerment in controlling personal data.
Keywords
Personal data, GDPR, Theory of Faciality, Data Protection Law, Consent
Citation
Osailan HW, 'Is Consent a Sufficient Means for Protecting Facial Biometric Data?' (LLM, University of Sheffield 2023)