A Study of Gender Bias in Face Presentation Attack and Its Mitigation

Publisher

Saudi Digital Library

Abstract

Facial data-based technology has become one of the most widely used techniques in biometric systems. It is used to identify and verify individuals in many private and governmental organizations. The sensitivity of the resources these systems authorize access to, and the importance of the users they identify, make them targets for cyber attacks. The process of identifying or verifying people with such systems must therefore be strictly accurate to ensure a high level of security and credibility. Recently, the fairness of facial data-based technologies such as face recognition has come into question. Many researchers have investigated the fairness of face recognition systems and reported demographic bias; however, little work has examined bias in face presentation attack detection (PAD) technology. This thesis sheds light on bias in face spoofing detection systems in two phases. First, two state-of-the-art CNN-based presentation attack detection models, ResNet50 and VGG16, were used to closely evaluate the fairness of detecting impostor attacks with respect to gender. In addition, subsets of the Spoof in the Wild (SiW) dataset with different sizes and gender distributions were used in this phase to study the effect of gender distribution on the models' performance. Second, the Debiasing Variational Autoencoder (DB-VAE) algorithm [1] was applied in combination with VGG16 to assess its ability to mitigate bias in presentation attack detection systems. Our experiments exposed minor gender bias in CNN-based presentation attack detection methods. They also showed that imbalance in training and testing data does not necessarily lead to gender bias in a model's performance. Finally, the results showed that the DB-VAE approach [1] succeeded in mitigating bias in detecting spoofed faces.
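The first phase described above amounts to comparing a detector's error rates across genders. A minimal pure-Python sketch of such a comparison (not code from the thesis; the function names, the `"male"`/`"female"` keys, and the sample encoding are illustrative) using the standard PAD error measures, the attack presentation classification error rate (APCER) and the bona fide presentation classification error rate (BPCER):

```python
# Illustrative sketch only: per-gender PAD error rates and the gap between them.
# Label convention assumed here: 1 = bona fide (live) face, 0 = presentation attack.

def pad_error_rates(labels, preds):
    """Return (APCER, BPCER): fraction of attacks accepted as bona fide,
    and fraction of bona fide faces rejected as attacks."""
    attacks = [p for l, p in zip(labels, preds) if l == 0]
    bona_fide = [p for l, p in zip(labels, preds) if l == 1]
    apcer = sum(p == 1 for p in attacks) / len(attacks)
    bpcer = sum(p == 0 for p in bona_fide) / len(bona_fide)
    return apcer, bpcer

def gender_gap(samples):
    """samples: list of (gender, label, pred) tuples.
    Returns per-gender (APCER, BPCER) and the absolute gap in each rate."""
    by_gender = {}
    for g, l, p in samples:
        by_gender.setdefault(g, ([], []))
        by_gender[g][0].append(l)
        by_gender[g][1].append(p)
    rates = {g: pad_error_rates(ls, ps) for g, (ls, ps) in by_gender.items()}
    (a_m, b_m), (a_f, b_f) = rates["male"], rates["female"]
    return rates, abs(a_m - a_f), abs(b_m - b_f)
```

A large gap in either rate between the two groups would indicate the kind of demographic bias the thesis investigates.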
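The DB-VAE algorithm [1] used in the second phase mitigates bias by resampling training examples in inverse proportion to the estimated density of their latent features, so that rare examples are seen more often during training. A toy sketch of that resampling rule over a one-dimensional latent score (a simplification for illustration; the histogram binning, the `alpha` smoothing term, and the parameter values are assumptions, not the thesis's implementation):

```python
# Illustrative sketch of DB-VAE-style adaptive resampling:
# weight each sample by 1 / (estimated latent density + alpha),
# then normalize the weights into a sampling distribution.

def sampling_weights(latents, num_bins=10, alpha=0.01):
    """latents: latent scores assumed to lie in [0, 1).
    Returns normalized selection probabilities, one per sample."""
    counts = [0] * num_bins
    bins = [min(int(z * num_bins), num_bins - 1) for z in latents]
    for b in bins:
        counts[b] += 1
    density = [c / len(latents) for c in counts]   # histogram density estimate
    raw = [1.0 / (density[b] + alpha) for b in bins]
    total = sum(raw)
    return [w / total for w in raw]
```

Under this rule, a sample whose latent features fall in a sparsely populated bin receives a larger sampling probability than one from a crowded bin, which is the mechanism DB-VAE uses to counteract over-represented groups in the training data.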


Copyright owned by the Saudi Digital Library (SDL) © 2025