Gender Privacy Angular Constraints for Face Recognition

Research output: Contribution to journal › Article › Academic › peer-review



Deep learning-based face recognition systems produce templates that encode sensitive information in addition to identity, such as gender and ethnicity. This poses legal and ethical problems, as the collection of biometric data should be minimized and specific to a designated task. We propose two privacy constraints to hide the gender attribute that can be added to a recognition loss. The first constraint minimizes the angle between the gender-centroid embeddings. The second constraint minimizes the angle between gender-specific embeddings and the weight vector of the opposing gender centroid. Both constraints enforce the overlap of the gender-specific embedding distributions. Furthermore, they have a direct interpretation in the embedding space and do not require a large number of trainable parameters, as two fully connected layers are sufficient to achieve satisfactory results. We also provide extensive evaluation results across several datasets and face recognition networks, and we compare our method to three state-of-the-art methods. Our method maintains high verification performance while significantly improving privacy in a cross-database setting, without increasing the computational load for template comparison. We also show that different training data can result in varying levels of effectiveness for privacy-enhancing methods that implement data minimization.
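The two constraints described in the abstract can be illustrated with a minimal numpy sketch. This is not the paper's exact formulation: the function names, the use of simple class-mean centroids in place of learned classifier weight vectors, and the cosine-based penalties are assumptions made for illustration only.

```python
import numpy as np

def l2_normalize(x, axis=-1):
    """Project vectors onto the unit hypersphere (angles become cosines)."""
    return x / np.linalg.norm(x, axis=axis, keepdims=True)

def centroid_angle_loss(emb, gender):
    """First constraint (sketch): penalize the angle between the two
    gender-centroid embeddings by pushing their cosine toward 1."""
    c0 = l2_normalize(emb[gender == 0].mean(axis=0))
    c1 = l2_normalize(emb[gender == 1].mean(axis=0))
    return 1.0 - float(c0 @ c1)

def opposing_centroid_loss(emb, gender):
    """Second constraint (sketch): pull each embedding toward the
    centroid of the opposite gender, so the two distributions overlap."""
    emb_n = l2_normalize(emb)
    c0 = l2_normalize(emb[gender == 0].mean(axis=0))
    c1 = l2_normalize(emb[gender == 1].mean(axis=0))
    # Select the opposing centroid for each sample.
    opp = np.where((gender == 0)[:, None], c1, c0)
    cos = np.sum(emb_n * opp, axis=1)
    return float(np.mean(1.0 - cos))

# Toy example: 8 random embeddings, balanced gender labels.
rng = np.random.default_rng(0)
emb = rng.normal(size=(8, 4))
gender = np.array([0, 0, 0, 0, 1, 1, 1, 1])
total_privacy_loss = centroid_angle_loss(emb, gender) + opposing_centroid_loss(emb, gender)
```

In training, a term like `total_privacy_loss` would be added (with some weight) to the recognition loss, so that identity discrimination is preserved while the gender-specific clusters are driven to coincide.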
Original language: English
Pages (from-to): 1-1
Number of pages: 1
Journal: IEEE Transactions on Biometrics, Behavior, and Identity Science
Publication status: E-pub ahead of print/First online - 17 Apr 2024


  • Face recognition
  • Training
  • Privacy
  • Minimization
  • Task analysis
  • Biological system modeling
  • Training data


