On Demographic Bias in Fingerprint Recognition
Document Type
Article
Publication Title
arXiv
Abstract
Fingerprint recognition systems have been deployed globally in numerous applications, including personal devices, forensics, law enforcement, banking, and national identity systems. For these systems to be socially acceptable and trustworthy, it is critical that they perform equally well across different demographic groups. In this work, we propose a formal statistical framework to test for the existence of bias (demographic differentials) in fingerprint recognition across four major demographic groups (white male, white female, black male, and black female) for two state-of-the-art (SOTA) fingerprint matchers operating in verification and identification modes. Experiments on two different fingerprint databases (with 15,468 and 1,014 subjects) show that demographic differentials in SOTA fingerprint recognition systems decrease as matcher accuracy increases, and that any small bias that remains is likely due to a few outlier, low-quality fingerprint images. Copyright © 2022, The Authors. All rights reserved.
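The kind of test the abstract refers to can be illustrated, in spirit only and not as the authors' actual framework, by a permutation test for a difference in false non-match rate (FNMR) between two demographic groups. Everything below (the group data, the `fnmr` and `permutation_test` helpers) is a hypothetical sketch for illustration:

```python
import random

def fnmr(decisions):
    # FNMR = fraction of genuine comparisons that were falsely rejected.
    # Each decision is True (accepted) or False (falsely rejected).
    return sum(1 for d in decisions if not d) / len(decisions)

def permutation_test(group_a, group_b, n_perm=2000, seed=0):
    """Two-sided permutation test for a difference in FNMR between
    two groups of genuine-match decisions (illustrative, not the
    paper's exact statistical framework)."""
    rng = random.Random(seed)
    observed = abs(fnmr(group_a) - fnmr(group_b))
    pooled = list(group_a) + list(group_b)
    n_a = len(group_a)
    extreme = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)  # reassign decisions to groups at random
        diff = abs(fnmr(pooled[:n_a]) - fnmr(pooled[n_a:]))
        if diff >= observed:
            extreme += 1
    # Add-one smoothing keeps the p-value strictly positive.
    return (extreme + 1) / (n_perm + 1)

# Hypothetical genuine-match decisions for two demographic groups.
group_a = [True] * 950 + [False] * 50   # FNMR = 5%
group_b = [True] * 940 + [False] * 60   # FNMR = 6%
p_value = permutation_test(group_a, group_b)
```

A large p-value here would mean the observed FNMR gap is consistent with sampling noise rather than a genuine demographic differential, which is the flavor of conclusion the abstract reports for high-accuracy matchers.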
DOI
10.48550/arXiv.2205.09318
Publication Date
5-19-2022
Keywords
Machine learning, Palmprint recognition, Demographic groups, Fingerprint database, Fingerprint matcher, Fingerprint recognition, Fingerprint recognition systems, Personal devices, State of the art, Statistical framework
Recommended Citation
A. Godbole, S. A. Grosz, K. Nandakumar, and A. K. Jain, "On Demographic Bias in Fingerprint Recognition," arXiv preprint arXiv:2205.09318, 2022.
Comments
IR deposit conditions: not described
Preprint available on arXiv