As companies race to deploy facial recognition everywhere from major league ballparks to local schools and summer camps, we face tough questions about the technology’s potential to intensify racial bias. Commercial face recognition software has repeatedly been shown to be less accurate on people with darker skin, and civil rights advocates worry about the disturbingly targeted ways face scanning can be used by police.
The researchers found that when various face recognition algorithms were tasked with identifying gender, they miscategorized dark-skinned women as men up to 34.7 percent of the time. The maximum error rate for light-skinned males, by contrast, was less than 1 percent.
Face recognition works by matching the person being scanned against a database of facial images. In policing contexts, these databases can include passport and driver’s license photos or mugshots.
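The matching step described above can be sketched in a few lines. This is a minimal illustrative example, not any vendor's actual algorithm: real systems reduce each face to a numeric "embedding" vector and compare the scanned face against every entry in a gallery. The record names, vectors, and similarity threshold here are all invented for illustration.

```python
import math

def cosine_similarity(a, b):
    # Similarity between two embedding vectors: 1.0 means identical direction.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def best_match(probe, gallery, threshold=0.8):
    """Return the gallery record most similar to the probe embedding,
    or None if nothing clears the similarity threshold."""
    name, score = max(
        ((n, cosine_similarity(probe, emb)) for n, emb in gallery.items()),
        key=lambda pair: pair[1],
    )
    return name if score >= threshold else None

# Toy gallery standing in for a database of license or mugshot photos.
gallery = {
    "record_001": [0.9, 0.1, 0.3],
    "record_002": [0.2, 0.8, 0.5],
}
print(best_match([0.88, 0.12, 0.31], gallery))  # prints record_001
```

Note that the threshold is a policy choice: set it too low and the system returns confident-looking matches for people who are not in the database at all, which is one reason accuracy disparities matter so much in policing contexts.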