Whenever I write about face recognition technologies, particularly in the context of policing, I raise the risk of false positives caused by unintentional biases embedded in the algorithms. Artificial intelligence tools such as face recognition are only as good as the algorithms they are built on, and it is all too easy for developers to unknowingly encode their own biases into an algorithm, with very negative consequences:
False positives can mean that certain people are regularly stopped and potentially harassed by the police. Now imagine that the biometric engineers who build the algorithms all come from the same racial and ethnic groups. Whether intentional or not, their biases will be baked into the results, and minority groups will likely take the brunt of the false positives. For artificial intelligence and machine learning to be useful, it needs to be accurate at least 80% of the time; once it reaches that point, it will generally outperform humans. But if we move to a Big Brother system of ubiquitous cameras capturing our facial images 24/7, and that system is only 80% accurate, we are left with an unbearably wide opening for abuse. Democracies are supposed to accept that some criminals get away with crime in exchange for the innocent not being locked up. It is authoritarian regimes that place law and order above the protection of the innocent.
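To put that 80% figure in perspective, here is a rough back-of-the-envelope calculation in Python. Every number below is an illustrative assumption, not a measurement, but it shows how even a modest error rate explodes into an enormous absolute number of false alerts under ubiquitous surveillance:

```python
# Illustrative back-of-the-envelope arithmetic; every input here is an assumption.
accuracy = 0.80                # the 80% figure from above, applied to every decision
error_rate = 1 - accuracy      # so 20% of decisions are wrong
daily_scans = 1_000_000        # hypothetical faces scanned per day in one city
wanted_rate = 0.0001           # assume 1 in 10,000 scanned people is actually wanted

false_alerts = daily_scans * (1 - wanted_rate) * error_rate   # innocents flagged
true_alerts = daily_scans * wanted_rate * accuracy            # wanted people caught

print(f"False alerts per day:   {false_alerts:,.0f}")   # ~199,980
print(f"Correct alerts per day: {true_alerts:,.0f}")    # ~80
print(f"Share of alerts pointing at an innocent person: "
      f"{false_alerts / (false_alerts + true_alerts):.2%}")   # ~99.96%
```

That is the base-rate problem in a nutshell: when almost everyone scanned is innocent, even a system that is right 80% of the time produces alerts that are almost all wrong.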
Am I exaggerating? The American Civil Liberties Union (ACLU) doesn’t think so. It recently tested Amazon’s Rekognition, which Amazon has been aggressively marketing to police forces, by running the face recognition tool on the faces of members of the U.S. Congress against a database of 25,000 mugshots. The results?
. . . according to the ACLU’s report, the technology is far from perfect. Rekognition incorrectly identified more than two dozen lawmakers as people who have been arrested for a crime, and the false matches were disproportionately people of color, the ACLU said. Six members of the Congressional Black Caucus, including noted civil rights leader Rep. John Lewis, were each identified as a match for a mugshot in the Rekognition database.
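For context, the mechanics of such a test are simple. The sketch below is a hypothetical illustration, not the ACLU’s actual code: it runs a single comparison against Amazon’s Rekognition API via boto3, at the service’s default 80% similarity threshold, reportedly the setting the ACLU used:

```python
# Hypothetical sketch of a single one-to-one comparison against the
# Rekognition API via boto3; file names and region are placeholders,
# and this is not the ACLU's actual test harness.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("lawmaker.jpg", "rb") as src, open("mugshot.jpg", "rb") as tgt:
    response = client.compare_faces(
        SourceImage={"Bytes": src.read()},
        TargetImage={"Bytes": tgt.read()},
        SimilarityThreshold=80,  # Rekognition's default similarity threshold
    )

# Anything returned in FaceMatches scored at or above the threshold;
# at 80%, that bar is low enough to produce false matches like those above.
for match in response["FaceMatches"]:
    print(f"Declared a match with similarity {match['Similarity']:.1f}%")
```

Raising the threshold reduces false matches but never eliminates them, and nothing in the API prevents a police department from running it at the permissive default.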
This doesn’t mean we should disregard the enormous positive potential of biometrics, but we need to be smart about how and when the technology is used.
“These results are consistent with a broader pattern of results from the machine learning literature,” Kroll told BuzzFeed News. “Not only does face recognition of large sets of individuals remain difficult to do accurately, face recognition systems have been shown to perform much less well for women, people of color, and especially women of color.
“It is important when fielding advanced computer technologies to do so responsibly,” Kroll continued. “These results show that Rekognition shouldn’t be used for some applications in law enforcement as it is currently.”
Face recognition works best with small sets of people, where it is used for the benefit of consumers and where consumers have the opportunity to opt out of the service. It is decidedly unreliable in the context of law enforcement, where decisions about you are made without your knowledge, consent or control.
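There is a simple statistical reason behind that claim: in a one-to-many search, the probability of at least one false match grows with the size of the gallery, so even a tiny per-comparison error rate compounds across thousands of database entries. A minimal sketch, assuming a hypothetical 0.1% false match rate per comparison:

```python
# How one-to-many search scales: the chance of at least one false match
# grows with gallery size N. The per-comparison rate is an assumed figure.
per_comparison_fmr = 0.001  # assume a 0.1% false match rate per comparison

for gallery_size in (1, 100, 1_000, 25_000):
    p_any_false_match = 1 - (1 - per_comparison_fmr) ** gallery_size
    print(f"N = {gallery_size:>6,}: "
          f"P(at least one false match) = {p_any_false_match:.1%}")
```

Against a single stored face the risk is negligible; against a 25,000-face mugshot database, like the one in the ACLU test, at least one false match is a near certainty.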
Unfortunately, when Amazon or other companies get it wrong, consumers lose confidence in the new technology. That hurts the market perception of tools with many useful applications, tools that, when designed with consumers’ best interests at heart, can better our lives.