Facial recognition: key facts about the issues
Facial recognition is inaccurate and biased
Amazon’s Rekognition facial recognition software, which is used by police departments throughout the U.S., has been found to perform worse at identifying the gender of women and darker-skinned individuals.2,3 This puts marginalized communities at risk.
In a test run by the ACLU, Rekognition software incorrectly matched 28 members of Congress to mugshots, identifying them as other people who have been arrested for a crime. The false matches were disproportionately of people of colour.4
These biases in the software mean that people of colour are more likely to be the victims of wrongful arrest.
More than 2,000 people in Cardiff, UK were wrongly identified as potential criminals during the 2017 Champions League final.5
Beyond the biases and errors present in the software, deployments of facial recognition have also worryingly targeted low-income neighbourhoods and communities of colour.
Early rollouts of 600 high-definition cameras in public areas all around Detroit have been called ‘techno-racism’,6 while low-income and predominantly black apartment buildings in New York have been among the first to have facial recognition scanners installed as entry systems to residents’ homes.7
And in the UK, the Notting Hill Carnival, which celebrates Caribbean culture, was targeted with facial recognition cameras scanning all the people attending the street event.8
Facial recognition is unregulated in Canada, but it’s already being used
The Personal Information Protection and Electronic Documents Act (PIPEDA) regulates the use of personal information by private enterprises. The Privacy Act does the same for government. But neither Act specifically addresses biometric technologies such as facial recognition.
Several police services, including Calgary and Toronto, have already been using facial recognition programs, which they claim are only being used to match crime scene footage to mugshot databases.9
Several other police services, including Montreal and Halifax, refuse to confirm or deny whether they are using facial recognition.
On the commercial side, the Office of the Privacy Commissioner is currently investigating the use of facial recognition technology at top shopping malls in Canada operated by Cadillac Fairview, who claim the technology is used to monitor traffic, as well as the age and gender of shoppers.10
The unregulated environment in which this is occurring means that there is no oversight, no accountability, and no transparency for people in Canada.
Facial recognition is a slippery slope
In other countries around the world, limited use of facial recognition by law enforcement has typically led to much broader rollouts of the technology.
In the U.S., President Trump has issued an executive order requiring facial recognition identification for 100% of international travellers in the top 20 U.S. airports by 2021.11
In the UK, facial recognition is already being used at sports matches12, street festivals, protests,13 and even on the streets to constantly monitor passers-by.14
The slippery slope of facial recognition leads to real-time, nonstop surveillance, where daily activities such as shopping, travelling around your community, attending a protest, or going to a political event are monitored, recorded and stored. There is no way to know how this information could be used in the future, or who it could be shared with.
Tracking technology does not end at simply mapping faces. Even more intrusive and personal technologies are on the horizon, such as automated facial emotion detection, skin texture recognition and even vein mapping.15
The databases facial recognition systems use are harmful to privacy and equality
Some facial recognition systems used by law enforcement rely on mugshot databases, which carry major issues. Many mugshot databases include images of people who have been arrested but never found guilty.16 Additionally, because police disproportionately target people of colour, their images are more likely to be included in these databases.
Other systems draw on driver’s licence and passport photos. In fact, reports say that half of adults in the United States are now in facial recognition databases.17
In Canada, the Office of the Privacy Commissioner prevented ICBC from lending its facial recognition database, built from driver’s licence photos, to the police to help them find suspects from the Stanley Cup riots.18
Publicly available photos from social media have also been used to train the AI behind facial recognition systems. IBM has come under fire for scraping nearly a million photos from Flickr without notifying or seeking consent from any of the individuals who uploaded them.19
For more info on facial recognition, see our blog post explaining the technology.20