A previous post dealt with facial recognition technology and its privacy implications. What some members of the class may be unaware of is that certain law enforcement agencies in Canada have begun to use facial recognition technology, and others are planning to implement it. Notably, the Calgary police department has been using facial recognition technology since 2014, and the Toronto police department has tested a system of its own. Edmonton and Saskatoon are considering implementing the technology. Montreal would not confirm whether it uses facial recognition technology, while Halifax, Winnipeg and Vancouver say they do not. The RCMP also claims that it does not use the technology.
While the RCMP is subject to the federal Privacy Act, which regulates the extent to which federal entities can collect, use, disclose and retain personal information, there is currently no provincial or municipal legislation that specifically addresses facial recognition. This has led to the concern that provincial and municipal police forces have unfettered access to and use of individuals' facial recognition data, and citizens are left unsure of how such data is being used. One of the main concerns appears to be that, without regulation, there is nothing to prevent a private company from entering the facial recognition space and beginning to compare mug shots with social media pictures. As we have seen in recent years, private companies often do not have individual privacy in mind when dealing with personal data.
On the other hand, Sgt. Gordon MacDonald of the Calgary police department is calling for a national database to ease identification. MacDonald says this would be advantageous over separate local databases because of Canada's transient population: a national database would allow officers in any part of the country to match images against mug shots taken anywhere in Canada.
Many are calling on legislators to introduce regulation before facial recognition technology becomes more prevalent in law enforcement. Concerns over the technology led the European Union to consider a five-year moratorium on it, which it has since backed away from (https://www.ft.com/content/ff798944-4cc6-11ea-95a0-43d18ec715f5). The question has become whether and how legislators can protect individual privacy through regulation of facial recognition technology, and whether the technology itself is worth the risk to individual privacy. Privacy has become a hot-button issue in recent years, and governments appear to be struggling to keep up with new technology and its impacts on personal privacy.
Here is a link to the CBC article this post is based on: https://www.cbc.ca/news/canada/nova-scotia/facial-recognition-police-privacy-laws-1.5452749.
Here is a “Front Burner” podcast episode that discusses the issue in detail: https://www.cbc.ca/radio/frontburner/putting-the-brakes-on-facial-recognition-technology-1.5437020
This is a very interesting post. I was also reading about the controversies surrounding facial recognition software.
“Clearview’s Facial Recognition App Is Identifying Child Victims of Abuse,” by Kashmir Hill and Gabriel J.X. Dance: https://www.nytimes.com/2020/02/07/business/clearview-facial-recognition-child-sexual-abuse.html?searchResultPosition=4.
This article discusses how Clearview is being used in law enforcement (including in Canada) to identify victims of child sexual abuse. Specifically, Clearview allows officers to identify victims in videos and photos who would not otherwise be identified. Officers upload “probe images” of children who have been victims of abuse to a database. By default, this information is stored indefinitely, although Clearview users can have it wiped after 30 days. The existence of such a database of sensitive information raises certain privacy concerns, particularly when law enforcement is reluctant to discuss the specifics of how it is being used in order to protect its investigative techniques. That being said, the Department of Homeland Security in the US has stated that the system is downloaded onto law enforcement networks to prevent sharing with third parties, and that the data is not being shared with private parties.
Another issue with using this software is the potential inaccuracy of the matches, especially with regard to children, whose faces change as they age and whose photos are often not included in the data sets used to develop the algorithm.
“Facial Recognition Moves Into a New Front: Schools,” by Davey Alba: https://www.nytimes.com/2020/02/06/business/facial-recognition-schools.html
This recent article outlines how public schools in Lockport, NY, have introduced facial recognition for security purposes.
The public schools support the use of this software to identify persons of interest entering the schools, including registered sex offenders in the area, ex-staff banned from the premises and other “credible threats.” It is also used to scan for and detect guns coming into the school.
If a person of interest were identified entering the school, the system would alert security personnel to confirm the match and then school administrators to take action.
If a gun were identified, both the police and the administrators would be notified, and the police would respond directly if the administrators were not present to do so.
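To make that alert flow concrete, here is a minimal sketch in Python of how such routing might work. The names (Notifier, Detection, route_alert) and the exact rules, such as police dispatching only when no administrator is present, are assumptions based on the description above, not details of the school's actual system.

```python
# Hypothetical sketch of the Lockport alert routing described above.
from dataclasses import dataclass

class Notifier:
    """Stand-in for a channel that reaches security, administrators, or police."""
    def __init__(self, name):
        self.name = name
    def notify(self, detection):
        print(f"[{self.name}] alerted: {detection.kind} (confidence {detection.confidence:.2f})")
    def dispatch(self, detection):
        print(f"[{self.name}] dispatched to respond to {detection.kind}")

@dataclass
class Detection:
    kind: str          # "person_of_interest" or "gun"
    confidence: float  # match score reported by the recognition system

def route_alert(detection, security, administrators, police, admins_present):
    if detection.kind == "person_of_interest":
        # Security personnel confirm the match, then administrators take action.
        security.notify(detection)
        administrators.notify(detection)
    elif detection.kind == "gun":
        # Police and administrators are both notified; police respond directly
        # if no administrator is present to do so.
        police.notify(detection)
        administrators.notify(detection)
        if not admins_present:
            police.dispatch(detection)

# Example: a gun detection when no administrator is on site.
route_alert(Detection("gun", 0.91), Notifier("security"), Notifier("admin"),
            Notifier("police"), admins_present=False)
```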
The system was recently implemented after the district revised its privacy policy to protect student data, but concerns remain over false identifications. The system has been shown to have a racial bias, falsely identifying African Americans and Asian Americans 10 to 100 times more often than Caucasians. The school district is also seeking to extend its list of persons of interest to include suspended students, because most school shootings are committed by students. However, according to a law professor specializing in education and policy, this is problematic insofar as it places excessive scrutiny on such youth, which could have a negative effect on them by making them more likely to enter the criminal justice system.