As police increasingly use facial recognition technology, demands for regulation grow

Some police services in Canada are using facial recognition technology to help solve crimes, while other police services say human rights and privacy concerns prevent them from using this powerful digital tool.

It’s this uneven application of technology — and lax rules governing its use — that has legal and AI experts calling for the federal government to set national standards.

“Until we have a better understanding of the risks involved in using this technology, there should be a moratorium or set of restrictions on how and where it can be used,” said Kristen Thomasen, a law professor at the University of British Columbia.

What’s more, the patchwork of rules governing emerging biometric technologies has created a situation in which the privacy rights of some Canadians are better protected than those of others.

“I think the fact that different police forces are taking different approaches raises concerns about unfairness in how people are treated across the country, but it also highlights the importance of some kind of federal action,” Thomasen said.

Facial recognition systems are a form of biometric technology that uses AI to identify people by comparing images or video of a person’s face, often captured by security cameras, against facial images stored in a database. The technology has become a controversial tool in the hands of police.

In 2021, the Office of the Privacy Commissioner of Canada found that the RCMP had violated privacy laws by using the technology without the public’s knowledge. That same year, Toronto police admitted that some of its officers used facial recognition software without notifying the police chief. In both cases, the technology was provided by the American company Clearview AI, whose database consists of billions of images taken from the internet without the consent of the people whose images are used.

Last month, police in York and Peel, Ont., announced they had begun deploying facial recognition technology provided by French multinational Idemia. In an interview, Const. Kevin Nebrija said the tool “helps speed up investigations and identify suspects more quickly,” adding that in terms of privacy, “nothing changes because security cameras are everywhere.”

But in neighboring Quebec, Montreal police chief Fady Dagher said police would not adopt biometric identification tools without debate on issues ranging from human rights to privacy.

“It’s going to take a lot of discussion before we think about implementing it,” Dagher said in a recent interview.

Nebrija stressed that his police service has consulted Ontario’s privacy commissioner on best practices, adding that any footage police obtain will be “obtained legally,” either with the cooperation of the security camera’s owner or through a court order for the footage.

And while York police insist their officers will turn to the courts when necessary, Kate Robertson, a senior researcher at the University of Toronto’s Citizen Lab, said Canadian police typically do the opposite.

Since the revelations that Toronto police used Clearview AI between 2019 and 2020, Robertson said she is still not aware of any police service in Canada that has received prior approval from a judge to use facial recognition technology in its investigations.

Robertson said court authorization, usually in the form of a warrant, represents “the gold standard for privacy in criminal investigations.” It ensures that facial recognition tools, when used, are balanced with the rights to freedom of expression, freedom of assembly and other rights enshrined in the Charter.

While the federal government does not have jurisdiction over provincial and municipal police forces, it could amend the Criminal Code to incorporate legal requirements for facial recognition software in the same way it updated the law to address voice recording technology that could be used for surveillance.

In 2022, the heads of Canada’s federal, provincial and territorial privacy commissions called on lawmakers to establish a legal framework for the appropriate use of facial recognition technology, including empowering independent oversight bodies, prohibiting mass surveillance and limiting how long images can be stored in databases.

Meanwhile, the federal Department of Economic Development said Canadian law “potentially” regulates the collection of personal information by businesses, under the Personal Information Protection and Electronic Documents Act, or PIPEDA.

“If, for example, police forces, including the RCMP, outsource activities using personal information to private companies engaged in commercial activities, then those activities could potentially be regulated by PIPEDA, including services related to facial recognition technology,” the department said.

The Quebec Provincial Police also has a contract with Idemia, but it has not explained exactly how it uses the company’s technology.

In an emailed statement, the force said: “The automated facial comparison system is not used to confirm an individual’s identity. This tool is used for criminal investigations and is limited to the records of individuals whose fingerprints have been taken under the Identification of Criminals Act.”

Ana Brandusescu, an expert on AI governance, says Ottawa and the country’s police forces have ignored calls for better governance, transparency and accountability in the purchase of facial recognition technology.

“Law enforcement is not listening to academics, civil society experts, people with lived experience, people who have been directly harmed,” Brandusescu said.


This report by The Canadian Press was first published June 30, 2024.
