Four of Canada’s privacy commissioners have denounced the controversial facial recognition company Clearview AI.
The firm’s software has been used by law enforcement agencies around the world – including in Toronto – allowing them to follow up on potential suspects with Clearview’s massive profile database. But Clearview’s scraping of billions of images of Canadians from across the internet amounted to mass surveillance and was a clear violation of their privacy rights, the commissioners said in a report issued this morning.
“What Clearview does is mass surveillance and it is illegal,” federal privacy commissioner Daniel Therrien wrote. “It is completely unacceptable for millions of people who will never be implicated in any crime to find themselves continually in a police lineup. Yet the company continues to claim its purposes were appropriate, citing the requirement under federal privacy law that its business needs be balanced against privacy rights. Parliamentarians reviewing [the proposed] Bill C-11 may wish to send a clear message, through that bill, that where there is a conflict between commercial objectives and privacy protection, Canadians’ privacy rights should prevail.”
The joint investigation by the Office of the Privacy Commissioner of Canada, the Commission d’accès à l’information du Québec, the Office of the Information and Privacy Commissioner for British Columbia and the Office of the Information and Privacy Commissioner of Alberta concluded that the New York-based technology company violated federal and provincial privacy laws.
In a news release, they stated Clearview AI’s technology allowed law enforcement and commercial organizations to match photographs of unknown people against the company’s databank of more than three billion images. These images, which included adults and children, were collected for investigation purposes without individuals’ knowledge or consent. Commissioners found that this creates the risk of significant harm to individuals, the vast majority of whom have never been and will never be implicated in a crime.
In addition, the report noted that Clearview collected, used and disclosed Canadians’ personal information for inappropriate purposes, which cannot be rendered appropriate via consent.
When presented with the investigative findings, Clearview argued that:
- Canadian privacy laws do not apply to its activities because the company does not have a “real and substantial connection” to Canada.
- Consent was not required because the information was publicly available.
- Individuals who placed or permitted their images to be placed on websites that were scraped did not have substantial privacy concerns justifying an infringement of the company’s freedom of expression.
- Given the significant potential benefit of Clearview’s services to law enforcement and national security and the fact that significant harm is unlikely to occur for individuals, the balancing of privacy rights and Clearview’s business needs favoured the company’s entirely appropriate purposes.
- Clearview cannot be held responsible for offering services to law enforcement or any other entity that subsequently makes an error in its assessment of the person being investigated.
The four commissioners rejected these arguments.
“They were particularly concerned that the organization did not recognize that the mass collection of biometric information from billions of people, without express consent, violated the reasonable expectation of privacy of individuals and that the company was of the view that its business interests outweighed privacy rights,” the commissioners said.
Clearview AI and the RCMP
On the applicability of Canadian laws, the commissioners noted that Clearview AI collected the images of an unknown number of Canadians and actively marketed its services to law enforcement agencies in Canada. The RCMP became a paying customer. A total of 48 law enforcement and other organizations across the country used the application for a time.
Shortly after the investigation began, Clearview agreed to stop providing its services in Canada. It stopped offering trial accounts to Canadian organizations and discontinued services to its only remaining Canadian subscriber, the RCMP, in July 2020.
The privacy authorities recommended that Clearview stop offering its facial recognition services to Canadian clients; stop collecting images of individuals in Canada; and delete all previously collected images and biometric facial arrays of individuals in Canada.
However, the commissioners say Clearview disagreed with the findings of the investigation and “did not demonstrate a willingness to follow the other recommendations.
“Should Clearview maintain its refusal, the four authorities will pursue other actions available under their respective Acts to bring Clearview into compliance with Canadian laws,” the commissioners concluded.
The full report notes that in disagreeing with the findings, Clearview alleged an absence of harm to individuals connected to its activities. “In our view, Clearview’s position fails to acknowledge: (i) the myriad of instances where false, or misapplied matches could result in reputational damage, and (ii) more fundamentally, the affront to individuals’ privacy rights and broad-based harm inflicted on all members of society, who find themselves under continual mass surveillance by Clearview based on its indiscriminate scraping and processing of their facial images.”
At a news conference, Therrien said that because the company scraped images from across the internet, not even Clearview knows how many of the three billion images in its database are of Canadians. That, he acknowledged, creates a problem in asking that the images of Canadians be removed. Clearview AI told the four commissioners it “would be willing to take steps, on a best efforts and without prejudice basis, to try to limit the collection and distribution of the images that it is able to identify as Canadian.”
The report highlights the inability of the federal privacy commissioner to order Clearview AI to remove the images. The current privacy law that covers commercial firms, the Personal Information Protection and Electronic Documents Act (PIPEDA), doesn’t give Therrien order-making powers. However, he can go to the Federal Court to get a compliance order, and Therrien told reporters he’s considering that. The proposed C-11 would give the federal commissioner order-making powers.
The three provincial privacy commissioners do have order-making powers, and all three told reporters they are considering their options.
In a statement, Doug Mitchell, an attorney for Clearview AI, said the company’s technology is not currently available in Canada and it does not operate in Canada. The company maintains that the public information it collects from the internet “is explicitly permitted under PIPEDA.”
“The Federal Court of Appeal has previously ruled in the privacy context that publicly available information means exactly what it says: ‘available or accessible by the citizenry at large,’” he said. “There is no reason to apply a different standard here. Clearview AI is a search engine that collects public data just as much larger companies do, including Google, which is permitted to operate in Canada.”
The commissioners’ position was best reflected in a statement to reporters by B.C. information and privacy commissioner Michael McEvoy, who said “when I put my information on Facebook or some other platform I do it for a particular purpose, and what happens here is these companies have disregarded that purpose and without authority have taken that information” for a commercial application.
The commissioners also note that Facebook, Google, Twitter, YouTube and LinkedIn have sent letters to Clearview AI demanding it stop scraping images from their sites, a practice they say violates their terms of service with users. Clearview AI maintains its scraping of data is lawful under the U.S. Constitution’s freedom of speech protections.
Meanwhile, the Office of the Privacy Commissioner of Canada’s investigation into the RCMP’s use of Clearview AI’s facial recognition technology continues.
According to the report, one objection raised by Clearview AI is that none of the privacy commissioners had jurisdiction to investigate the company because, as a U.S.-based firm, it doesn’t have a clear connection here. The commissioners disagreed, noting “it actively marketed its services to Canadian organizations through promotional material, testimonials from Canadian law enforcement professionals, and agency-specific presentations and trials. Furthermore, Clearview publicly declared Canada to be part of its core market in statements to the media.”
In a statement, Diane Poitras, President of the Commission d’accès à l’information du Québec, said that “Clearview’s massive collection of millions of images without the consent or knowledge of individuals for the purpose of marketing facial recognition services does not comply with Quebec’s privacy or biometric legislation. The stance taken by Clearview that it is in compliance with the laws that apply to it underscores the need for greater oversight of the use of this technology as well as providing regulatory authorities with additional tools of deterrence like those proposed in [Quebec’s] Bill 64.”
Jill Clayton, Information and Privacy Commissioner of Alberta, said that as the use of facial recognition technology expands, significant issues around accuracy, automated decision making, proportionality and ethics persist. “The Clearview investigation shows that across Canada we need to be discussing acceptable uses and regulation of facial recognition. Regulation would not only assist in upholding privacy rights, it would provide much-needed certainty to all organizations thinking about using or developing the technology.”
“Our investigation reveals the vast amount of personal information collected without people’s knowledge or consent,” said McEvoy. “It is unacceptable and deeply troubling that a company would create a giant database of our biometric data and sell it for profit without recognizing its invasive nature. The results of our work also point to the need to strengthen our privacy laws to properly protect the public.”
In an email, privacy lawyer Barry Sookman of the McCarthy Tetrault law firm said the report’s conclusions aren’t surprising. There were prior precedents in which Canadian courts and the office of the privacy commissioner applied our privacy law extraterritorially using what is called the substantial connection test, he said. While there is an exemption in PIPEDA for collecting personal data like pictures without consent in certain circumstances, he added, it is not surprising that the privacy commissioners found the exception didn’t apply to Clearview.
Separately, the federal commissioner’s office, along with its provincial counterparts, is developing guidance for law enforcement agencies on the use of facial recognition technologies. Guidelines for consultation are expected to be released in the spring.
Despite opposition from privacy advocates to facial recognition, privacy commissioners around the world aren’t opposed to the idea of the technology. In October 2020, global privacy commissioners passed a resolution acknowledging it can benefit security and public safety. However, the resolution also asserts that because facial recognition is highly intrusive, it can erode data protection, privacy and human rights, enable widespread surveillance and produce inaccurate results. The resolution says organizations should ensure that facial recognition technology is not used where the purpose can reasonably be achieved by less intrusive means.