Clearview AI, the U.S.-based facial recognition technology firm, is allowing Canadians to check whether their face appears in the company’s massive image database. Ontario’s privacy watchdog says residents should also be allowed to request their data be deleted.

Clearview AI, the controversial U.S.-based facial recognition technology firm, is quietly allowing Canadians to check whether their face appears in the company’s massive image database.
Unlike residents of some other countries, however, Canadians do not appear to be eligible to ask for their pictures to be deleted.
Clearview AI first came under scrutiny earlier this year when it boasted about collecting billions of photos from the internet to feed its facial recognition app.
The firm says the tool is meant to allow police to “identify perpetrators and victims of crimes,” but privacy advocates worry the technology could fall into the wrong hands, or lead to a dystopian future in which anyone can be identified within seconds whether they consent to facial recognition or not.
Several law enforcement agencies, from the RCMP to Toronto and Calgary police, acknowledged their members had briefly used the software.
But observers questioned its legality. In February, federal and provincial privacy watchdogs opened an investigation into the use of Clearview AI’s technology in Canada.
Clearview AI, the facial recognition tech firm, has confirmed my face is in their database. I sent them a headshot and they replied with these pictures, along with links to where they got the pics, including a site called “Insta Stalker.”
Last week, a CBC News reporter submitted a headshot to the company by email and requested they provide all images of him found in the firm’s database. Clearview replied three days later, supplying a PDF file with 12 photos, including several duplicates. 
All pictures were closeups of the reporter’s face. Clearview listed where it had first found the images, including official CBC web pages, Twitter, and other services that appear to scrape social media profiles, such as a website called “Insta Stalker.”
Both Twitter and Facebook, which owns Instagram, have told Clearview to stop using images from their platforms for facial recognition.
How to see your pictures in Clearview’s database
“You have the right to request that Clearview AI provides you with copies of your personal data,” the firm’s website states. It says requests should be emailed to the company, along with a headshot that will be used for the search.
But getting the pictures removed from the database isn’t quite as easy.
Clearview’s privacy policy says it’s possible to ask for personal data to be deleted, but only “under certain conditions,” depending on local data protection rules. Its website provides forms for residents of various jurisdictions with privacy legislation in effect, such as California, Britain and the EU, to request their images be deleted.
In response to a series of questions from CBC, including whether the firm would comply if a Canadian user requests their data be deleted from Clearview’s database, the firm’s CEO, Hoan Ton-That, provided a one-line statement.
“We process privacy requests for opt-out and data access we receive from Canadian citizens,” he said. 
The “opt-out” option appears to suggest Canadians can get Clearview to stop selling their data to other companies, even though the firm itself says it “will never share or sell user data.” A representative of the company did not respond to a request for clarification on what specifically the opt-out entails.
Based on the definition of the “right to opt-out” in the California Consumer Privacy Act, “a consumer shall have the right, at any time, to direct a business that sells personal information about the consumer to third parties not to sell the consumer’s personal information.”
Clearview should offer to delete: privacy watchdog
Ontario’s privacy watchdog said Clearview should give residents the option to have their images and data deleted. 
“This is particularly so, given that Clearview obtained people’s images without their consent,” Brian Beamish, the province’s information and privacy commissioner said in an emailed statement.
Clearview said it only collects publicly available images and makes them accessible in a searchable format for its clients in law enforcement.
In May, the firm said it would no longer supply its technology to private companies not affiliated with law enforcement, according to court filings cited by BuzzFeed.
Beamish added he’d become aware of at least 15 Ontario police services that had used facial recognition technology, “but are no longer using it.”
Representatives for the privacy authorities of Canada, Quebec and B.C. all declined to comment, citing their ongoing investigation into Clearview. Their Ontario counterpart is not involved in the inquiry.
Canadians don’t have a ‘right to be forgotten’
Michael Geist, a University of Ottawa law professor, said the case “highlights how Canadian privacy law has really failed to keep pace with some of these emerging challenges.”
A spokesperson for the Office of the Information and Privacy Commissioner of Alberta pointed out that under provincial legislation, Albertans do not have “an explicit right to erasure, which is contained in the European Union’s General Data Protection Regulation, for example.”
WATCH | Why experts are concerned about facial-recognition technology:
Several privacy watchdogs are launching a joint investigation into the creators of a controversial facial recognition technology. (2:42)
The right to erasure, also known as the right to be forgotten, has been the topic of protracted debate in Canada. Last year, the European Court of Justice ruled the EU legislation does not apply beyond its borders.
The U.K.’s Information Commissioner’s Office explains that Europeans can withdraw their consent and request their data be deleted when a firm has been storing it.
“We just don’t have the same thing here when it comes to companies using our information,” said Kristen Thomasen, an assistant professor of law, robotics and society at the University of Windsor.
She said the Clearview case also highlights gaps in technology law that have so far allowed the firm to claim it is only gathering public data. And although users may have posted the images to social media, they likely didn’t agree to have them used for facial recognition by police.
Just because a picture is posted to Facebook, Thomasen said by way of example, “that doesn’t mean I’m consenting to have my face used to train an algorithm that can be used to over-police minority populations,” referring to particular concerns for Black, Indigenous and racially marginalized communities.
“I definitely don’t consent to that.”