PimEyes allows the general public to enter the world of surveillance. But will it prove difficult to regulate?
Imagine you happened to be walking down a street years ago when a fight broke out. A security camera captured the incident, and a news outlet published screenshots of the footage. You did not want to be associated with the incident, but you did not mind appearing in the photos: you were wearing sunglasses, so you assumed you would never be recognised.
But now, for $29.99 a month, anyone can dig those photos out from among more than 900 million images in the internet graveyard, and find more photos, even ones you never knew were online.
PimEyes, the provider of this face recognition tool, is not the first artificial intelligence (AI) engine that scans faces, but it is the simplest way to run such a search online. Its results are extremely accurate and advanced, and it can detect side profiles and partly covered faces.
Another highly controversial AI face recognition tool, Clearview AI, provides more extensive results including photos that were shared on private social media accounts, but is only accessible to law enforcement.
While face tracking tools are accused of legitimising mass surveillance, they have become increasingly widespread among authorities and companies that want to boost security.
PimEyes, on the other hand, also allows the general public to enter the world of surveillance.
Accusations of ‘extortion’
PimEyes is also accused of exploiting vulnerable people who regret images they hoped would stay undiscovered. When a user clicks on a picture, the tool offers ‘PROtect plans’, packages ranging from $89.99 to $299.99 per month that exclude such photos from public results.
Cher Scarlett, a computer engineer, told the New York Times that she was shocked to discover that explicit photos of her had been posted online without her consent when she tried PimEyes a few months ago.
“I had no idea up until that point that those images were on the internet,” she said. Worried about how people would react, she ended up buying the most expensive monthly package to hide the images.
In a Medium post, Scarlett described her agonising fight to delete those images before finally buying PimEyes’s monthly package.
“The only reason I need PimEyes is because PimEyes exists. That’s extortion,” she said.
Giorgi Gobronidze, the current owner of the company, rejects such accusations. He told the NYT that the website also provides a free tool to hide images, and that PimEyes had refunded Scarlett for the $299.99 plan in April.
The free tool, however, is not easy to find on the website compared to paid plans that are advertised.
PimEyes also offers a free opt-out option that lets users exercise their “right to have any and all of your data removed” from its system. Scarlett told the NYT that she received an email stating that potential results containing her face had been removed. But journalists found that her explicit images continued to appear in search results around a month after the email.
The opt-out option only removes images that PimEyes has already indexed, meaning copies may resurface when a new search is conducted. Those who want their photos kept out of results have to apply for an opt-out regularly.
The opt-out process "sets people up to fight a losing battle," Woodrow Hartzog, a professor of law and computer science at Northeastern University, told CNN.
“Because this is essentially like playing whack-a-mole or Sisyphus forever rolling the boulder up the hill,” he said.
"It will never stop… And we know from experience that the people who will suffer first and suffer the hardest are women and people of colour and other marginalised communities for whom facial-recognition technology serves as a tool of control over,” he said.
A German data protection agency last year launched an investigation into PimEyes over its processing of biometric data. The investigation has yet to be concluded.