- cross-posted to:
- privacy@lemmy.ml
Clearview AI offers its clients a system that works like a search engine for faces - users upload a photo and it finds matches in a database of billions of images it has collected. It then provides links to where matching images appear online.
In March, Clearview’s founder Hoan Ton-That said it had run nearly a million searches for US police, helping them to solve a range of crimes, including murders.
He also revealed its database contained 30 billion images scraped from the internet.
Critics argue that law enforcement’s use of Clearview’s technology puts everyone into a “perpetual police line-up”.
France, Italy and Australia have also taken action against the firm.
Overlaying semi-translucent white stuff on a woman’s face does not make for the most innocent looking thumbnail.
I have watched too much porn.
You and me both. I thought this thumbnail was something very different lmao
And that’s why you need the GDPR and need to force businesses to store their data on servers under European jurisdiction.
This is the best summary I could come up with:
A company which enables its clients to search a database of billions of images scraped from the internet for matches to a particular face has won an appeal against the UK’s privacy watchdog.
Clearview AI offers its clients a system that works like a search engine for faces - users upload a photo and it finds matches in a database of billions of images it has collected.
In March, Clearview’s founder Hoan Ton-That told the BBC it had run nearly a million searches for US police, helping them to solve a range of crimes, including murders.
In the past Clearview AI had commercial customers, but since a 2020 settlement in a case brought by US civil liberties campaigners, the firm now only accepts clients who carry out criminal law enforcement or national security functions.
Explaining the decision, James Castro-Edwards, a data privacy lawyer at Arnold & Porter, told the BBC that “Clearview only provided services to non-UK/EU law enforcement or national security bodies and their contractors.”
“The appeal turned exclusively on the fact that Clearview’s customers were overseas national security and law enforcement bodies, and so shouldn’t be relied on as granting a blanket permission for such scraping activities more generally.”
The original article contains 539 words, the summary contains 200 words. Saved 63%. I’m a bot and I’m open source!
Either the article’s image is a not-terribly-good mock-up of the actual facial recognition system doing feature recognition on someone’s face, or said system is identifying the model’s shirt collar as part of her face.
So is this the future?
I don’t see how this could be illegal if Google’s reverse image search gets a pass.