It’s about time facial recognition tech firms took a look in the mirror | John Naughton

Clearview AI was fined for using internet-sourced images of UK residents in its database – but not before police forces used its service

Last week, the UK Information Commissioner’s Office (ICO) slapped a £7.5m fine on a smallish tech company called Clearview AI for “using images of people in the UK, and elsewhere, that were collected from the web and social media to create a global online database that could be used for facial recognition”. The ICO also issued an enforcement notice, ordering the company to stop obtaining and using the personal data of UK residents that is publicly available on the internet and to delete the data of UK residents from its systems.

Since Clearview AI is not exactly a household name, some background might be helpful. It’s a US outfit that has “scraped” (ie digitally collected) more than 20bn images of people’s faces from publicly available information on the internet and social media platforms all over the world to create an online database. The company uses this database to provide a service that allows customers to upload an image of a person to its app, which is then checked for a match against all the images in the database. The app produces a list of images that have similar characteristics to the photo provided by the customer, together with links to the websites whence those images came. Clearview describes its business as “building a secure world, one face at a time”.