Data privacy is more than facial recognition – but it’s harder to see in the mirror
Facial recognition became much more real when news broke that Clearview AI had accumulated around 3 billion photos from across the web and was using them to fuel its facial recognition service. The privacy angle got another boost when the company's unprotected cloud storage was discovered. Either way, this is just the visible face of the massive surveillance economy around you.
Clearview’s service is now reportedly used by around 600 law enforcement agencies and other companies across the United States, and it has been used to identify everyone from shoplifters to murder victims. By tapping into social media photos, Clearview also gets a better angle on faces than a high-mounted surveillance camera can capture. An annual license to the program is $10,000, and the company says its accuracy rate is around 75%.
The future’s so bright, I’ve gotta wear AR shades
While facial recognition is not brand new in the Western world, the scale of this operation is. Clearview boasts 3 billion images in its collection – far above the 411 million in the FBI’s files. The report raises at least as many questions and ethical issues as it answers. First, Clearview’s billions of photos have been scraped from a variety of websites and social media platforms such as Facebook. This violates Facebook’s terms of service, and one can assume that many of the 3 billion photos are of people who have no idea how their portraits are being used. Beyond that, photos uploaded into the system by its users stay there – enlarging the pool of photos and shrinking individual privacy at the same time.
The intrusiveness of Clearview could also increase in the future, with a planned program variant expected to connect with augmented reality glasses. This could give the wearer not just the names of people seen on the street but also real-time information on where they live and who their acquaintances are.
Maybe yes, maybe no for facial recognition
Clearview’s story came as the European Union is considering putting a 3-5 year moratorium on the use of facial recognition software in public spaces. The EU’s potential move is supported by Google but, somewhat surprisingly, not by Microsoft. Don’t forget, it is also possible to run a DIY facial recognition project with a few hundred dollars, CCTV feeds, and an Amazon account.
Clearview is a sign that – like it or not – facial recognition is no longer a theoretical concept but is being applied, even though it has real bias and accuracy issues with some darker-skinned ethnic groups. There are many open questions about when this technology should be applied: human rights, ethnic profiling, and oversight of what it is actually used for. Then there is the question of individual privacy – who gave Clearview the right to use those billions of pictures to train its AI, and what about the privacy of those billions of people?
Your face is just the start
Facial recognition gets attention because it’s based on human faces – just like what we see in the mirror and during our daily lives. But this is simply the most recognizable aspect of our mass-surveillance world. Focusing solely on facial recognition misses the bigger picture of how much data is being collected about you, how it is being combined and resold, and how it is then being used.
Three privacy points to consider
Focusing on facial recognition is a mistake, points out Bruce Schneier, technologist and cryptography specialist. It is just one of many technologies that identify and track our daily activities. A better way to look at online privacy is to divide it into three major areas: identification, correlation, and discrimination.
- Identification – Facial recognition is only one of many technologies used to identify you. Other measurable physical traits include your gait, voice, heartbeat – even your irises. The device in front of you can also broadcast your location in the metadata of posted photos, your IP address, your browser’s fingerprint, and the device’s MAC address. That’s even without diving into unencrypted message contents.
- Correlation – It’s not just one face, one app, or one location – it’s a strand of data. These individual strands are being woven together to build a better picture of who we are, how we behave, and what we like. It’s not just the big tech players such as Amazon, Facebook, and Google doing this with their trackers. There is an industry of data brokers that buy, sell, and combine data about you – and you have no knowledge of or control over it. It’s also becoming clear that data anonymization is easily broken – making the data collected about you and your activities very personal. And moving to Vermont, which makes data brokers register with the state, won’t expose them all either.
- Discrimination – Distinguishing between people is the goal of this data collection – and that’s not always bad. But discrimination based on age, gender, or race is usually wrong, and often not even legal. How data and algorithms are being used to sort and discriminate between people remains a big unanswered question.
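To see why the correlation point matters and why anonymization breaks so easily, here is a minimal sketch of a classic linkage attack. All the data below is invented toy data: an "anonymized" health record set that keeps quasi-identifiers (ZIP code, birth date, sex), and a public roll that carries the same fields plus names. Joining the two on the quasi-identifiers re-identifies the "anonymous" records.

```python
# Sketch of a linkage (re-identification) attack on toy, invented data.
# The health dataset has names stripped but keeps quasi-identifiers;
# a public dataset shares those quasi-identifiers and includes names.

anonymized_health = [
    {"zip": "05401", "dob": "1960-07-31", "sex": "F", "diagnosis": "hypertension"},
    {"zip": "05401", "dob": "1985-02-14", "sex": "M", "diagnosis": "asthma"},
]

public_roll = [
    {"name": "A. Example", "zip": "05401", "dob": "1960-07-31", "sex": "F"},
    {"name": "B. Example", "zip": "05401", "dob": "1985-02-14", "sex": "M"},
]

def reidentify(health_rows, public_rows):
    """Join the two datasets on the shared quasi-identifiers (zip, dob, sex)."""
    key = lambda r: (r["zip"], r["dob"], r["sex"])
    names_by_key = {key(p): p["name"] for p in public_rows}
    return [
        {"name": names_by_key[key(h)], "diagnosis": h["diagnosis"]}
        for h in health_rows
        if key(h) in names_by_key
    ]

for match in reidentify(anonymized_health, public_roll):
    print(match)  # each "anonymous" diagnosis now has a name attached
```

The attack needs no special access – only two datasets that overlap on a few mundane fields. That is exactly the kind of combination a data broker performs at scale.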
Time for the big talk about the birds, bees, forests, and surveillance
Facial recognition is unsettling – and it should be. However, this is only one little part of the larger surveillance environment surrounding us. Yes, it needs to be regulated – and so do many other aspects of data collection. Are we ready to use a VPN to shut out trackers from eavesdropping on us? And while we are at it, it’s time to look at how data brokers are able to collect and trade our data. Finally, what about discrimination? What groups are we willing to let companies or governments group us into — or out of?
Yes, it’s time to take some steps in a more private direction.