The dark side of facial recognition technology


You can change your name or nickname, delete or edit your social accounts, but your face will always be the same. Facial recognition helps us solve many problems – and at the same time creates many new ones. Today I will discuss the threats that arise from the popularity of these systems around the world.

  1. Loss of the right to privacy – on a global scale

The FBI officially maintains the NGI (Next Generation Identification) system – a database of photos of people who have been accused or convicted in civil or criminal proceedings. What’s wrong with that?

Well, in May, the US Government Accountability Office (GAO) carried out an audit of the Federal Bureau of Investigation. It turned out that the bureau’s database contained 412 million photos, including photos of people who had never been the subject of any investigation. The FBI even has a separate Facial Analysis, Comparison and Evaluation (FACE) unit.

The audit revealed that FBI officials, by agreement with several states, gained access to driver’s license photos and to photos submitted with passport and visa applications, as well as to photos of suspects and convicts. The database also contained photos of foreigners – potentially around 100 million people.

The FBI uses facial recognition extensively in its investigations, and this approach certainly pays off. But the situation is more complicated than it seems: facial recognition technology is still young and imperfect, and the FBI’s system is no exception – its accuracy is only 80%–85%, and it exhibits racial bias. Meanwhile, the FBI has deliberately concealed the scale of its use of facial recognition, contrary to the requirements of the Privacy Impact Assessment.

It is worth adding that Moscow’s city government and Russian law enforcement agencies are developing similar solutions and preparing to deploy FaceN technology (its creators also wrote the code behind FindFace – a service that helps people find others based on their photos). These new systems will be linked to hundreds of thousands of security cameras in Moscow.

According to a report by the Russian website Meduza, no other city in the world has a similar system. The algorithm can compare people on the street against a criminal database, but that’s not all: it can also spot individuals in any part of the city and find their social network accounts, which usually contain a lot of personal information.

It should also be mentioned that earlier this year Russian lawmakers obliged courts to accept photos and videos as legal evidence. Previously, admitting them was left to each court’s discretion.

  2. Abuses by law enforcement agencies

Facial recognition systems are not infallible, and the people who operate them sometimes abuse them. For example, in August the New York Times reported that San Diego police had collected photos of guilty and innocent people alike without their consent.

Aaron Harvey, a 27-year-old African American living in San Diego, accused the police of bias. Harvey lives in an area with one of the highest rates of violent crime, which is probably why the police stopped him more than 50 times on suspicion of gang membership. When he refused to let an officer take his photo, he was told that his consent was not required.

In 2013, the Boston authorities also tested a facial recognition system. It was linked to surveillance cameras that secretly scanned people’s faces at concerts and other outdoor events. At the end of the testing period, the project was abandoned for ethical reasons. But Boston is just a drop in the ocean: the practice is used all over the world, and facial recognition systems are already employed extensively by government agencies.

  3. Corporations that spy on everyone

Organizations have face databases that are much larger than the FBI’s collection. At the top of the list are social networks: Facebook, Instagram (owned by Facebook), Google (with its Google+), nk.pl and many other social networking sites. Most of these companies have their own facial recognition solutions that are constantly being developed and improved.

Currently, Microsoft is working on similar technology for its FamilyNotes application, which will use the camera built into a laptop or tablet to distinguish between users. Microsoft develops one of the most popular operating systems in the world, so this application could noticeably expand the company’s face database.

Facebook’s facial recognition system is one of the most accurate in the world. The company quietly launched the tool in 2012, turning it on by default for most users. Since then, it has had to deal with dozens of lawsuits – and the number continues to grow; Google, in turn, is facing a court case over similar practices. As a result, Facebook has had to turn off face recognition in some regions.

I must also note that Facebook takes a rather one-sided approach here: for example, its knowledge base contains no articles on how to turn off facial recognition – and doing so is not a one-click action.

Even if you don’t use any social network (or don’t post your real photos there), your face may still end up in a social media database. Last year, a Chicago resident sued Shutterfly’s photo-book service because it had added a photo of him to its database without his knowledge. Someone (possibly a friend) had uploaded a photo of him to Shutterfly and labeled it with his name.

  4. Anyone can find you

A facial recognition system that is accessible to everyone can be used both for good and for harm. For example, this year in St. Petersburg two young men set fire to a building’s lobby and ended up vandalizing parts of the building as well. Their antics were recorded by cameras installed in and near the elevator.

When the local police refused to open a criminal case, the building’s tenants took matters into their own hands: they obtained pictures of the culprits’ faces and used FindFace to find them on social networks.

The amateur detectives informed the police of their findings, and the young men were duly charged. In a report by the Russian channel Ren TV, one of the tenants said they had gathered enough data and evidence to send a message to the hooligans’ friends, as well as to inform their universities and employers about the incident.

The people of St. Petersburg showed great patience in asking the police for help first, but not all Internet users would do the same. And where there is a will, there is a way – to harass people. If you’re familiar with FindFace, you probably know its most infamous use case: some people have used it to identify adult-film actresses on the Web. The trolls found the women’s profiles on social networks and sent shaming messages, along with the relevant photos, to their friends and relatives.

FindFace founder Maxim Perlin believes that people today literally have to pay to protect their privacy. In a TV interview, he said that people who want to remove their data from FindFace need to buy a premium account. A month of privacy costs around $8.

  5. The fine line between safety and disaster

Many experts are convinced that biometrics will replace passwords and make the world even safer. In the future, people will simply let systems scan their irises, fingerprints, or faces, eliminating the need to type complicated character combinations.

Microsoft is in the process of creating technology that will allow users to authenticate themselves with selfie photos. NEC is researching whether facial recognition can be used to secure electronic payments. MasterCard is working on a selfie identification system that lets users send money without passwords.

We’ve already written about the drawbacks of fingerprint authentication, so now I’ll focus on the weaknesses of facial recognition technology. They are closely tied to the latest achievements in 3D printing: today you can print a remarkably realistic copy of someone’s face. Developers of new identification systems will have to take this into account if they want to create truly secure products.

For example, MasterCard and Google ask users to blink – a simple action that prevents fraudsters from fooling the system with a 3D-printed face or a plain photo. Unfortunately, Google’s implementation has already failed: some people have managed to bypass the check using a simple animation. The MasterCard system is still under development, so it is not yet known whether it can be cheated in the same way.
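Neither company has published the details of its liveness check, but the general idea is easy to illustrate. Below is a minimal sketch of a blink check based on the eye aspect ratio (EAR): the vertical distances between eyelid landmarks collapse when the eye closes, and a close-then-reopen sequence counts as a blink. The thresholds, the webcam source, and the dlib 68-point landmark model file are assumptions chosen for illustration, not anything MasterCard or Google actually ships – and, as noted above, a simple check like this can still be fooled by a looped animation.

```python
# Sketch of a blink-based liveness check using the eye aspect ratio (EAR).
# Assumes OpenCV, dlib, scipy, and dlib's 68-point landmark model file are available locally.
import cv2
import dlib
from scipy.spatial import distance

def eye_aspect_ratio(eye):
    # eye: six (x, y) landmark points around one eye
    a = distance.euclidean(eye[1], eye[5])   # vertical distance 1
    b = distance.euclidean(eye[2], eye[4])   # vertical distance 2
    c = distance.euclidean(eye[0], eye[3])   # horizontal distance
    return (a + b) / (2.0 * c)

EAR_THRESHOLD = 0.2        # below this value the eye is treated as closed (illustrative)
CONSECUTIVE_FRAMES = 2     # frames the eye must stay closed to count as a blink

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

closed_frames = 0
blinks = 0
cap = cv2.VideoCapture(0)  # default webcam

while blinks < 1:                           # stop after the first detected blink
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for face in detector(gray):
        shape = predictor(gray, face)
        pts = [(shape.part(i).x, shape.part(i).y) for i in range(68)]
        left_eye, right_eye = pts[42:48], pts[36:42]
        ear = (eye_aspect_ratio(left_eye) + eye_aspect_ratio(right_eye)) / 2.0
        if ear < EAR_THRESHOLD:
            closed_frames += 1
        else:
            if closed_frames >= CONSECUTIVE_FRAMES:
                blinks += 1                 # eyes were closed, then reopened: a blink
            closed_frames = 0

cap.release()
print("Liveness check passed" if blinks else "No blink detected")
```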

“A colleague from work printed my face in 3D” – photo from Imgur

  6. Don’t share your face with anyone

You may have heard of Anaface, a site that analyzes a photo of a face and rates its attractiveness. The website’s main criterion is symmetry – a questionable standard, don’t you think? For example, Angelina Jolie scored only 8.4 out of 10. But accuracy is not the site’s only problem.

First, the owners of Anaface admit that they launched this project to encourage people to think about plastic surgery. Well, at least they admit it.

Second, the site’s terms and conditions are written in complicated, unclear language. They are displayed in a very small window, so users have to scroll for a long time to get through more than 7,000 words of fine print. Many people probably never read the clause stating that by uploading photos to the website, they grant its owners a non-exclusive, transferable, sublicensable, royalty-free worldwide license to use them. Simply put, the service can sell the uploaded photos without any obligation to pay their rightful owners.

In turn, users undertake to upload only their own photos; they may not post, upload, display, or share content that includes videos, sound, photos, or images of another person without that person’s consent (or, in the case of a minor, the consent of their legal guardian). The terms of use also contain vague statements about privacy and about removing photos after registering a user account – yet no one can actually do that on Anaface, because the website does not provide for it.

Everyone collects photos: governments, corporations, and even ordinary people. Today, anyone can use and abuse facial recognition systems – and all we can do is try to hide from them.