Facial recognition in the era of the 2024 Olympics in Paris
Toward a surveillance society?

In 2021, it was announced that the government would not use facial recognition at the Olympics. However, the French Senate's report, published in May 2022, revived the debate about facial recognition in public spaces.

The recent incidents at the Stade de France have also been used by some political figures to argue for the urgent deployment of facial recognition. The mayor of Nice, for instance, advocates facial recognition and goes as far as calling the CNIL “that dusty institution that prohibits the use of facial recognition”. These were routine sporting events, yet police mismanagement seems to be fueling a desire to increase security through facial recognition. But because facial recognition raises a number of important issues, the question is: is it really an effective solution?

What is facial recognition?

Facial recognition is a biometric technique that identifies or authenticates a person based on their facial features. Put simply, identification answers the question “who are you?” and authentication answers “are you Mr. or Mrs. X?”.

Biometrics is a broad term: it covers techniques that rely on people's permanent, unique characteristics and transform them into biometric data. According to the GDPR (General Data Protection Regulation), “biometric data means personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic data”.
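The identification/authentication distinction can be sketched in code. The following is a minimal, purely illustrative Python sketch, assuming face images have already been converted into fixed-length embedding vectors; the names, the enrolled database, and the similarity threshold are all hypothetical, not part of any real system described in this article.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

THRESHOLD = 0.9  # hypothetical decision threshold

def authenticate(probe, claimed_template):
    """Authentication (1:1): 'are you Mr. or Mrs. X?'
    Compare the probe against the single enrolled template of the claimed identity."""
    return cosine_similarity(probe, claimed_template) >= THRESHOLD

def identify(probe, enrolled_db):
    """Identification (1:N): 'who are you?'
    Compare the probe against every enrolled template; return the best match, if any."""
    best_name, best_score = None, THRESHOLD
    for name, template in enrolled_db.items():
        score = cosine_similarity(probe, template)
        if score >= best_score:
            best_name, best_score = name, score
    return best_name

# Hypothetical enrolled database of face embeddings
db = {"alice": [0.9, 0.1, 0.4], "bob": [0.1, 0.95, 0.2]}
probe = [0.88, 0.12, 0.41]  # embedding extracted from a new face image

print(authenticate(probe, db["alice"]))  # 1:1 check against Alice's template -> True
print(identify(probe, db))               # 1:N search across all templates -> alice
```

The legal weight of the two operations differs with their scale: authentication compares against one template the subject has claimed, while identification searches an entire database of enrolled people, which is precisely the scenario that raises mass-surveillance concerns below.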

Can the law put an end to facial recognition controversies?

The widespread use of facial recognition in public spaces fuels the controversy. We walk down the street and, without realizing it, are filmed by cameras. However, not every camera uses facial recognition. For instance, videoprotection films “the public highway and places open to the public”, while video surveillance films “places not open to the public”. [1] The European Data Protection Board (EDPB) gives an example to better understand this point: “A shop owner would like to customize its advertisement based on gender and age characteristics of the customer captured by a video surveillance system. This system does not generate biometric templates in order to uniquely identify persons, but instead just detects those physical characteristics in order to classify the individual then the processing would not fall under Article 9 (as long as no other types of special categories of data are being processed).” [2]

In addition to this issue, there is a lack of knowledge and awareness among citizens about facial recognition. It is important that citizens be informed, given the risks that biometric data may represent for them. Indeed, biometric data are immutable and intrinsically linked to a person: they cannot be modified or separated from the person from whom they were collected. In this context, the GDPR qualifies “biometric data processed for the sole purpose of identifying a human being” as sensitive data, thus imposing specific processing conditions on controllers. The risks can be very high for the data subjects, for example in case of data theft or a security breach.

The law therefore provides a framework for the use of biometric data. Article 9 of the GDPR establishes the principle that processing biometric data [3] for the sole purpose of identifying a natural person is prohibited. Still, the article provides exceptions to this rule.

The ban on processing biometric data can, however, be circumvented fairly easily. This is the case, for example, in the workplace, where facial recognition or other biometric techniques could be used to control access to certain premises. In this context, certain legal bases cannot support biometric data processing. The employment contract between employer and employee cannot legitimize the use of biometric data. Similarly, the employee's consent cannot be relied on either, as the employer's hierarchical authority would make it unbalanced and therefore legally invalid. However, the employer may rely on legitimate interest to carry out the processing of biometric data. [4]

However, the use of facial recognition seems to be becoming more widespread. So what are the issues?

The first issue is the balance between freedom and security

In the context of the Olympic Games, the Senate mentioned this balance. Video protection can be useful for detecting abandoned packages or suspicious movements in a crowd, but it can infringe on individual liberties. Therefore, facial recognition “requires an explicit legal basis” and must be necessary for the purpose it serves. In addition, the need to use such biometric systems must be balanced against the rights of individuals. The persons concerned may see their rights and freedoms endangered, such as their freedom of movement.

As we can see, the use of facial recognition is closely linked to democratic issues. The perennial issue can be summarized by the quote attributed to Benjamin Franklin, “A people willing to sacrifice a little liberty for a little security deserve neither, and will eventually lose both”. Wouldn't this be the case with the generalization of certain uses of facial recognition?

The second issue is the question of purpose creep and surveillance

The link between purpose creep and surveillance is easy to guess, as purpose creep is understood as a change of purpose, i.e. a departure from what the data was collected for. Indeed, by diverting biometric data, one could do practically whatever one wants, without informing the person concerned. Of course, this is not legally allowed, but in practice it is frequently encountered: it leads to societies where people are monitored and their rights are violated, as is the case in China, Russia, or Italy. [5] In Italy, for example, facial recognition is used in the context of the disembarkation of migrants, even though the data protection authority issued an unfavorable opinion, citing the lack of legal basis and the precedent this would create for mass surveillance. [6]

All this biometric surveillance does not go uncontested. In June 2021, Amnesty International and 170 organizations called for a ban on this practice and stated that facial recognition leads to discrimination, threatens fundamental rights and is, therefore, dangerous. [7]

In its report, the French Senate proposes 30 measures against the surveillance society, but are these measures really effective in practice? Experimenting with facial recognition during the Olympic Games could aggravate the current problems and ultimately give rise to a surveillance society. The technology may have advantages, provided its intended use is appropriate. However, we really need to consider all the risks of implementing it in our societies. Because, once implemented, it will be difficult, if not impossible, to turn back the clock.

  1. https://www.cnil.fr/cnil-direct/question/videoprotection-videosurveillance-cest-quoi-la-difference?visiteur=part ↩︎

  2. https://edpb.europa.eu/sites/default/files/files/file1/edpb_guidelines_201903_video_devices_en_0.pdf ↩︎

  3. Data processing is a broad term, encompassing, among other things, the collection, recording or storage of data ↩︎

  4. https://www.cnil.fr/fr/question-reponses-sur-le-reglement-type-biometrie ↩︎

  5. https://www.amnesty.org/en/latest/news/2021/06/amnesty-international-and-more-than-170-organisations-call-for-a-ban-on-biometric-surveillance/ ↩︎

  6. https://www.garanteprivacy.it/web/guest/home/docweb/-/docweb-display/docweb/9575842
    https://www.europarl.europa.eu/doceo/document/E-9-2021-002182_EN.html#def2 ↩︎

  7. https://www.amnesty.org/en/latest/press-release/2021/06/amnesty-international-and-more-than-170-organisations-call-for-a-ban-on-biometric-surveillance/ ↩︎

Eloïse Quinzin

Thu Jul 21 2022
