The future of privacy protection in the European Court of Human Rights case-law in the light of live facial recognition technology


Academic year: 2021

The future of privacy protection in the European Court of Human

Rights case-law in the light of live facial recognition technology

(Master Thesis)

Anastasiya Zhyrmont

konovalova.anastasiya.12@gmail.com
Student Number: 12699688

LL.M. in International and European Law: Public International Law
University of Amsterdam, Amsterdam Law School


TABLE OF CONTENTS

TABLE OF ABBREVIATIONS

CHAPTER 1. INTRODUCTION

1.1. Background and research question

1.2. Methodology and thesis structure

CHAPTER 2. WHAT FEATURES OF LFRT DISTINGUISH THIS STATE-OF-THE-ART TECHNOLOGY FROM OTHER MASS SURVEILLANCE PRACTICES, RECOGNISED AND JUSTIFIED BY EUROPEAN CASE-LAW?

2.1. Distinctive features of LFRT as an emergent mass surveillance system

2.2. Mass surveillance in the ECtHR case-law

CHAPTER 3. REASONABLE EXPECTATION OF PRIVACY IN THE DIGITAL AGE: DOES THE USE OF LFRT FORM AN INTERFERENCE WITH ARTICLE 8 (1) ECHR?

3.1. The reasonable expectation of privacy doctrine: the common law approach

3.2. Scope of Article 8 (1) ECHR and the European Court

3.3. Does LFR technology infringe the right to privacy?

3.3.1. From CCTV footage to processing of biometric information

3.3.2. Storing data as a precondition to an interference with Article 8 ECHR

3.3.3. Identification with precision as a precondition to an interference with Article 8 ECHR

CHAPTER 4. ASSESSMENT OF LFRT DEPLOYMENT BY THE ECTHR: POSSIBLE OUTCOMES

4.1. Is LFRT operated in a legal vacuum? Establishing criteria of “the accordance with law” test in the ECtHR’s mass surveillance cases

4.2. Legitimate aim

4.3. Is LFRT able to protect the legitimate aim pursued?

4.4. The privacy/security conflict in the practice of the ECtHR: the width of the State’s margin of appreciation

4.5. ‘Strict necessity’ test in the bulk interception cases

4.6. “Less restrictive means” doctrine in the realm of LFRT: proportionality test

CHAPTER 5. CONCLUSION


ABSTRACT

Live facial recognition technology (LFRT) is software that establishes a person’s identity by processing the individual’s facial features. Deployed at large scale, it may gather and retain bulk personal data, tracking individuals’ movements and/or notifying human operators of a person’s presence in certain locations. Despite its obvious value for crime investigation and prevention, the use of LFRT raises several human rights concerns, including an alleged interference with the right to privacy. In 2019 the deployment of the technology was challenged in national courts for the first time. Popova v Moscow City Police and R (Bridges) v CCSWP and SSHD, the cases that justified the use of the software in the Russian Federation and the UK, are clear examples of strategic litigation with a good chance of ending up before the European Court of Human Rights once the applicants have exhausted local remedies. Since the European Court has never dealt with LFRT claims, the purpose of the present thesis is to make the best possible predictions as to its possible assessment of the software’s use for compliance with Article 8 ECHR. The applicability of the approaches developed in the Court’s earlier mass surveillance proceedings, and the need for new standards, will be examined through analysis of the ECtHR’s case-law and a range of scholarly publications.

Key words: live facial recognition technology, right to privacy, reasonable expectation of privacy, the European Court of Human Rights, watchlist, mass surveillance, biometric processing, CCTV.

TABLE OF ABBREVIATIONS

• AI: Artificial Intelligence
• CCTV: Closed-Circuit Television
• COVID-19: 2019 Novel Coronavirus
• DNA: Deoxyribonucleic Acid
• ECHR: European Convention on Human Rights
• ECtHR: European Court of Human Rights
• EU: European Union
• FRA: European Union Agency for Fundamental Rights
• GDPR: General Data Protection Regulation
• GPS: Global Positioning System
• LFRT: Live Facial Recognition Technology
• NGO: Non-Governmental Organization


Chapter 1. Introduction

1.1. Background and research question

The digital age is the era of mass surveillance. It seems inevitable that technological developments bring ever more threats to individuals’ privacy and anonymity. First, the spread of street cameras turned people into objects of constant video observation. Then new technologies such as GPS and smartphones made it possible to establish one’s physical location by tracking geographical movements through phone towers or satellites.1 After Snowden’s revelations, expectations of anonymity on the internet were dispelled as well. The use of such mass surveillance techniques as the interception of e-mails, tracking of internet browser history and monitoring of web transactions through cookies has become part of a privacy/security trade-off. As McClurg aptly observed, ‘mass surveillance advances at such a pace that no one knows how to bridge a gap between what is possible and what is permissible’.2

With the appearance of such innovations as artificial intelligence (AI), even more privacy-intrusive tools have become available to Governments. Live facial recognition technology (LFRT) is the latest example of state-of-the-art software that establishes a person’s identity by comparing digital facial images. Its sophisticated algorithms have undermined the ability of individuals to remain anonymous in public places and once again confirmed Michael Froomkin’s thesis that mass surveillance is privacy pollution.3

AI and LFRT are complementary technologies which enhance each other and bring new possibilities and risks to society.4 First used to double-check the personal data of those crossing international borders, facial recognition is now deployed in a variety of ways. Prevention of mass disorder, crime investigation, the fight against coronavirus and even commercial marketing are only a few examples of its possible applications.5 However, LFRT raises several human rights concerns. For instance, the European Union Agency for Fundamental Rights (FRA) has underlined in a recent report that the technology may interfere not only with the right to privacy but

1 Lidberg, Muller, ‘In the Name of Security – Secrecy, Surveillance and Journalism’ (Anthem Press, 2018) p. 16.
2 McClurg, ‘In The Face Of Danger: Facial Recognition and The Limits Of Privacy Law’ (2007) 120-7 HarvLRev 1870, 1872.

3 Froomkin, ‘Regulating Mass Surveillance as Privacy Pollution: Learning from Environmental Impact Statements’ (2015) 1-5 IllinoisLRev 1713, 1743-1744.

4 Wiewiórowski, ‘AI and Facial Recognition: Challenges and Opportunities’ (European Data Protection Supervisor’s blog, February 2020) <https://edps.europa.eu/press-publications/press-news/blog_en> accessed 16 July 2020.
5 Bicki and others, ‘Facial recognition technology: Supporting a sustainable lockdown exit strategy?’ (Lexology, May 2020) <https://www.lexology.com/library/detail.aspx?g=9bb4f37d-df25-4580-8eed-026f4725bae7> accessed 17 July 2020.


also with the freedom of assembly, the principle of non-discrimination and the rights of the child.6 In the light of such concerns, the European Data Protection Supervisor, Wojciech Wiewiórowski, has even considered the possibility of ‘a ban or temporary freeze on some uses of the technology where its impact on society and the rights and freedoms of individuals is uncertain’.7

In 2019 LFRT was challenged in national courts for the first time. Popova v Moscow City Police8 and R (Bridges) v CCSWP and SSHD9 justified the application of the software in the Russian Federation and the UK. At the same time, Belgium banned facial recognition cameras for law enforcement purposes.10 Such contradictions in States’ positions only heighten anticipation of a future LFRT assessment by the European Court of Human Rights (ECtHR or European Court).

As one of the most authoritative regional human rights tribunals, the ECtHR has already dealt with mass surveillance practices in the Rättvisa v. Sweden11 and Big Brother Watch v. UK12 proceedings. However, the European Court has never had occasion to assess AI or LFRT for compliance with the European Convention for the Protection of Human Rights and Fundamental Freedoms (ECHR).13 Human rights activists hope that, in the face of new challenges, the ECtHR may decide to reconsider its basic doctrines and approaches and formulate new requirements regarding bulk interception of data and strategic monitoring technologies.14

Research question:

How would the European Court deal with potential LFRT complaints alleging violations of Article 8 ECHR, based on the existing mass surveillance case-law?

Sub-questions:

6 European Union Agency for Fundamental Rights, ‘Facial recognition technology: fundamental rights considerations in the context of law enforcement’ (2019) <https://fra.europa.eu/sites/default/files/fra_uploads/fra-2019-facial-recognition-technology-focus-paper.pdf > accessed 1 May 2020 (‘FRA Report’).

7 Wiewiórowski (n 4).

8 Popova v Moscow City Police [2019] Case No 02а-0577/2019 Savelovskiy District Court of the City of Moscow.
9 R (Bridges) v CCSWP and SSHD [2019] EWHC 2341 (Admin).

10 Gabriela Galindo, ‘No legal basis’ for facial recognition cameras at Brussels Airport’ (The Brussels Times, 10 June 2019) <https://www.brusselstimes.com/all-news/business/119384/air-france-could-cut-thousands-of-jobs-by-2022/> accessed 12 June 2020.

11 Centrum för Rättvisa v Sweden (App no 35252/08) (2018) ECHR 520.

12 Big Brother Watch and others v. the United Kingdom (App nos 58170/13, 62322/14 and 24960/15) (2018) ECHR 72.

13 Convention for the Protection of Human Rights and Fundamental Freedoms (European Convention on Human Rights, as amended) (‘ECHR’).

14 European Digital Rights, ‘Ban Biometric Mass Surveillance. A set of fundamental rights demands for the European Commission and EU Member States’ (May 2020) <https://edri.org/wp-content/uploads/2020/05/Paper-Ban-Biometric-Mass-Surveillance.pdf > accessed 1 July 2020.


1. What features of LFRT distinguish this state-of-the-art technology from other mass surveillance practices, recognised and justified by the ECtHR case-law?

2. Should the ECtHR treat the “reasonable expectation of privacy” doctrine as a threshold for invoking Article 8 ECHR when establishing an interference with the right to privacy caused by LFRT deployment?

3. Is the use of LFRT incompatible with the right to privacy? Is the ECtHR case-law on the justifiability of mass surveillance measures applicable to the deployment of LFRT?

1.2. Methodology and thesis structure

The prevailing type of research used in this master thesis is classic doctrinal legal research, focusing on description of the existing ECtHR case-law on mass surveillance and analysis of its applicability to resolving potential cases concerning LFRT as a novel strategic monitoring practice.

The main goal of Chapter 2 is to shed light on the defining features of LFRT which give rise to new privacy concerns. The chapter employs a historical approach to illustrate how legal views on bulk data interception have changed over time in the ECtHR’s rulings. It also includes limited descriptive empirical research to explain which technological characteristics distinguish LFRT from video surveillance and other privacy-intrusive practices.

The relevance of the ECtHR’s findings on CCTV monitoring and biometric processing to potential LFRT claims, when establishing an interference with Article 8(1) ECHR, is assessed in Chapter 3. Based mostly on a theoretical method, this section involves comparative legal research. The chapter compares the positions of common law courts and the European Court on the “reasonable expectation of privacy” doctrine and identifies similarities and differences between the UK and Russian national courts’ findings on LFRT. The probability that the ECtHR would resort to the same line of reasoning as in R (Bridges) v CCSWP and Popova remains in focus throughout the research.

As the ECtHR has never dealt with LFRT complaints, Chapter 4 concentrates on making the best possible predictions as to how the European Court would apply its traditional three-prong test to the deployment of such state-of-the-art technology. Reasoning by analogy is used extensively in order to establish whether LFRT fits within the general jurisprudence of the ECtHR. The goal of the chapter is to assess whether the ECtHR’s basic approaches, including the qualitative requirements of “the accordance with law” test, the “pressing social need” requirement and/or “strict necessity” test, as well as the “less restrictive means” doctrine, would remain applicable in the face of new privacy concerns such as LFRT.


Chapter 2. What features of LFRT distinguish this state-of-the-art technology from other mass surveillance practices, recognised and justified by European case-law?

LFRT is a state-of-the-art technology that allows live biometric processing of video imagery in order to identify particular individuals.15 Its unique characteristics make it very difficult to compare LFRT with other mass surveillance systems exploited by national authorities for security purposes. The obvious parallel with CCTV, which entails the use of security cameras in places of public gathering, misses LFRT’s main distinctive feature: the ability to establish one’s identity. While street cameras merely capture an image of an individual in a public place, LFRT may provide an operator with a wealth of personal information, raising additional privacy concerns and creating the possibility of abuse.

Some scholars therefore believe that a comparison of LFRT with fingerprint recognition is more appropriate,16 since both processes involve sophisticated analysis of biometric data by a few specially trained experts or, in the case of LFRT, programs. However, while an ordinary person indeed cannot identify an individual by a fingerprint, face recognition is a task performed by all people on a daily basis.17 This is why supporters of this AI technology ask: if police officers already have a justified right to look through hours of CCTV footage in order to locate wanted persons whose facial images they keep in memory, why should we prohibit something that allows them to do it immediately and effortlessly? Moreover, LFRT can do more than any single police officer, as no human can remember 750 faces at once.18

In light of these facts, why are human rights activists concerned about the capabilities of the technology? And would the ECtHR’s reasoning in previous mass surveillance cases still be relevant for resolving LFRT complaints, given the unique technological features of the software?

2.1. Distinctive features of LFRT as an emergent mass surveillance system

LFRT involves ‘the automated extraction, digitisation and comparison of the spatial and geometric distribution of facial features’.19 At the perception phase, a digital image of an individual is extracted from live CCTV footage and converted into a digital template by complex

15 Fussey, Murray, ‘Independent Report on the London’s Metropolitan Police’s Services Trial of Live Facial Recognition Technology’ (2019) <https://48ba3m4eh2bf2sksp43rq8kk-wpengine.netdna-ssl.com/wp-content/uploads/2019/07/London-Met-Police-Trial-of-Facial-Recognition-Tech-Report.pdf> accessed 1 May 2020, p. 19.

16 Mann, Smith, ‘Automated Facial Recognition Technology: Recent Developments and Approaches to Oversight’ (2017) 40 UNSWLJ 121, 122.

17 Adler, Schuckers, ‘Comparing Human and Automatic Face Recognition Performance’ (2007) 37 IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics 1248, 1248.

18 Golunova, ‘The effectiveness of the European Court of Human Rights for ensuring protection of the right to privacy in the paradigm of Live Facial Recognition’ (2019) (Master thesis, Tilburg University 2019).


algorithms.20 Computerized face perception is a technology that is incredibly difficult to engineer.21 Not only must faces be detected in video images and extracted from the background, they must also be “normalized” by the system so that they correspond to a standard format.22 Such digitalization usually involves the transformation of a face image into a hash code or mathematical vector that is then compared with the code or vector of the original photograph.23 The resulting data is classified and subsequently stored in a database. From a privacy perspective, however, it is better not to save any information at all, or to save hashes instead of images, as hashes cannot be traced back to an individual by human operators.24 Many manufacturers do indeed implement this privacy-by-design measure. For instance, the system deployed by the South Wales Police does not retain images of persons whose faces are scanned.25 If no match is made, the data is automatically deleted.26
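The normalisation, digitisation and hashing steps described above can be sketched in a few lines of code. This is a deliberately simplified illustration: real LFRT systems derive templates with trained neural networks, and the fixed template length, bucket averaging and cosine score used here are assumptions made for the sake of the example, not any vendor’s actual algorithm.

```python
import hashlib

def to_template(face_pixels):
    """Toy stand-in for 'normalisation + digitisation': reduce a face image
    (a flat list of pixel intensities) to a fixed-length numeric template."""
    bucket = max(1, len(face_pixels) // 8)
    return [sum(face_pixels[i:i + bucket]) / bucket
            for i in range(0, len(face_pixels), bucket)][:8]

def similarity(t1, t2):
    """Cosine similarity between two templates: the 'similarity score'."""
    dot = sum(a * b for a, b in zip(t1, t2))
    norm = lambda t: sum(a * a for a in t) ** 0.5 or 1.0
    return dot / (norm(t1) * norm(t2))

def privacy_preserving_record(template):
    """Privacy-by-design option mentioned above: store only a hash of the
    template, which a human operator cannot trace back to a face."""
    return hashlib.sha256(repr(template).encode()).hexdigest()
```

Note that a cryptographic hash supports only exact comparison; a live system that must tolerate small differences between probe and reference images has to work with the numeric similarity score, which is why the threshold setting discussed below matters.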

This is why the final “comparison stage” is the core of the process, which usually results not in one but in a range of possible matches, depending on a so-called “similarity score”.27 This is a numerical value indicating the likelihood that the faces match.28 Contemporary systems allow end users to change the similarity-score settings at their own discretion, depending on the intended use of LFRT.29 Setting the threshold too low will produce many “false positives” and increase the risk that an innocent person, wrongfully identified, will be scrutinized by the police. Setting the threshold too high, however, will result in too many “false negatives”, where wanted persons are not recognized due to negligible differences between the original and probe face images.30 As a consequence, there is always a trade-off between “false positives” and “false negatives”.31

However, the accuracy of LFRT is usually criticized not for this dilemma. A bias-free algorithm would be expected to have similar false match rates across different races, regardless of which ethnic group is under consideration.32 Unfortunately, this has proved not to be the case. Independent research has found that LFRT works significantly better for white men

20 Ibid.

21 Gates, ‘Our Biometric Future: Facial Recognition Technology and the Culture of Surveillance’ (NYU Press, 2011) p. 5.

22 Ibid p.18.
23 Popova (n 8) p.7.

24 Cate, ‘The technical, legal and ethical aspects concerning the implementation of facial recognition technology in public spaces and the potential trade-off between public security and the privacy of individual citizens’ (2019) (Master Thesis, Vrije Universiteit Amsterdam 2019) 2.1.2.

25 Bridges (n 9) para 37.
26 Ibid.

27 Ibid para 24(6).
28 Ibid.

29 Ibid.

30 Bowyer, ‘Technology: Security versus Privacy’ (2004) 23(1) IEEE Technology and Society Magazine 9-19.
31 FRA Report (n 6) p. 9.


than for black women.33 Although the issue of potential discrimination and bias has not yet been solved, this argument against LFRT is easily countered by pointing out that the other-race effect is common not only to AI but to human face recognition as well.34
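The similarity-score trade-off described above can be made concrete with a short sketch: lowering the match threshold produces more false positives, while raising it produces more false negatives. The scores below are invented for illustration only and do not come from any real system or the studies cited in this chapter.

```python
def error_counts(genuine_scores, impostor_scores, threshold):
    """Count errors at a given match threshold.
    genuine_scores: similarity scores for pairs that ARE the same person.
    impostor_scores: scores for pairs that are NOT the same person."""
    false_negatives = sum(s < threshold for s in genuine_scores)    # wanted persons missed
    false_positives = sum(s >= threshold for s in impostor_scores)  # innocents flagged
    return false_positives, false_negatives

# Hypothetical scores from same-person and different-person comparisons.
genuine = [0.91, 0.85, 0.78, 0.66]
impostor = [0.70, 0.55, 0.40, 0.30]

lenient = error_counts(genuine, impostor, 0.50)  # low threshold: flags innocents
strict = error_counts(genuine, impostor, 0.80)   # high threshold: misses wanted persons
```

Moving the threshold only shifts errors between the two categories; no setting eliminates both at once, which is precisely the dilemma the text describes.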

Furthermore, when the LFRT software identifies a possible match, it usually submits the results for human review, meaning that an operator (e.g. a police officer) makes the final decision. Some authorities believe that ‘the fact that human eye is used to ensure that an intervention is justified is an important safeguard’35 against possible discrimination.36 Human rights activists, however, are more skeptical, claiming that human operators overrule outcomes from algorithms mainly when the result is in line with their own stereotypes.37 Therefore, it is the primary responsibility of the police forces, not the software developers, to ensure that neither LFRT nor its means of deployment violates the prohibition of discrimination.38

Notwithstanding the accuracy issues discussed above, LFRT has proved effective for two kinds of task, namely verification and identification. The former, known as “one-to-one” matching, is frequently used at international borders, as it involves a comparison between only two biometric templates assumed to belong to the same individual (e.g. one’s passport image and the image taken by security cameras on the spot).39 In other words, the system verifies the similarity between these photographs without checking other records in the database.40 Privacy concerns thus arise only with regard to a specific person. Identification, by contrast, means that the extracted data is checked against a variety of other images (“one-to-many” comparison) contained either in a whole database (e.g. a police custodial pictures database) or in special watchlists (e.g. a gallery of wanted people’s photos) created for each occasion on which they are used.41 Such an identification process allegedly represents an interference with the right to privacy of the many individuals whose images are

33 Buolamwini, Gebru, ‘Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification. Proceedings of Machine Learning Research 81:1–15’ (2018) <https://dam-prod.media.mit.edu/x/2018/02/06/Gender%20Shades%20Intersectional%20Accuracy%20Disparities.pdf> accessed 17 June 2020, p.12.

34 Phillips and others, ‘An other-race effect for face recognition algorithms’ (2011) 8(2) ACM Transactions on Applied Perception 1-11, 14:10.

35 Bridges (n 9) para 33.

36 London Policing Ethics Panel, ‘Interim report on Live Facial Recognition’ (2019) <http://www.policingethicspanel.london/uploads/4/4/0/7/44076193/lpep_report_-_live_facial_recognition.pdf> accessed 9 June 2020 (‘Ethics Panel’), p.7.

37 FRA Report (n 6) p. 26.
38 Fussey & Murray (n 15) p. 40.
39 FRA Report (n 6) p.7.

40 Introna, Nissenbaum, ‘Facial Recognition Technology. A survey of Policy and Implementation issues’ (2010) LUMS Working Paper <https://eprints.lancs.ac.uk/id/eprint/49012/1/Document.pdf> accessed 17 June 2020, p. 11.
41 Ibid p.12.


processed in order to find a match. This is the starting point for LFRT to become a mass surveillance practice.
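The distinction between verification (“one-to-one”) and identification (“one-to-many”) can be summarised in code. This is a minimal sketch: the feature-overlap score and the 0.8 threshold are placeholder assumptions standing in for a real biometric similarity measure.

```python
THRESHOLD = 0.8  # illustrative similarity threshold

def score(probe, template):
    """Placeholder similarity: fraction of matching features."""
    return sum(a == b for a, b in zip(probe, template)) / len(template)

def verify(probe, claimed_template):
    """One-to-one match (e.g. border control): compare the live image
    against the single template of the identity the person claims."""
    return score(probe, claimed_template) >= THRESHOLD

def identify(probe, watchlist):
    """One-to-many comparison: check the live image against EVERY entry
    in a watchlist and return all candidates above the threshold."""
    return [name for name, template in watchlist.items()
            if score(probe, template) >= THRESHOLD]
```

verify touches only the single record of the claimed identity, so the privacy concern stays with one person; identify processes every watchlist template for every face captured in the crowd, which is the step the text identifies as the gateway to mass surveillance.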

“Mass surveillance” is a term which is not legal in character42 but is definitive for human rights bodies.43 Some scholars believe that mass surveillance ‘is not exhausted by bulk measures, when all communications are subject to interception, but can be represented even by targeted measures, provided that the scope of persons, whose data are collected is not sufficiently determined or limited’.44

The “one-to-many” comparison tool does indeed usually target not the whole population but certain groups of people. However, it does not “sufficiently determine or limit” the categories of individuals to be included in its databases. For instance, it is unclear today who has a justified right to populate watchlists with images. Based on what criteria should persons be included in such watchlists? How long will one’s image be stored in a watchlist or database, and does a person have a right to be forgotten? Does an independent body oversee the compilation of a watchlist?

The watchlists used by the South Wales Police in R (Bridges) v CCSWP, for example, included (a) persons wanted on warrants, (b) individuals having escaped from lawful custody, (c) persons suspected of having committed crimes, (d) persons who may be in need of protection (e.g. missing persons), (e) individuals whose presence at a particular event causes particular concern, (f) persons simply of possible interest to the police for intelligence purposes and (g) vulnerable persons.45 However, uncertainty exists over what is meant by these categories. Indefinite phrases like “persons of interest”, “individuals of concern” and “vulnerable persons” invite abuse and open the door to LFRT deployment against an indefinite number of people.

Therefore, the “one-to-many” comparison function of LFRT can be regarded as a mass surveillance practice which raises new challenges for human rights law (e.g. the formulation of watchlist compilation requirements). Is the European Court ready to update its approach to mass surveillance in the light of such modern technology, or is there no need to revisit its precedents?

42 Venice Commission, ‘Report on the Democratic Oversight of Signals Intelligence Agencies’ (2015) CDL-AD(2015)011, para 64.

43 ECtHR Press Unit, ‘Factsheet – Mass Surveillance’ (2019) <https://www.echr.coe.int/Documents/FS_Mass_surveillance_ENG.pdf> accessed 18 June 2020.

44 Rusinova, ‘A European Perspective on Privacy and Mass Surveillance at the Crossroads’ (2019) <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3347711> accessed 18 June 2020.


2.2. Mass surveillance in the ECtHR case-law

Seventy years ago, at the time of the Convention’s founding, one could not have anticipated such advances in surveillance technology as LFRT.46 However, guided by the famous statement that ‘the Convention is a living instrument which […] must be interpreted in the light of present-day conditions’,47 the ECtHR applies an evolutive doctrine in order to address new public concerns. This approach leads to occasional re-assessments of previously developed standards. For instance, if in Klass and Others v. Germany the Court understood the term “correspondence” as postal, telephone or telecommunications services,48 in Copland v. the United Kingdom the definition was broadened to e-mail correspondence and internet usage.49 Furthermore, the “living instrument” doctrine has allowed the Court to distinguish clearly between targeted wiretapping and mail seizure for the purposes of criminal investigation, on the one hand, and bulk surveillance practices such as tracking internet browsing history, on the other.50

Thus, many authors agree that the ECtHR’s position towards mass and secret surveillance began to crystallize only in the landmark Weber and Saravia v. Germany decision,51 not in earlier cases concerning the interception of private communications.52 Although rejected as manifestly ill-founded,53 the application in Weber nevertheless gave the Court the opportunity to elaborate a minimum set of requirements to be applied in assessing the predictability of the legal basis governing secret measures.54 Discussed in detail in Chapter 4, the Weber criteria have

determined the outcomes of subsequent mass surveillance judgments, including Liberty and Others v. UK55 and Kennedy v. UK.56 More importantly, these requirements remain relevant for any potential claim regarding a privacy interference by strategic monitoring technologies, including LFRT. The Weber decision underlined the importance of subsequent notification of surveillance measures, stating that as soon as notification ‘can be carried out without jeopardising the purpose of the [monitoring], information should <…> be provided to the persons concerned’.57

46 Webber, ‘The European Convention on Human Rights and the Living Instrument Doctrine: An Investigation into the Convention’s Constitutional Nature and Evolutive Interpretation’ (2016) (PhD thesis, University of Southampton) p.30.

47 Tyrer v. the United Kingdom (App no 5856/72) (1978) ECHR 2, para 31.
48 Klass and Others v. Germany (App no 5029/71) (1978) ECHR 4, para 41.
49 Copland v. the United Kingdom (App no 62617/00) (2007) ECHR 253, para 41.
50 Golunova (n 18).

51 Weber and Saravia v Federal Republic of Germany (App no 54934/00) (2006) ECHR 1173.
52 E.g. Klass (n 48).

53 Weber (n 51) para 138.
54 Rusinova (n 44) p.5.

55 Liberty and others v. the United Kingdom (App no 58243/00) (2008) ECHR 568, para 63.
56 Kennedy v. the United Kingdom (App no 26839/05) (2010) ECHR 682, para 158.
57 Weber (n 51) para 135.


The long-anticipated Rättvisa and Big Brother Watch cases have finally dispelled doubts about the permissibility of mass surveillance per se. These proceedings were strategic for a number of NGOs, which tried to persuade the ECtHR to strengthen its approach to the assessment of secret monitoring.58 The Big Brother Watch applications, for instance, were lodged right after Edward Snowden’s revelations, which had led the applicants to believe that most of their electronic communications and/or communications data were likely to have been intercepted by the UK intelligence services.59 In sum, three questions were brought before the ECtHR: 1) the compatibility of bulk surveillance with the ECHR; 2) the evaluation of the intelligence-sharing regime; and 3) the admissibility of the acquisition of communications data from telecommunications providers.60 The last two issues, however, are less relevant for the present research, since LFRT does not depend on telecommunications providers and so far is not used as a cross-border intelligence measure. The Court’s findings on these matters will therefore be omitted.

In both of the above-mentioned cases the European Court concluded that a bulk interception regime is permissible because it falls within a State’s wide margin of appreciation in matters of national security.61 However, due to the risk of abuse of power, mass surveillance measures should be subject to a number of safeguards. Remarkable in their length, the texts of the judgments present a rigorous elaboration of the content of such safeguards and an attempt to design a “road map” for the legal regulation of the mass interception of data.62 Despite mostly duplicating the Weber criteria, the ECtHR’s findings nevertheless represent certain departures from the Court’s earlier case-law. First, the need for subsequent notification of the surveillance subject was rejected by the Court as unrealistic in cases of strategic monitoring and inconsistent with a State’s margin of appreciation.63 Second, the requirement of prior judicial authorization of bulk interception was also abandoned as being no more than a “best practice”.64 Instead, the ECtHR found that ex post ‘independent oversight may be able to compensate for an absence of judicial authorization’.65 Finally, while in Weber the Court insisted that the catchwords used for monitoring purposes should be listed in a warrant,66 in Big Brother Watch the ECtHR concluded that the selectors and search criteria neither

58 Rusinova (n 44) p.10.

59 Voorhoof, ‘European Court of Human Rights: Big Brother Watch and Others v. the United Kingdom’ (2018) <https://biblio.ugent.be/publication/8588280/file/8588281.pdf> accessed 19 June 2020.

60 Big Brother Watch (n 12) para 269.
61 Ibid para 314.

62 Rusinova (n 44) p.11.

63 Big Brother Watch (n 12) para 317; Rättvisa (n 11) para 179.
64 Big Brother Watch (n 12) para 320; Rättvisa (n 11) paras 127-147.
65 Big Brother Watch (n 12) para 318.


need to be made public nor to be included in any judicial writ.67 However, the search criteria and selectors used to filter intercepted communications should be subject to review by an independent body.68

These shifts in the Court’s position have led certain scholars to believe that the ECtHR restricted the application of the right to privacy to an even greater extent than in the Weber decision.69 Indeed, on the basis of these findings one may conclude that Governments are not obliged to inform the population about who can be included in watchlists or in what places LFRT can be deployed.

Others, however, notwithstanding such controversial conclusions, see the Big Brother Watch judgment as a clear win for privacy activists70 due to the obvious enlargement of the scope of information whose interception can constitute an interference.71 The European Court endorsed the position that ‘mapping of social networks, location tracking, internet browser tracking, the mapping of communication patterns, and insight into who a person interacted with’72 are no less intrusive than the acquisition of content.73 This is a progressive step, demonstrating the Court’s readiness to face new challenges in the digital age.

To sum up, the ECtHR has only just begun to outline its position on mass surveillance. The latest judgments nevertheless set a benchmark against which any surveillance regime shall be scrutinized.74 The abolition of the "subsequent notification" and "judicial authorization" requirements demonstrates the Court's generally lenient attitude towards bulk interception of data and seems to encourage LFRT deployment without informing the population about the circumstances of its operation. For instance, guided by the Big Brother Watch findings, Governments may decide not to make public their watchlist compilation procedures or the list of places where the technology is used. However, in view of the "living instrument" doctrine, the Court's demonstrated readiness to broaden the understanding of information whose interception infringes the right to privacy, strong civil opposition75 and the unique features of LFRT, there is hope that the ECtHR will reassess

67 Big Brother Watch (n 12) para 340.
68 Ibid.

69 Rusinova (n 44) p.14.

70 Milanovic, ‘ECtHR Judgment in Big Brother Watch v. UK’ (EJIL:Talk! 2018) <https://www.ejiltalk.org/ecthr-judgment-in-big-brother-watch-v-uk/> accessed 1 June 2020.

71 Rusinova (n 44) p.14.

72 Big Brother Watch (n 12) para 356.
73 Ibid.

74 Milanovic (n 70).

75 Sloot, Kosta, ‘Big Brother Watch and Others v UK: Lessons from the Latest Strasbourg Ruling on Bulk Surveillance’ (2019) 5 European Data Protection Law Review 2, p.261: After two separate requests the Grand Chamber panel decided on 4 February 2019 to refer Big Brother Watch and Others v UK to the Grand Chamber of the ECtHR in order to re-assess the overall compatibility of the UK bulk surveillance regime with the ECHR.


its approach and develop new standards. The readiness of the tribunal to deal with potential LFRT cases will be evaluated in the following chapters.


Chapter 3. Reasonable expectation of privacy in the digital age: does the use of LFRT form an interference with Article 8 (1) ECHR?

The supporters of LFRT claim that, when implemented in the open, facial recognition poses no threat to the privacy of individuals, as there can be no privacy in public places.76 In other words, by leaving their homes people expose their faces to the public and are therefore prepared to be observed, whether by an occasional passer-by or by a CCTV camera. The view seems to have been shared by most national courts. For instance, in United States v. Dionisio the US Supreme Court stated that 'no person can […] reasonably expect that his face will be a mystery to the world'.77 However, there is a difference between being subject to a random glance from a stranger and being identified in public by sophisticated algorithms capable of processing one's facial features to establish an individual's name, address, occupation, criminal history or other personal information.78 One may insist that people still exhibit subjective and actual expectations of privacy in their identities even while they are out in public.79 For instance, when taking sensitive trips to a doctor, a pharmacy or even a grocery store, one assumes only a minimal risk of being recognized.80 In view of the above, the questions arise whether people lose their expectations of privacy by merely exposing their faces in public and whether such expectations have become the key factor in establishing an interference with Article 8 (1) ECHR.81

3.1. The reasonable expectation of privacy doctrine: the common law approach

The 'reasonable expectation of privacy' test is the touchstone of privacy-related law in common law jurisdictions.82 First developed in American case-law, the doctrine has evolved significantly and is now not only recognized by UK and Canadian courts,83 but has also been mentioned by the ECtHR in several cases. Given the common law origins of the 'reasonable expectation of privacy' formulation, it is worth briefly outlining its framework under US and UK case-law before speculating on the applicability of these standards in the ECtHR's practice.

The limited interpretation of the right to privacy, which confines its scope to private spaces such as homes and aims to protect one's property from physical intrusion, is currently rejected by most

76 Yue Liu, 'Bio-Privacy: Privacy Regulations and the Challenge of Biometrics' (Routledge, 2013) ch 5.3.1.2.3.
77 United States v. Dionisio, 410 US 1, 15 (1973).

78 Wynn, ‘Privacy in the face of surveillance: Fourth Amendment considerations for facial recognition technology’ (2015) <https://calhoun.nps.edu/bitstream/handle/10945/45279/15Mar_Wynn_Eric.pdf?sequence=1&isAllowed=y> accessed 1 May 2020, p. 60.

79 Hirose, ‘Privacy in Public Spaces: The Reasonable Expectation of Privacy against the Dragnet Use of Facial Recognition Technology’ (2017) 49 Connecticut Law Review 1591, 1601.

80 Ibid.

81 ECHR (n 13) art 8(1).

82 Hughes, 'A Common Law Constitutional Right to Privacy – Waiting for Godot?' in Elliott, Hughes (eds), Common Law Constitutional Rights (Bloomsbury Publishing, 2020).


national laws. For instance, the US Supreme Court's landmark Katz v United States judgment concluded that the 'Fourth Amendment protects people, not places',84 and that, therefore, 'what [a person] seeks to preserve as private, even in an area accessible to the public, may be constitutionally protected'.85 The case laid down the initial two-prong test inquiring into an individual's "reasonable expectation of privacy".86 First, has the person exhibited an actual (subjective) expectation of privacy?87 And, second, is society prepared to recognize that expectation as "reasonable"?88 The twofold requirement was subsequently supplemented by the "third-party",89 "plain-view"90 and "general public use"91 doctrines. These standards imply that if something is exposed to many third persons or placed in plain view with no attempt to shield or cover it, or when governmental authorities utilize a device in common use to discover the needed information, one cannot reasonably expect privacy.92 For instance, the use of binoculars by the police is not considered a threat to the constitutional right. However, the use of heat sensors capable of revealing a person's presence inside a closed private premise by detecting his or her heat signature is treated by US courts as an interference with privacy guarantees.93 Since an individual cannot anticipate the use of such a sensitive technology, which is not available to the general public, the practice is treated as a circumvention of the warrant requirement and, consequently, a violation of the Fourth Amendment.94

The approach, however, seems ill suited to modern realities and puts the right to privacy on an unstable foundation. As Justice Sotomayor rightly noted in her concurring opinion in United States v. Jones, in 'the digital age […] people reveal a great deal about themselves to third parties in the course of carrying out mundane tasks'.95 Whereas anonymity once existed in crowds even though individuals were in plain view, this is no longer true in this day and age. The ubiquity of image-capture devices in private hands signifies96 that people have become much more tolerant of invasions of their personal lives. Moreover, the spread of digital facial recognition on social networks like Facebook97 leaves no question regarding the availability of the technology to

84 Katz v United States, 389 US 347, 351 (1967).

85 Ibid 351–52.
86 Ibid 361.
87 Ibid.
88 Ibid.

89 United States v. Miller, 425 US 435, 442-443 (1976).

90 California v. Ciraolo, 476 US 207, 213-214 (1986).

91 Kyllo v. United States, 533 US 27, 33-34, 40 (2001).

92 Reidenberg, 'Privacy in Public' (2014) 69 U Miami L Rev 141, 144.

93 Kyllo (n 91) 33-34, 40.

94 U.S. Const. amend. IV; Reidenberg (n 92) 154.

95 United States v. Jones, 132 S. Ct. 945, 957 (2012) (Sotomayor, J., concurring).

96 Reidenberg (n 92) 154.
97 Ibid 149.


general public use and supports the idea that social norms about privacy have changed. In the face of this, how can any notion of a reasonable expectation of privacy survive?98 The problems associated with both prongs of the test have led some scholars to conclude that the doctrine of reasonable expectation of privacy is a poor standard for determining the scope of privacy protection in the digital context99 and should be abandoned.100

This view is not shared, however, by UK courts, in whose practice the individual's "reasonable expectation of privacy" is the touchstone of Article 8(1) ECHR.101 Where no such expectation is present, the disputed measure automatically falls outside the scope of the right to privacy, making Article 8 ECHR inapplicable.102 However, in contrast to US case-law, UK courts avoid developing universal standards such as the "third-party", "plain-view" or other doctrines and prefer to consider diverse factors on a case-by-case basis. For instance, the Court of Appeal in

Murray v Big Pictures Ltd. found that:

‘the question whether there is a reasonable expectation of privacy is a broad one, which takes account of all the circumstances of the case. They include the attributes of the claimant, the nature of the activity in which the claimant was engaged, the place at which it was happening, the nature and purpose of the intrusion, the absence of consent and whether it was known or could be inferred, the effect on the claimant and the circumstances in which and the purposes for which the information came into the hands [of a third party]’.103

Such flexible application of the test has permitted UK courts to conclude that, where LFRT is in question, a person is not deprived of expectations of privacy merely by being in a public place.104 The extraction and processing of biometric data made possible by such state-of-the-art technology take these cases well beyond the "expected and unsurprising" consequences of walking in the streets.105 The fact that facial features are exposed to third persons, therefore, cannot serve as an excuse for denying their "intrinsically private" character.106 Ultimately, the ridges on a person's fingertips are also observable to the naked eye,107 but it is indisputable that their expert analysis is

98 Ibid 146.

99 Crowther, '(Un)Reasonable Expectation of Digital Privacy' (2012) 1 The Brigham Young University Law Review 343, 352.

100 Etzioni, Rice, 'Privacy in a Cyber Age: Policy and Practice' (Springer, 2015) ch.1.2.

101 R (Wood) v Commissioner of Police of the Metropolis [2010] 1 WLR 123 (CA) 22.

102 Ibid.

103 Murray v Big Pictures Ltd. [2008] EWCA Civ 446; [2008] 3 WLR 1360 (CA) 36.
104 Bridges (n 9) paras 57, 59, 62.

105 Ibid 55.
106 Ibid 57.
107 Ibid.


capable of affecting an individual's private life, just as the retention of such information is not regarded as neutral or insignificant.108 Is there a good chance that the ECtHR will join the common law approach discussed above and treat a person's expectation of privacy as a precondition to Article 8 ECHR protection?

3.2. Scope of Article 8 (1) ECHR and the European Court

Interestingly enough, the position taken by the European Court is quite different: although it too refers to the "reasonable expectation of privacy" formulation, the doctrine seems so far not to have been applied as a threshold for engaging the Convention right.109 On the contrary, the ECtHR has emphasized that 'a person's reasonable expectations as to privacy may be a significant, although not necessarily conclusive, factor'.110 In Bărbulescu v. Romania the Court even omitted the question of whether the applicant had a reasonable expectation of privacy in the workplace and instead focused on the degree to which the particular measure set back the privacy-related interests of the applicant.111 According to Hughes, the phrase '"reasonable expectation of privacy" appears to be judicial rhetoric for the ECtHR, rather than a tool for determining the application of Article 8 ECHR'.112 The author therefore believes that European judges should give more attention to the subjective element in order to accommodate a modern behavioral understanding of privacy.113

The limited role of the "reasonable expectation of privacy" test in the ECtHR's practice, however, does not seem to preclude broad applicability of the Convention right. In fact, the European Court rarely finds that Article 8(1) ECHR does not apply to a given set of facts, demonstrating a very contemporary perception of privacy and avoiding the development of doctrines or standards that would clearly demarcate the distinction between private and public.114 Instead, the ECtHR reiterates that 'there is a zone of interaction of a person with others, even in a public context, which may fall within the scope of private life.'115 Thus, many scholars share the view that 'the scope of the right to privacy under the ECHR is wider than the current state of the U.S. Supreme Court doctrine on "reasonable expectations of privacy"'.116 Indeed, the fact that LFRT, for instance, is integrated in

smartphones by device developers can preclude US Courts from finding a violation of the

108 S. and Marper v. United Kingdom (App nos. 30562/04 and 30566/04) (2008) ECHR 1581, para 84.
109 Hughes (n 82) ch. V.

110 P.G. and J.H. v United Kingdom (App no 44787/98) (2001) ECHR 550, para 57.
111 Bărbulescu v. Romania (App no 61496/08) (2017) ECHR 754, paras 80, 140.

112 Hughes, 'A Behavioural Understanding of Privacy and its Implications for Privacy Law' (2012) 75(5) Modern Law Review 806-836, 825.

113 Ibid 825-826.
114 Hughes (n 82) ch. V.

115 Couderc and Hachette Filipacchi Associés v France (App no 40454/07) (2015) ECHR 992, para 83.

116 Lipton, ‘Mapping Online Privacy’ (2010) 104 Nw. U. L. REV. 477, 484; Milanovic, ‘Human Rights Treaties and Foreign Surveillance: Privacy in the Digital Age’ (2015) 56 Harv Int'l LJ 81, 131.


constitutional right due to the existence of the "general public use" doctrine, but it would not hamper the invocation of Article 8 ECHR before the European Court.

It can be concluded that in the ECtHR's practice the existence of an individual's reasonable expectation of privacy may be an additional factor supporting the applicability of Article 8(1) ECHR, but in no way a definitive one as to whether a person is entitled to the provision's protection. Moreover, in light of the Court's broad understanding of the right, possible State arguments that modern surveillance technologies have altered societal values and that the loss of functional anonymity in public places is therefore inevitable117 have a good chance of being found ungrounded and could not preclude the application of Article 8(1) ECHR. The development of tests or doctrines similar to the US ones, by contrast, might characterize LFRT operation as not posing a threat to privacy, so their introduction into the ECtHR case-law seems unnecessary.

3.3. Does LFR technology infringe the right to privacy?

3.3.1. From CCTV footage to processing of biometric information

It is well established that the reach of Article 8 (1) ECHR is broad. The ECtHR has consistently held that the right to respect for "private life" encompasses not only "the physical and psychological integrity of a person",118 but also the retention and dissemination of personally identified and sensitive data.119 The judgment in Reklos v. Greece has left no question that the right to the protection of one's image is one of the essential components of personal development and presupposes the right to control the use of that image.120

However, to conclude that CCTV surveillance in public spaces is a breach of privacy per se would be to broaden Article 8 (1) ECHR to an extent that the European Court is not ready to accept.121 In the Herbecq decision, for instance, it was stated that mere monitoring does not interfere with the right to privacy.122 Since nothing is recorded, the visual data obtained cannot become available to the general public.123 Moreover, all that can be observed is public

117 Levinson-Waldman, ‘Hiding in Plain Sight: A Fourth Amendment Framework for Analyzing Government Surveillance in Public’ (2017) 66-3 Emory Law Journal 527, 549.

118 Pretty v United Kingdom (App no 2346/02) (2002) ECHR 427, para 61.
119 Z v Finland (App no 9/1996/627/811) (1997) ECHR 10, paras 95-97.

120 Reklos and Davourlis v Greece (App no 1234/05) (2009) ECHR 200, para 40.

121 Taylor, 'State Surveillance and the Right to Privacy' in Dean Wilson, Clive Norris (eds), Surveillance, Crime and Social Control (Routledge, 2017) 66-85.

122 Pierre Herbecq and the Association ‘Ligue des Droits de l Homme’ v. Belgium (App nos 32200/96 and 32201/96) (1998) ECHR Decisions and Reports, 1999, 92–98, 97.


behavior; thus, it does not matter who is watching: a person on the spot or one looking at monitors.124 The approach to video surveillance was further refined in the Peck v. United Kingdom case, where the Court emphasized that the storage and, more importantly, the subsequent disclosure of the recorded material to the media implied a serious interference with the applicant's private life.125 This does not mean, however, that Article 8(1) ECHR becomes applicable only in such radical cases involving the broadcasting of video images on national television. In Perry v. United Kingdom, for instance, the applicant's footage in a custody suite was used only for the benefit of criminal proceedings and was shown to a limited number of persons: those present during the trial in the public courtroom and witnesses during the identification procedure.126 However, the ECtHR explained that publication of security recordings in a manner or to a degree beyond that normally foreseeable can nevertheless trigger an interference with the right to privacy.127 Thus, not the existence of the cameras itself but the subsequent disclosure of the footage was the decisive factor in the Court's argumentation.

As LFRT is based on visual public monitoring, it may or may not involve the recording and storage of material, but it rarely pursues the goal of disclosing any obtained information to the general public. In this sense the technology is no different from CCTV or body-worn cameras. If the Court were to follow the same line of reasoning as in Herbecq and Peck, this would lead to the conclusion that LFRT does not amount to an interference with the right to privacy. However, the extraction of facial signatures from live footage makes the circumstances of LFR operation very different from any previously reviewed by the ECtHR and calls for a different approach.

It is worth recalling at this point that, despite the ECtHR's lenient attitude towards video surveillance, the Court has attached fundamental importance to the protection of personal data as part of the right to respect for private and family life. The landmark S. and Marper v. United Kingdom judgment, for instance, confirmed that retention of biometric information in the form of fingerprints, cellular samples and DNA profiles can interfere with the right to privacy, as such data 'contains unique information about the individual allowing his or her identification with precision in a wide range of circumstances'.128 Thus, any record of such information, notwithstanding its objective and irrefutable character, gives rise to important private-life concerns and forms an interference with Article 8(1) ECHR.129

124 Ibid.

125 Peck v. the United Kingdom (App no 44647/98) (2003) ECHR 44, para 87.
126 Perry v. the United Kingdom (App no. 63737/00) (2003) ECHR 375, paras 40-41.
127 Ibid.

128 Marper (n 108) para 84.
129 Ibid para 85.


One cannot deny that LFRT involves the biometric processing of facial images for the primary purpose of determining a person's identity ("one-to-many identification").130 LFRT-derived biometric data is therefore an important source of personal information, just as fingerprints and DNA samples are. The above-mentioned characteristics of the technology make the S. and Marper standard more relevant for establishing an interference with Article 8(1) ECHR in potential cases concerning LFRT than the Court's approach in CCTV proceedings. Such a conclusion is supported by the recent R (Bridges) v. CCSWP case131 in the UK, by the FRA's findings132 and by scholarly opinion.133

3.3.2. Storing data as a precondition to an interference with Article 8 ECHR

Several counterarguments may be raised by Governments regarding the existence of an interference and the applicability of the ECtHR's case-law on biometric data. One of them is to underline that, unlike the circumstances of the S. and Marper case, personal information extracted by LFRT is not retained and is deleted immediately from the programme.

It is true that the ECtHR case-law on data protection is primarily built around the retention, not the usage, of personal information. For instance, the Court has consistently held that the 'mere storing of data relating to private life of an individual amounts to an interference within the meaning of Article 8' and that '[t]he subsequent use of the stored information has no bearing on that finding'.134 In PG v. United Kingdom the ECtHR also emphasized that while monitoring by technological means of the same public scene is not regarded as a threat to one's right to privacy, 'private-life considerations may arise […] once any systematic or permanent record comes into existence of such material from the public domain'.135 According to the European Court, however, the duration of the biometric data retention period is not necessarily a conclusive factor.136 Thus, if personal information is gathered and retained even for a moment, this should nevertheless trigger the application of Article 8 ECHR. Similar reasoning was applied in the R (Bridges) v. CCSWP case, where the majority of the Administrative Court had no doubt that LFRT necessarily implies that an individual's biometric data is captured, stored and processed, even momentarily, before

130 FRA Report (n 6) p. 24.
131 Bridges (n 9) para 62.
132 FRA Report (n 6) p. 24.
133 Fussey & Murray (n 15) p. 36.
134 Marper (n 108) para 67.
135 P.G. v. UK (n 110) para 57.


discarding.137 Thus the argument that LFRT merely locates, and does not retain, the facial biometrics of persons whose faces are scanned did not hold, at least before the UK courts.138

Furthermore, one should take into account that it is not only the circumstances of the initial biometric processing of images that are troubling: the procedures for compiling watchlists, the sources of the images used to populate them, and the duration of retention of watchlist photographs can also raise privacy concerns. According to Murray, each of these processes alone can constitute a separate interference with the right to private life.139 The European Court has also consistently insisted that:

‘considering whether there has been an interference, it will have due regard to the specific context in which the information at issue has been recorded and retained, the nature of the records, the way in which these records are used and processed and the results that may be obtained.’140

For instance, in a case regarding the indefinite storage of the applicant's custody photograph in a local database, the Court underlined that, given that the police might also apply facial recognition and facial mapping techniques to the image, it had no doubt that the taking and retention of the applicant's photograph amounted to an interference with his right to private life within the meaning of Article 8 (1) ECHR.141

In light of this, it seems unlikely that possible complaints concerning LFRT would be rejected by the ECtHR at such an early stage on the ground that mere processing of data without any further retention cannot interfere with Article 8(1) ECHR. Satisfied by the mere fact that biometric data has been gathered, the ECtHR would almost certainly proceed to the merits.

3.3.3. Identification with precision as a precondition to an interference with Article 8 ECHR

A second possible Government objection to the existence of an interference with the applicant's right to privacy may be that LFRT does not allow a person's "identification with precision", but only establishes a resemblance between the initial and the analyzed image with a certain degree of probability, sometimes lower than 70%.

137 Bridges (n 9) paras 37, 59.
138 Ibid.

139 Fussey & Murray (n 15) 36.
140 Marper (n 108) para 67.
141 Gaughran (n 136) para 70.


It was precisely due to the lack of accuracy of LFRT that the claims of Ms. Popova were rejected by the Savelovskiy district court of the city of Moscow.142 The applicant claimed that her participation in an unsanctioned mass event (a single-person picket), which led to an administrative fine and her arrest, was proved by no evidence other than CCTV footage bearing signs of LFRT application.143 Ms. Popova assumed that her image had been added to a watchlist on the basis of her active civil position and her NGO activity as co-founder of an independent project aimed at stopping domestic violence and protecting women's rights in the Russian Federation.144 Because the provided CCTV footage contained a purposeful adjustment of the camera's focus (a 32-fold image enlargement) fixed on the face of the administrative plaintiff, Ms. Popova had no doubt that her identity had been established through LFRT algorithms.145 Moreover, the claimant insisted that the operation of the "Urban CCTV System" equipped with LFRT violates her constitutional right to privacy in general, as it gathers her biometric data as well as information about her movements without her consent.146 Surprisingly enough, the Savelovskiy Court concluded that:

‘Analytical algorithms [used by LFRT] can reveal the coincidence of the codes and mathematical vectors of the initial photograph and the analyzed video image with only a certain degree of probability (65%). Accordingly, the above-mentioned processes cannot serve for the purpose of a person identification. [...] Thus, in the absence of an identification procedure, video images of citizens cannot be considered biometric personal data.’147

As a result, the deployment of LFRT was not treated by the Moscow Court as an interference with the right to privacy.148 Would the ECtHR follow the same line of reasoning and find that the S. and Marper threshold, which requires biometric data to allow an individual's identification 'with precision in a wide range of circumstances',149 is not reached by LFRT? Would the GDPR definition of biometric data, which also points to the ability of information to 'allow or confirm unique identification of natural person',150 be of use to States? After all, in Opinion 02/2012 the Article

142 Popova (n 8) p.9.
143 Ibid p.1.
144 Ibid.
145 Ibid.
146 Ibid p.2.
147 Ibid p. 7.
148 Ibid p. 9.
149 Marper (n 108) para 84.

150 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC OJ 2016 L 119/1 (‘GDPR’), article 4(14).


29 Working Party has stated that ‘images of individuals too far away or with blurred faces would most likely not be regarded as personal data’.151

Notwithstanding these facts, the European Court seems disinclined to take such a formalistic approach. While holding that '[a] person's image constitutes one of the chief attributes of his or her personality, as it reveals the person's unique characteristics and distinguishes the person from his or her peers',152 it has never imposed additional requirements as to the quality of such an image. On the contrary, in Peck v. United Kingdom the ECtHR implicitly confirmed that the applicant's footage, even partially masked, did not exclude his identification by family, friends, neighbors and colleagues, and therefore still constituted an interference with the right to privacy.153 Similarly, one can conclude that even when the similarity score taken to indicate a potential match is lower than 70%, the extracted facial images would still be recognizable to human operators, who can compare them with the initial photos to confirm someone's identity. In other words, matches generated by LFRT as positive are usually subject to human adjudication, which finalizes the identification process. To deny LFRT-derived data biometric character on the basis of the technology's low threshold value is to argue that a clear photograph of an individual likewise falls outside the category of biometric data when processed by a sleep-deprived police officer, whose facial recognition capability can sometimes drop to 40%.154

Thus, it can be concluded that, under the existing ECtHR case-law, LFRT would likely be regarded by the Court as a threat to the right to privacy. The fact that the software allows the processing of biometric data makes the S. and Marper standard more suitable for the purposes of establishing an interference with Article 8 ECHR than the European Court's approaches in earlier CCTV cases. However, it is undeniable that LFRT poses a new challenge to the Convention right. Therefore, possible objections that LFRT, unlike DNA analysis, may not retain any data and does not allow one's identity to be established with 100% precision would most likely be dismissed as too formalistic and out of step with modern realities.

151 Article 29 Data Protection Working Party, ‘Opinion 02/2012 on facial recognition in online and mobile services’ (2012) 00727/12/EN WP 192, para 4.1.

152 Von Hannover v. Germany (No. 2) (App nos. 40660/08 and 60641/08) (2012) ECHR 228, para 96.
153 Peck (n 125) paras 55, 87.

154 Beattie and others, ‘Perceptual impairment in face identification with poor sleep’ (The Royal Society Publishing, 2016) <https://royalsocietypublishing.org/doi/10.1098/rsos.160321> accessed 20 May 2020.


Chapter 4. Assessment of LFRT deployment by the ECtHR: possible outcomes

The fact that LFRT constitutes an interference with the right to privacy does not necessarily mean that it violates Article 8 ECHR.155 Threats to privacy can still be justified if they pass the three-pronged test developed by the ECtHR. The legitimacy of an interference requires that it be performed in accordance with the law, pursue a legitimate aim and be necessary in a democratic society.156 The possible application of this test by the ECtHR in potential LFRT cases is of central interest not only for the purposes of the present research but also for the future of LFRT deployment by European States, as it may either prohibit the usage of the software per se or impose additional requirements and safeguards on its operation.

4.1. Is LFRT operated in a legal vacuum? Establishing criteria of “the accordance with law” test in the ECtHR’s mass surveillance cases

"The accordance with law" test incorporates a number of different elements,157 relating both to the existence of a legal basis and to its quality (the public, precise and foreseeable qualitative requirements).158 In 2019 the FRA concluded not only that the deployment of LFRT is not regulated by a clear and sufficiently detailed legal framework at EU level, but also that appropriate national legislation is often lacking, which is why several EU countries have not implemented such software.159 Legal uncertainty around the operation of LFRT urged the EU Commission to consider the imposition of a three-to-five-year moratorium on the use of the technology in order to assess its impacts and develop risk management measures.160 However, without sufficient explanation,

the proposal was abandoned in the final draft of the Commission's White Paper on Artificial Intelligence. The document merely mentioned briefly that 'the gathering and use of biometric data for remote identification purposes, for instance through deployment of facial recognition in public places, carries specific risks for fundamental rights' and proposed to launch a broad European debate on the particular circumstances, if any, which might justify the use of AI for the purposes of LFRT.161 To date, it is therefore the responsibility of each Member State to duly

155 Murray, ‘Blog: Live facial recognition: the impact on human rights and participatory democracy’ (Center for public engagement, 2019) <https://www.essex.ac.uk/centres-and-institutes/public-engagement/facial-recognition> accessed 10 July 2020.

156 ECHR (n 13) art 8(2).
157 Fussey & Murray (n 15) 8.

158 Gorlov and Others v. Russia (App nos 27057/06, 56443/09 and 25147/14) (2019) ECHR 518, para 97.
159 FRA Report (n 6) pp. 13, 33.

160 Commission, ‘Structure for the White Paper on Artificial Intelligence’ (2019) <https://g8fip1kplyr33r3krz5b97d1-wpengine.netdna-ssl.com/wp-content/uploads/2020/01/AI-white-paper-CLEAN.pdf> accessed 9 June 2020, p. 15.
161 Commission, ‘White Paper on Artificial Intelligence - A European Approach to Excellence and Trust’ COM(2020) 65 final, pp. 21-22.


authorize remote facial identification and to establish a transparent legal framework for its deployment.

However, there is no explicit national legislation governing the use of LFRT in either the United Kingdom or the Russian Federation. Notwithstanding this fact, both domestic courts found that the application of the technology is sufficiently governed by data protection law principles that are well known, comprehensive and provide sufficient regulatory control to avoid arbitrary interferences with Article 8 ECHR.162 The use of LFRT was also not considered contrary to secondary legislative instruments, since police officers are under an obligation to use in their work the achievements of science and technology as well as modern information and telecommunication infrastructure.163 According to the UK court, ‘the fact that a technology is new does not mean that it is outside the scope of existing regulation, or that it is always necessary to create a bespoke legal framework for it’.164 Would the ECtHR share this view, or would it be forced to conclude that LFRT has been operating in a legal vacuum?

According to the ECtHR case-law, ‘the law must be adequately accessible and foreseeable, that is, formulated with sufficient precision to enable the individual […] to regulate his conduct.’165 In the Big Brother Watch case the Court reiterated that ‘the domestic law must be sufficiently clear to give citizens an adequate indication as to the circumstances in which and the conditions on which public authorities are empowered to resort to any [surveillance] measures.’166 It is also evident that compliance with Article 8(2) ECHR would be scrupulously examined by judges in cases ‘where the powers vested in the state are obscure, creating a risk of arbitrariness especially where the technology available is continually becoming more sophisticated.’167 Finally, the S. and Marper judgment highlighted that the need for precise safeguards is ‘all the greater where the protection of personal data undergoing automatic processing is concerned, not least when such data are used for police purposes’.168 Are national data protection laws, together with the allegedly applicable CCTV regulatory framework, able to meet these high standards?

First, no current national legislation provides guidance as to the compilation of watchlists. It is “not sufficiently clear for citizens” whose images would be added to the LFRT system and what the source of those images would be. Obviously, most photographs used to populate watchlists would be

162 Bridges (n 9) paras 87, 96.
163 Popova (n 8) p. 6.

164 Bridges (n 9) para 84.
165 Marper (n 108) para 95.

166 Big Brother Watch (n 12) para 306.

167 Catt v. the United Kingdom (App no 43514/15) (2019) ECHR 76, para 114.
168 Marper (n 108) para 103.
