Conundrum of expectations: are courts prepared for challenges against facial recognition technology?

In the absence of frameworks for privacy and data protection, the existence of legislation dedicated to the regulation of State use of facial recognition technology (FRT) appears to be a distant dream. MUSTAFA RAJKOTWALA & TEJAS NAIK juxtapose the unregulated use of FRT in India with the judgments that treat privacy as a fundamental right.

——————

IN the past few years, India has seen a rise in the use of facial recognition technologies (FRTs) by ‘State’ actors. At this point, they have been deployed as part of police investigations (in New Delhi, Chennai, Punjab, and Hyderabad), for the identification of citizens at airports and railway stations, and for the identification of school students. Further, the government plans to launch the National Automated Facial Recognition System (NAFRS) under the ambit of the National Crime Records Bureau (NCRB), which will be used by police pan-India.

At the outset, it is pertinent to understand the functioning of this technology. Based on machine learning (ML) and artificial intelligence (AI) tools, FRT systems host a number of algorithms that identify specific and distinctive details about a person’s face and facial features. These details are then stored in a database for future reference. Since FRTs are algorithmic, the system can segregate, profile and compare captured facial data against other facial data already stored in the database.
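To make the matching step concrete, the sketch below illustrates, in Python and purely for illustration, how such a system might compare a freshly captured face ‘embedding’ (the numerical vector an ML model extracts from facial features) against a stored database. The names, embedding values and threshold are all hypothetical; real FRT deployments use proprietary models and vastly larger databases.

```python
import numpy as np

# Hypothetical embeddings: in a real FRT system, an ML model converts each
# face image into a fixed-length numerical vector ("embedding").
DATABASE = {
    "person_a": np.array([0.12, 0.88, 0.35, 0.41]),
    "person_b": np.array([0.90, 0.05, 0.61, 0.22]),
}

MATCH_THRESHOLD = 0.95  # hypothetical cut-off; vendors tune this value


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity of two embeddings: 1.0 means identical direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def identify(probe: np.ndarray) -> str | None:
    """Return the best database match above the threshold, else None."""
    best_name, best_score = None, 0.0
    for name, stored in DATABASE.items():
        score = cosine_similarity(probe, stored)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= MATCH_THRESHOLD else None


# A captured face whose embedding closely resembles person_a's stored record.
print(identify(np.array([0.13, 0.86, 0.36, 0.40])))  # -> "person_a"
```

Even this toy example shows why the choice of threshold matters: set it too low and the system produces false matches against innocent people; set it too high and it fails to identify anyone, a trade-off directly relevant to the accuracy figures discussed later in this piece.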

Law enforcement and governmental agencies have been utilising FRTs on the premise that they (a) assist in preventing crimes and (b) aid in investigations. The promise of enhanced “efficiency, transparency and accountability” in the pursuit of social justice is the primary justification.

However, this technological advancement raises ethical and legal issues, primarily relating to the State’s responsibility to safeguard an individual’s privacy and to uphold constitutionalism. While who has access to this database is a question that a data protection framework can deal with, liability in case of a breach of such data is yet to be conceptualised.

It is interesting to hypothesise how the Courts in India would respond to this technology if and when claims arise against it.

Safeguards that must exist as per precedent

The first pertinent point to engage with is that the collection of sensitive forms of personal data, such as facial or biometric data, must be accorded the ‘highest degree’ of protection. Any restriction on privacy (which is a fundamental right) must satisfy the test of proportionality affirmed by the Supreme Court in K.S. Puttaswamy vs. Union of India (2017) (‘Puttaswamy’) and K.S. Puttaswamy vs. Union of India (2018) (‘Aadhaar’).

Further, it is to be noted that India does not have adequate legislation for individual data and privacy protection. In such a case, if a challenge is to be brought against the constitutionality of the technology, it would have to be made on the parameters laid down in the aforementioned cases. Restrictions placed on the deployment of a technology that utilises the personal data of individuals (which extends to public spaces) must (a) be imposed by law; (b) be a suitable means of achieving a legitimate aim; (c) be necessary and balanced, i.e., the least restrictive of the alternatives available to achieve the said goal and, on balance, not disproportionately impact the rights of citizens; and (d) provide sufficient procedural guarantees to check abuse of State interference.

These parameters are of relevance when analysing how courts will potentially respond when the use of FRTs is challenged before them.

Also read: How India’s Data Protection Regime Must Learn from the WhatsApp Privacy Policy Fiasco

With regard to parameter (a), a pertinent issue is that law enforcement agencies, apart from building their own technology systems, are entering into contracts with software companies and third-party organisations to set up infrastructure and software.

This raises several constitutional questions of legitimacy, as this exercise involves the personal data of individuals.

However, such a third-party contractual arrangement cannot be regarded as a restriction imposed by law. Further, as the Supreme Court held in State of Madhya Pradesh & Anr. vs. Thakur Bharat Singh (1967), executive action that operates to the prejudice of a citizen is impermissible unless it is backed by legislation.

Legitimate aim and disproportionate measures

With regard to parameter (b), to rationalise the necessity of this technology, law enforcement agencies argue that the legitimate aim is the prevention and investigation of crimes. But the question which must be answered before the court is whether FRT is the “suitable means” to achieve this legitimate aim.

For instance, the Delhi Police (which functions under the Union Home Ministry), one of the foremost law enforcement agencies to actively deploy FRTs, has justified its use for security arrangements during Independence Day and Republic Day. On various occasions, Delhi Chief Minister Arvind Kejriwal has boasted about the Union Territory’s robust surveillance camera network, with approximately 1,826 cameras per square mile.

However, in 2018, the Delhi Police reported that its trial FRT system had an accuracy rate of merely 2% while identifying defaulters. The latest official figures report that only 42 arrests had been made in New Delhi with its aid up to 2021. This is one of several examples of inconsistencies in the effectiveness of the technology, and of the minimal benefit it contributes towards the investigation and prevention of crimes.

The crippling issues of lacklustre investigations and inordinate delays remain, for which the police are answerable. This huge density of cameras in and around Delhi has neither prevented crimes nor made the city safer for women and children.

Furthermore, incompetent investigations by the police fail to secure convictions. Given this record, it would be safe to argue that FRT has so far not aided law enforcement agencies in achieving any of the goals they claim to pursue through its application.

Also read: A Sound Data Protection Authority is the Need of the Hour

With regard to parameters (c) and (d), any restriction that has the effect of taking away an individual’s right to privacy must be both necessary and balanced. While answering this question, the court has consistently applied the concept of proportionality, especially in cases like Aadhaar.

Applying proportionality to Section 57 of the Aadhaar Act, which allowed private companies access to the data collected by the Unique Identification Authority of India (‘UIDAI’), the Supreme Court excluded private parties from using Aadhaar as a form of identification and held that any legislation requiring this would also be unconstitutional. Proportionality, here, turned on the ‘stake’ of private entities in the data collected through Aadhaar and its utility; even while maintaining the status of the Aadhaar Bill as a money bill, the court flagged the commercial exploitation of this data as a threat.

While scrutinising these ‘contracts’ between the various law enforcement agencies and private firms, the extent of access to data that is contractually provided will play a significant role in determining proportionality.

For the most part, the courts are likely to rely on the affidavits of the State, which will likely assert that no private access to the data has been granted.

Subjective application by courts

In the United Kingdom, in the case of Ed Bridges v. South Wales Police (2020), the Court of Appeal of England and Wales stressed that having a legal framework is a necessary but not sufficient condition for authorising the use of FRT. The court went on to say that a law authorising the use of FRT must be clear and must lay down sufficient safeguards to check the exercise of discretion by law enforcement agencies.

Although the European Union and the United States are heavily debating the use of this technology (with San Francisco being one of the first cities in the U.S. to ban it), this discussion is yet to commence in India.

While Puttaswamy in India maintains the status of privacy as a fundamental right, other judgments on the matter have failed to create a foundation for data protection laws. The infirmity of our legal system is not only that there are no laws creating a legal framework for FRT use, but also that we do not have an overarching law for data protection.

Also read: UP Uses Facial Recognition Technology to Mete Out Discriminatory Treatment

Courts have held that privacy is a fundamental right, but in almost every claim that State (executive) action has violated this right, the courts have for the most part stood by the justifications offered by the Executive. In Govind v. State of Madhya Pradesh (1975), the Supreme Court upheld the ‘Regulation’ that provided for the surveillance of suspected persons. This deference continued in subsequent judgments like People’s Union for Civil Liberties v. Union of India (2003) (‘PUCL’), wherein the court noted the importance of privacy and acknowledged that privacy was indeed a fundamental right of the individual.

However, when it came to the challenges and claims made against State action, it sided with the Executive: in PUCL, the court merely demanded stricter scrutiny of the procedure followed while tapping the phones of suspects. A similar observation was made by the Bombay High Court in Vinit Kumar v. CBI and Ors. (2019).

This trend of non-intervention can be seen in numerous other examples: where a procedural safeguard is already in place, the court does not engage with the instant question of breach and merely directs the Executive to adhere to the guidelines more strictly. Therefore, if law enforcement agencies are able to show that legal parameters are being adhered to in the course of application, courts are unlikely to strike down the use of FRT as a restriction not imposed by law.

While on one hand this trend makes the outcome of such claims ‘pre-defined’, with the court more likely to stand by executive orders, on the other hand, it also elucidates the court’s expectations of what qualifies as a safeguard. It remains to be seen how the courts will respond to the lack of a framework on data protection and FRT vis-à-vis the submissions that will be made before them by the government.

(Mustafa Rajkotwala and Tejas Naik are both fourth-year undergraduate law students at NALSAR University of Law, Hyderabad. The views expressed are personal.)