Privacy marked absent? IFF writes to government departments against their use of Aadhaar biometric and facial recognition enabled attendance systems


This post was updated on October 01, 2024.

tl;dr

In recent years, India’s union and state government agencies have increasingly deployed facial recognition technologies and biometric scans to record attendance. These systems are being rolled out at public offices, hospitals, and schools without adequate checks or safeguards, endangering citizens’ sensitive data. We wrote to the Secretaries of the Delhi Education Department, the Delhi Transport Department, the National Medical Commission, and the Bar Council of India noting our concerns, urging them to take the steps necessary to preserve the privacy of all affected parties and to recall these measures with immediate effect. 

Important Documents:

  1. IFF’s Letter to Delhi Education Department regarding facial recognition technology in schools dated September 11, 2024. (Link)
  2. IFF’s Letter to Delhi Transport Department on the biometric facial recognition software for bus drivers dated September 11, 2024. (Link)
  3. IFF’s Letter to National Medical Commission on their Aadhaar-enabled biometric attendance system dated September 11, 2024. (Link)
  4. IFF’s Letter to the Bar Council of India on their notification to introduce CCTV cameras and make biometric attendance mandatory in CLEs dated October 01, 2024. (Link)

Background

There has been a rising trend in the use of attendance systems which rely on biometric data such as facial features, retina, iris, or fingerprints to log the attendance of individuals. At the Union government level, the Ministry of Personnel, Public Grievances & Pensions introduced the Aadhaar-Enabled Biometric Attendance System (AEBAS) in 2014 and urged all Union government offices to comply with it in 2023. Various state governments have followed suit, deploying AEBAS or similar biometric systems and facial recognition tools for employees at government offices, hospitals, buses, and schools. Some of these even extend to office visitors and students sitting for exams. Below is a non-exhaustive list of examples.

In offices: In December 2022, the Andhra Pradesh government issued an order mandating a facial recognition based attendance app for all state government employees from January 2023. The app also recorded geo-location based on an employee’s role, meaning that attendance taken through facial recognition was accepted only if the employee was in or around the office premises. 

In schools: The trend of using biometric attendance systems extends to schools. In Telangana, the School Education Department mandated that teachers and non-teaching staff in government schools record “geo-attendance” through a mobile app. The state government also floated a tender last year to replace oral roll-calling with a facial recognition system for attendance and to acquire 500 to 6,000 biometric registration devices for recording the attendance of candidates at examination centers for a year.

Bus drivers: Recently, it was reported that Delhi’s transport department is planning to introduce biometric facial recognition software to record the attendance of bus drivers. The announcement neither addressed data security measures to protect drivers’ biometrics nor named the company that will work with the Government to maintain these datasets. 

There have also been instances of government agencies hiring private companies to implement schemes without any checks. A private firm in Chennai came under scrutiny in January 2023 for collecting sensitive biometric data of thousands of people to authenticate beneficiaries of a welfare scheme for the Greater Chennai Corporation, without the latter’s knowledge. A pertinent issue was the absence of any legal contract or tender under which the private company was collecting people’s biometric data.

Why should you care?

The increased prevalence of biometric technology to identify and monitor people raises serious human rights concerns. A major concern is that the Digital Personal Data Protection Act, 2023 (“DPDPA”), has not yet been operationalised. In the absence of an implemented data protection law, any data collection exercise can result in harmful breaches of privacy. Additionally, even once the law comes into operation through notified Rules, the conditions for data collection and processing by private and government agencies remain unclear and broad, exposing people’s personal data to privacy risks and threats of surveillance without any set remedial recourse. 

Unchecked use of biometric data gravely injures privacy

Biometric information is a unique form of personal, sensitive, and identifiable information. The use of facial recognition systems could result in additional screening measures for categories of people with historically lower facial recognition accuracy rates, such as women and people with darker skin; false negatives could result in individuals not being correctly identified. This can have repercussions on wages, as a person may be falsely marked absent, and can affect performance reviews over time. The nature of biometric data itself also magnifies the harm from any data breach or misuse. Unlike passwords, biometric data cannot be changed; once breached, the harm may be irrevocable. 

There need to be specific discussions about a higher level of protection for our facial information in general—which the current data protection regime patently lacks. The DPDPA does not classify ‘sensitive personal data’ as a distinct category needing additional safeguards and caution, unlike its earlier versions or even the Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011. In the 2011 Rules, biometric information was included in the definition of ‘sensitive personal data’ and given a special layer of protection by way of strict consent mechanisms, safety audits, prescribed security practices and procedures for handling such data, and so on. Global instruments, too, recognise the sensitive nature of biometric information such as facial data and the vulnerable position the processing of such data may leave data principals in. As the Council of Europe’s ‘Guidelines on facial recognition’ recognise, 

Considering the potential intrusiveness of these [facial recognition] technologies, legislators and decision makers have to ensure that an explicit and precise legal basis provides the necessary safeguards for the processing of biometric data. Such a legal basis will include the strict necessity and proportionality of their use and will take into consideration the vulnerability of the data subjects and the nature of the environment in which these technologies are used for verification purposes.

Therefore, until specific Rules under the DPDPA prescribe higher standards for processing sensitive information such as facial biometric data, tools such as AEBAS and facial recognition based attendance systems will operate without appropriate privacy safeguards.

Biometric attendance systems fail the Puttaswamy standard

In addition to biometric data, such biometric attendance systems also collect other information, such as geo-location data, which may likewise lead to harm. In some instances, these systems are being used on students who are minors. The use and propagation of such technology violates the fundamental right to privacy of students, teachers/invigilators, and employees, accorded to them vide Article 21 of the Indian Constitution. This can be tested through an assessment against the principles set forth by a nine-judge bench of the Supreme Court in K.S. Puttaswamy v. Union of India [(2017) 10 SCC 1] (“Puttaswamy”).

The first requirement to be satisfied by the State when it intervenes to encroach on privacy is the existence of a law. There is currently no legislative sanction for mandating the use of compulsory biometric attendance systems by either the State or any private entity. The Supreme Court, while deciding upon the validity of Aadhaar in Puttaswamy, noted that an executive notification does not satisfy the requirement of a valid law contemplated under the Puttaswamy principles. A valid law in this case would mean a law passed by Parliament which is just, fair, and reasonable; an encroachment upon a fundamental right cannot be sustained by an executive notification. As it stands, there is no specific law, policy, or Act of Parliament which governs the inception and operation of AEBAS or facial recognition systems. 

The judgment also laid down certain thresholds which have to be fulfilled to justify state intrusion into the right to privacy guaranteed to citizens. These thresholds are legality, necessity, proportionality, and procedural safeguards. The various biometric systems fail the legality threshold as their deployment does not take place within a defined regime of law, i.e. there is no anchoring legislation with a clear set of provisions for remedy. Further, they fail the thresholds of necessity (which requires justifying that the restriction on people’s privacy is needed in a democratic society) and proportionality (where the Government must show that the intrusion is proportionate to the need and that no less intrusive alternative can fulfil the mandate). 

In this particular case, it is difficult to ascertain whether a legitimate or proportionate objective is being fulfilled. These measures are purportedly being taken to ease the administrative burden of various processes and to curb unfair practices in recording attendance or disbursing salaries. It has not been demonstrably shown that conventional attendance systems suffer from duplication or errors; nor has the goal AEBAS is meant to serve been laid down in policy or law (beyond speculation that it eases human-resource management). The deployment of compulsory biometric systems over large segments of the population, for varied reasons, may be regarded as disproportionate to the objective sought to be achieved. Additionally, mere convenience cannot justify a restriction on privacy as necessary.

With regard to ‘proportionality’, the court, while rejecting the requirement of mandatory linkage of bank accounts with Aadhaar, noted that imposing such a restriction on the entire population, without any evidence of wrongdoing on their part, would constitute a disproportionate response. The same reasoning applies here: demanding biometric attendance from entire workforces without any urgent need is disproportionate. Rather, the government could adopt an alternative, less intrusive mechanism to achieve the stated objective, or seek to resolve the issues (if any) with conventional methods of collecting attendance. 

Exclusion risks

The use of biometric attendance systems can also lead to exclusion, where the system fails to mark an individual’s attendance. This can have serious consequences, especially if attendance is linked to the calculation of wages or salary. For example, the Aadhaar project was linked with several welfare schemes with a view to streamlining service delivery and removing false beneficiaries. However, a major problem encountered while implementing Aadhaar was authentication: while drawing their benefits, people have faced failures caused by poor connectivity, erratic electricity supply, biometric authentication errors, and more. This is especially problematic for people who are illiterate, have limited mobility, are senior citizens, or live in places where mobile and internet connectivity and electricity supply are erratic. Some exclusion also results from errors in the Aadhaar database itself, which aims to serve as the single source of truth.

Therefore, exclusion can happen for various reasons, such as error-prone apps and lack of access to the internet or mobile phone devices. These policies also contain no provisions for recourse and/or punishment in case of violations, nor do they have any procedural safeguards in place to prevent misuse.

Vulnerability of Aadhaar data and AEBAS

The government’s history of data breaches, specifically in the case of Aadhaar, calls into question the security of any future biometric system. Instances of individuals’ Aadhaar data being made publicly available and being accessible to government officials who lacked proper authorisation represent just some of the cybersecurity issues (here and here) that any future database would need to overcome. In our letter to the Computer Emergency Response Team (CERT-In) regarding a breach of Aadhaar data, we recalled numerous instances of Aadhaar data being leaked or breached in the recent past and called on the responsible Union and state level agencies to curb this dangerous trend. Furthermore, instances of fraud and identity theft involving biometric data could compromise any future system that relies on or integrates pre-existing facial or biometric data collected for Aadhaar. 

As per various news reports, Indian intelligence agencies have also cautioned about the vulnerabilities that people’s sensitive data is exposed to through these biometric attendance systems, and investigations have pointed to the potential misuse of data collected by the firms involved. Various state government employees have also opposed these facial recognition and biometric attendance systems on the grounds of poor network connectivity in some areas and the threat to privacy and personal data. They claim that the salaries of some employees were “wrongly” deducted based on data provided by these systems.

Our requests

We believe that the proliferation of AEBAS and facial recognition attendance systems, as well as the move to make biometric attendance compulsory in various organisations, patently violates the fundamental right to privacy of individuals and may lead to data maximisation. It not only fails to meet the purported objective of curbing unfair practices, but its compulsory use also entrusts sensitive information to state agencies ill-equipped to develop and maintain it securely, as past data breaches have shown. We urged the administering agencies to cease the use of such privacy invasive systems.


