Technology has constantly posed challenges to law. While technological advances have aided governments across the world in the efficient discharge of their functions, especially the prevention, detection and investigation of crimes, citizens find their valuable rights being sacrificed in the deployment of these technologies.
One such advancement is Facial Recognition Technology (FRT), which, as its name suggests, helps identify individuals by comparing and matching a photograph or video recording against a database containing images of individuals.
Law enforcement agencies in various countries currently use it to identify and track down crime suspects, missing individuals, unidentified dead bodies and so on. However, it raises serious privacy concerns, among various other rights issues, due to its invasive and non-consensual nature.
India recently witnessed an uproar against the collection and storage of biometric information like fingerprints and iris scans for the Aadhaar database. The government was accused of creating a surveillance state and invading the privacy of individuals.
As the National Crime Records Bureau of India (NCRB) gears up to establish an Automated Facial Recognition System (AFRS) to improve the criminal identification and verification process, we have yet another state-deployed biometric technology to worry about.
How does Facial Recognition Technology (FRT) work?
FRT works by creating a unique map of an individual’s face. This is done by identifying distinct points on the face which do not change much with age, such as the eye sockets or the shape of the nose. Once the mapping is done, any captured image of the individual, be it a digital image or a video shot, can be compared with a database of photographs, and if a matching facial pattern is found, the identity of the individual is established.
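The matching step described above can be sketched in miniature. In this hypothetical example, each face map is reduced to a short vector of numbers (real systems use much larger, machine-learned representations), and a captured face is identified by finding the nearest enrolled vector within a match threshold. All names and values here are illustrative assumptions, not any actual system's data.

```python
import math

# Hypothetical enrolled database: each face is reduced to a small vector
# of measurements (e.g. relative distances between facial landmarks).
# Real systems use deep-learning embeddings of 128+ dimensions.
DATABASE = {
    "person_a": [0.42, 0.87, 0.31, 0.55],
    "person_b": [0.10, 0.64, 0.78, 0.22],
    "person_c": [0.45, 0.85, 0.29, 0.58],
}

def euclidean(u, v):
    """Straight-line distance between two face vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def identify(probe, database, threshold=0.1):
    """Return the closest enrolled identity, or None if nobody is
    within the match threshold."""
    best_id, best_dist = None, float("inf")
    for identity, template in database.items():
        d = euclidean(probe, template)
        if d < best_dist:
            best_id, best_dist = identity, d
    return best_id if best_dist <= threshold else None

# A captured face close to person_c's template is matched...
print(identify([0.44, 0.86, 0.30, 0.57], DATABASE))  # person_c
# ...while an unfamiliar face returns no match.
print(identify([0.90, 0.10, 0.95, 0.05], DATABASE))  # None
```

The threshold is the crucial design choice: set it too loosely and innocent people are matched wrongly; set it too strictly and genuine matches are missed, which is one reason real deployments report high error rates.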
FRT is currently used as a security feature in offices, mobile phones and even airports; by businesses to gauge customer preferences; and, most importantly, in the criminal investigation process. The last use has been particularly controversial given the State’s role in it.
What is the NCRB Proposal?
The NCRB on 28th June, 2019 released a ‘request for proposal’ inviting bids to establish an Automated Facial Recognition System (AFRS) in the country. AFRS is intended to help in identification and verification of individuals from photos, digital images, digital sketches and even video shots by comparing the facial features with the facial images contained in an existing database.
Earlier, facial recognition was done manually in India. AFRS will make facial recognition easier, even enabling the identification of an individual in a crowd. Its stated objective is to help identify criminals, missing children/persons, unidentified dead bodies and unknown traced children/persons.
The AFRS will be integrated with other biometric systems like the Automated Fingerprint Identification System so as to generate comprehensive biometric authentication reports of an individual. It will be a centralized web application hosted by NCRB and will be accessible to all police stations in the country.
The database against which the search would be made would contain photographs available in the databases of Crime and Criminal Tracking Network & Systems (CCTNS), Interoperable Criminal Justice System (ICJS), prisons, Immigration, Visa and Foreigners’ Registration & Tracking (IVFRT), the Khoya Paya Portal of Ministry of Women and Child Development which contains details of missing children, State or National Automated Fingerprint Identification System or any other image database available with police/other entity.
Photographs from newspapers, raids and sketches, as well as those sent in by members of the public, etc. would also be added to the database. Most importantly, the system is envisaged to capture face images from live CCTV feeds and generate alerts if a blacklist match is found.
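The blacklist-alert mechanism envisaged in the proposal can be sketched as follows. This is a toy illustration under assumed data: faces extracted from a CCTV frame are each compared against a watchlist of face vectors, and any match within a threshold produces an alert. No detail here comes from the actual RFP beyond the alert-on-match idea.

```python
import math

# Hypothetical watchlist of face vectors (illustrative values only).
BLACKLIST = {
    "suspect_1": [0.42, 0.87, 0.31],
    "suspect_2": [0.10, 0.64, 0.78],
}

def distance(u, v):
    """Straight-line distance between two face vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def alerts_for_frame(faces_in_frame, blacklist, threshold=0.05):
    """Return (detected face, matched identity) pairs for every face in
    the frame that falls within the threshold of a blacklist entry."""
    alerts = []
    for face in faces_in_frame:
        for identity, template in blacklist.items():
            if distance(face, template) <= threshold:
                alerts.append((face, identity))
    return alerts

# One of the two faces in this simulated frame triggers an alert.
frame = [[0.41, 0.88, 0.30], [0.70, 0.20, 0.50]]
print(alerts_for_frame(frame, BLACKLIST))
```

Note what the loop implies: every face in every frame must be mapped and compared, whether or not its owner is on any list. This is the technical root of the concern, discussed below, that such a system cannot confine itself to tracking criminals.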
What are the concerns that AFRS raises?
NCRB’s proposal has evoked serious apprehensions regarding citizens’ privacy and freedom. Though AFRS’s ability to identify individuals and track criminals is laudable, the Orwellian system of surveillance that it threatens to create impels us to question its legitimacy.
The implementation of AFRS, with its ability to receive real-time input from live CCTV cameras for processing, creates a situation wherein each time we walk past a CCTV camera, our facial features are recorded and compared with a large database of photographs.
What makes AFRS different and more dangerous than other biometric systems of identification like fingerprint and iris scans is its ability to operate without the knowledge and consent of targeted individuals. This continuous covert monitoring and surveillance of individuals puts into jeopardy the fundamental right to privacy of individuals.
In fact, the Supreme Court, in its landmark judgment in Justice K.S. Puttaswamy (Retd.) v. Union of India [MANU/SC/1044/2017], recognized that a person has an expectation of privacy even in public spaces. Any invasion of the right to privacy must now pass the three-fold test laid down by the Supreme Court in this case, viz. legality, legitimate state aim and proportionality.
The requirement of legality postulates the existence of a law authorizing the restriction on the privacy of individuals. The AFRS does not have the backing of any law to govern and regulate its operation. Sadly, we do not even have a data protection law to protect the personal data of individuals.
The second requirement mandates that an invasion of a person’s privacy can be justified only when there is a legitimate state aim. AFRS, with its stated objective of aiding criminal investigation, may fulfill this requirement, as the Supreme Court has recognized the prevention of crime as a legitimate state aim.
However, if the government goes beyond this stated objective and uses AFRS as a surveillance tool to target specific groups of persons, as many apprehend, then it will fail the test of legitimate state aim.
The last and most important requirement is that of proportionality, which stipulates that the restriction on the privacy of individuals must not be disproportionate to the objective sought to be achieved.
Though AFRS may facilitate identification of criminals, missing children and unidentified bodies, is real-time monitoring using live CCTV feeds and collection of photographs from “any image database available with police/any entity” really called for? Such a system makes each and every one of us a potential criminal, by creating a biometric map of our face and comparing it with the vast database to find a potential match.
Though the Ministry of Home Affairs clarified on July 11 that AFRS will be integrated only with the CCTNS (which contains a database of criminals, suspects, prisoners, missing individuals and unidentified dead bodies) and that it will only track criminals, the ‘request for proposal’ calls for integration with several databases, including the IVFRT database.
Moreover, a system like AFRS cannot restrict its operation to only tracking criminals to achieve even its well-intended objectives. Recording and mapping every individual is necessary for the system to work. Apart from privacy, AFRS has the potential to restrict a citizen’s freedom of expression and freedom of assembly and association.
Due to constant scrutiny, a person may refrain from various activities which, though perfectly legal, may invoke the ire of the government or a powerful majority community. For instance, AFRS will enable the police to identify those who participate in a public protest or strike, and this can lead to further surveillance and harassment. In this way, the democratic right to dissent may be muzzled.
To add to all these worries are the inaccuracy and bias in the working of Facial Recognition Technology reported across the world. In August 2018, the accuracy rate of a facial recognition system used by the Delhi police was reported to be only 2%. Similarly low accuracy and wrongful identification of individuals by FRT have been reported in other countries as well.
It has also been observed that facial recognition algorithms are better at recognizing men and fair-skinned people but falter when it comes to identifying women, children and dark-skinned people. NCRB’s move to conceptualize the Automated Facial Recognition System comes amidst growing protests in countries like the US, the UK and China against the use of this technology.
In China, a facial recognition system is reported to have been deployed to identify and target the minority ethnic community of Uighurs, and this invoked widespread condemnation. It would be prudent if India took a cue from the experience of other countries with FRT while devising its AFRS.
What is the current status of data protection law?
The recording, storage and processing of a biometric feature of individuals, i.e. facial features, is sought to be undertaken despite India not having a data protection law to safeguard the informational privacy of individuals and protect their personal data from misuse. This absence of safeguards makes AFRS a potent tool for privacy abuses. Though the Personal Data Protection Bill, 2018 was drafted by the government-appointed Srikrishna Committee, it has not yet been tabled in Parliament.
The Bill, in its present form, provides various safeguards in collecting, storing and processing personal data of individuals so as to protect their privacy. S.2(21) of the Bill defines ‘harm’ that may be caused due to the failure to comply with its provisions to include any restriction suffered, directly or indirectly, on speech or movement or any other action due to a fear of being observed or surveilled and also any observation or surveillance that is not reasonably expected by the individual. It is exactly these harms that AFRS is apprehended to cause.
Though the State is granted wide exemptions from compliance with a majority of the provisions of the Bill, S.43 does not allow the State to process personal data for the purpose of prevention, detection and investigation of any offence unless such processing is authorized by a law of Parliament or a State Legislature and is necessary for, and proportionate to, the interests being achieved. It is clear from this that had the Bill become an Act, the AFRS would not stand its scrutiny.
The way forward…
The massive surveillance regime that AFRS is capable of establishing bears an uncanny resemblance to the telescreens that keep citizens under constant watch in George Orwell’s dystopian novel 1984. Despite the good intentions that may have been behind its introduction, one cannot remain blind to the dangers that AFRS, in its present form, poses to the privacy, security and freedom of individuals. As the UK news publication The Guardian commented, facial recognition is a danger to democracy.
The perception that technology always improves criminal investigation needs to be reassessed, given the inaccuracies that facial recognition algorithms are reported to suffer from. Being wrongly identified as a criminal is even more menacing than being under constant monitoring. Either the scope of operation of AFRS must be restricted and strong regulations formulated to prevent its misuse, or the system must be discarded altogether, since it does more harm than good.
-This article is brought to you in collaboration with Merrin Muhammed from National University of Advanced Legal Studies, Kochi.