Big Question: School Monitoring App, Privacy and Security Issues!

Published by
Partha Pratim Mazumder
The foremost issue with Facial Recognition Technology (FRT) is that the technology is inaccurate, and the app also demands total access to each subscriber's personal details.


The last time you used an app, did you read its privacy policy before clicking 'I Agree'? If not, don't worry: such policies are not designed for easy reading. With reams of legal jargon, most are as difficult to read as the Harvard Law Review. A human-centric study across India found that even people who could not read or write, once made aware of what they were consenting to, cared deeply about it. They wanted a fair chance to assess the trade-offs but felt they did not usually have such a choice. Online "consent" is, therefore, a false choice for most Indians. Yet consent is also the fulcrum of India's fast-growing data ecosystem.

Recently, SSA (Samagra Sikshya, Assam) directed the teachers of all Government and provincialised schools in the State to install the RIIMS mobile application. RIIMS is not a government-owned app; it belongs to a private company, RNIT Solutions and Services Private Ltd. When the RIIMS app is downloaded, it seeks permission to access private information. In the future, this application is meant to bring the SDMIS (Students' Database Management Information System), the TDMIS (Teachers' Database Management Information System) and AI (artificial intelligence)-based attendance monitoring through facial recognition onto a single platform. In the process, it would jeopardise the privacy and security of teachers and students. Why are teachers always the ones targeted? On what basis does the Education Department compel a teacher to upload his or her Aadhaar card and bank details to a private company merely to install the app? The app also demands total access to each subscriber's personal details. Is this not a violation of privacy rights? Teachers have no objection to downloading the government-owned app, Diksha.

The foremost issue with Facial Recognition Technology (FRT) is that the technology is inaccurate. Complete accuracy in finding matches has not been achieved even in ideal laboratory conditions, and thus deploying this technology in the real world, where environmental factors play a major role in the quality of images, brings the harms of misidentification (false positives) and failure to identify (false negatives). The National Institute of Standards and Technology (NIST) has extensively tested FRT systems for 1:1 verification and 1:many identification, and has examined how the accuracy of these systems varies across demographic groups. Research has shown that FRT error rates vary not just by race but also by gender, with the error rate being highest when identifying women of colour. Teachers' Right to Privacy will be severely impacted if such unregulated use of FRT is allowed to continue. The damage will not be limited to the Right to Privacy alone, as various other rights depend on the Right to Privacy for their full realisation.
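As a rough illustration of what these error rates mean, here is a minimal sketch in Python. The similarity scores, the decision threshold and the demographic group labels below are all hypothetical and are not drawn from NIST data or from any real FRT system; the sketch only shows how a 1:1 verification system's false matches (false positives) and false non-matches (false negatives) can be counted and compared across groups.

```python
# Minimal sketch: per-group error rates for a hypothetical 1:1 face verification system.
# All scores, the threshold and the group labels are made up for illustration only.

from collections import defaultdict

# Each trial: (similarity_score, same_person?, demographic_group)
trials = [
    (0.91, True,  "group_a"), (0.42, True,  "group_a"), (0.88, False, "group_a"),
    (0.35, False, "group_a"), (0.91, True,  "group_b"), (0.55, True,  "group_b"),
    (0.62, False, "group_b"), (0.30, False, "group_b"),
]

THRESHOLD = 0.6  # scores at or above this count as a "match" (hypothetical value)

counts = defaultdict(lambda: {"fp": 0, "fn": 0, "impostor": 0, "genuine": 0})
for score, same_person, group in trials:
    predicted_match = score >= THRESHOLD
    if same_person:
        counts[group]["genuine"] += 1
        if not predicted_match:          # failure to identify a genuine user (false negative)
            counts[group]["fn"] += 1
    else:
        counts[group]["impostor"] += 1
        if predicted_match:              # misidentification of a different person (false positive)
            counts[group]["fp"] += 1

for group, c in counts.items():
    fmr = c["fp"] / c["impostor"]    # false match rate (false positives)
    fnmr = c["fn"] / c["genuine"]    # false non-match rate (false negatives)
    print(f"{group}: false match rate={fmr:.0%}, false non-match rate={fnmr:.0%}")
```

Comparing these two rates group by group, as NIST does at much larger scale, is what reveals whether a system misidentifies some demographic groups far more often than others.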

This was also the view of the Supreme Court bench that held the Right to Privacy to be a Fundamental Right under Article 21 of the Constitution of India, and that any intrusion into privacy must satisfy the thresholds of legality, necessity and proportionality. At present, none of the ongoing FRT projects in India fulfils these thresholds. If the use of FRT continues unregulated, it will have a chilling effect on the Right to Freedom of Speech and Expression, as well as on the rights to protest, form associations and move freely throughout the territory of India, as fear of identification and retaliation by the state would deter individuals from exercising them. FRT will therefore affect not only the privacy of individuals, but also their personal liberties and autonomy.
