
ABSTRACT

Facial Recognition Technology (FRT), as a subset of biometric surveillance, represents a potent convergence of artificial intelligence, state power, and personal data extraction. This research paper examines the constitutional tension arising from the deployment of facial recognition systems by Indian law enforcement without a robust legislative mandate. Anchored in the privacy principles laid down in Justice K.S. Puttaswamy v. Union of India, this study critiques the proliferation of FRT projects such as the National Automated Facial Recognition System (NAFRS) in the absence of statutory safeguards. The paper investigates the interplay of technological functionality and legal lacunae, particularly concerning data protection, transparency, proportionality, and due process. Comparative references to the EU’s General Data Protection Regulation (GDPR), EDPB Guidelines, and jurisprudence such as the Bridges case in the UK are used to illuminate normative best practices. The study outlines the systemic risks of biometric surveillance, including false positives, discriminatory biases, and chilling effects on democratic freedoms, while situating these in the broader Indian regulatory context under the Digital Personal Data Protection Act, 2023. The methodology involves doctrinal legal analysis, case law interpretation, and critical review of policy literature. This paper contributes to the urgent discourse on AI governance and constitutionalism by advancing a privacy-protective legal framework for FRT in India.

I. INTRODUCTION AND BACKGROUND

1. Introduction and Background of Research Problem

Biometric authentication has evolved from fingerprinting to facial analysis embedded with neural networks and high-resolution surveillance cameras. Facial recognition uses supervised learning models trained on massive datasets, often collected without informed consent or legal checks. FRT operates by measuring facial landmarks and matching them against reference datasets, a process susceptible to errors and demographic biases. It is increasingly deployed in airports, train stations, public streets, and consumer applications under the guise of security enhancement. States and private actors exploit FRT to monitor populations in real time, creating digital identities that individuals cannot control or opt out of[1].

In India, the National Crime Records Bureau’s NAFRS project proposes centralised, interlinked face databases for seamless search by police units nationwide. No enabling legislation regulates the scope, data collection mechanism, or oversight of NAFRS or related deployments. Pilot deployments in Delhi, Hyderabad, Chennai and Rajasthan lacked pre-legislative scrutiny or impact assessments on fundamental rights[2]. The absence of transparency or public consultation in FRT rollouts erodes citizen trust and undermines democratic accountability. FRT in India functions without the data minimisation, purpose limitation, or algorithmic audit protocols mandated in global privacy frameworks[3]. In Justice K.S. Puttaswamy v. Union of India, the Supreme Court affirmed privacy as intrinsic to dignity under Article 21 of the Constitution[4]. The Court introduced a four-prong test of legality, necessity, proportionality and procedural safeguards for any privacy intrusion. Facial recognition, by enabling covert, persistent, and mass-level surveillance, directly affects the right to informational self-determination. Algorithmic opacity in FRT obstructs meaningful challenge and redress by data principals under the standard of procedural due process. Puttaswamy’s recognition of informational privacy as part of decisional autonomy mandates state restraint in non-consensual biometric tracking[5].

The European Data Protection Board’s Guidelines 05/2022 warn against the use of FRT in public spaces without strict necessity and legal clarity[6]. The EU Fundamental Rights Agency (FRA) identifies FRT as a high-risk technology with chilling effects on freedom of speech and assembly[7]. Internationally, jurisdictions like San Francisco, Brussels, and Boston have imposed moratoriums or outright bans on facial recognition. Global civil society coalitions urge pre-emptive regulation, mandatory data protection impact assessments, and sunset clauses. The UN High Commissioner for Human Rights has called for a global moratorium on AI systems with serious human rights implications[8].

2. Research Objectives

The researcher has formulated the following research objectives:

1. To examine the constitutional compatibility of FRT deployment with the right to privacy under Indian law.

2. To assess statutory gaps and regulatory inadequacies in governing biometric surveillance by public authorities.

3. To compare Indian practices with global benchmarks in FRT governance, oversight, and proportionality.

4. To propose normative legal safeguards and legislative recommendations rooted in privacy jurisprudence and data protection principles.

3. Research Questions

The researcher has formulated the following legal research questions:

1) Does the current legal framework in India, including the Information Technology Act, 2000, and the Digital Personal Data Protection Act, 2023, sufficiently address the constitutional implications of facial recognition technology?

2) How can the principles of proportionality, necessity, and due process, as articulated in Justice K.S. Puttaswamy v Union of India (2017) 10 SCC 1, be applied to assess the legality of deploying FRT by law enforcement and administrative authorities?

3) In comparison with global standards, particularly the EU’s Law Enforcement Directive and EDPB guidelines, what lessons can Indian regulators learn and adopt for balancing security imperatives with privacy rights?

4) Does the use of FRT disproportionately impact vulnerable communities, thereby creating risks of discrimination and violating the right to equality under Article 14 of the Constitution?

5) What specific regulatory framework and judicial oversight mechanisms are necessary in India to reconcile technological progress with the protection of fundamental rights?

4. Research Hypotheses

The researcher has formulated the following legal research hypotheses:

1. FRT deployment without a legislative mandate contravenes the proportionality principle established in the Puttaswamy judgment.

2. The Indian regulatory architecture does not provide effective remedies or rights-based controls over biometric surveillance systems.

3. International best practices offer actionable models for regulating FRT within constitutional democracies.

4. A rights-based data governance framework can reconcile innovation with individual autonomy and democratic oversight.

5. Research Methodology

The study follows a qualitative doctrinal legal research methodology involving critical analysis of constitutional and statutory texts. Primary sources include Indian judgments, statutes such as the DPDPA 2023, and international legal instruments on data protection. Comparative insights are drawn from foreign case law, the EDPB Guidelines, and GDPR provisions on biometric data. The research engages in interdisciplinary analysis incorporating legal theory, technological principles, and human rights discourse. Secondary sources include journal articles, white papers, and policy briefs by legal scholars, technologists, and civil society bodies.

6. Literature Review

The European Union Agency for Fundamental Rights (2019) highlights the fundamental rights challenges of live facial recognition, noting risks of mass surveillance and discriminatory errors, and urges robust safeguards to prevent abuse of the technology[9]. The European Data Protection Board’s (2023) guidelines emphasise that FRT involves sensitive biometric data; they require strict necessity and proportionality for any law enforcement use and caution against deploying FRT without a clear legal basis or oversight[10]. The European Parliament (2021) called for a moratorium on police use of FRT in public spaces due to inaccuracies and privacy intrusions, and urged a ban on biometric mass surveillance technologies that undermine civil liberties[11]. Bridges v Chief Constable of South Wales Police (2020) was a landmark UK case in which the Court of Appeal held that police use of live FRT was unlawful for lack of clear legal authority and privacy safeguards[12]. In India, Justice K.S. Puttaswamy (Retd.) v Union of India (2017) recognised privacy as a fundamental right under the Constitution, forming the basis for testing surveillance technologies like FRT against the requirements of legality, necessity and proportionality[13]. Despite this recognition, India had no data protection law until the Digital Personal Data Protection Act 2023, and the Act’s efficacy in curbing FRT-related privacy risks and misuse by authorities remains untested[14].

Legal scholarship has similarly scrutinised FRT. Venkitesh M J (2024) compares FRT regulation in India and the UK, finding that the absence of a comprehensive data protection framework in India exacerbates privacy risks, and urges robust legal frameworks drawing on principles from Puttaswamy and Bridges[15]. Nesterova (2020) examines the global pushback against FRT, highlighting European regulators’ scrutiny under the GDPR and noting how moves like San Francisco’s ban in the US reflect a backlash against intrusive surveillance[16]. O’Flaherty (2020) underscores that FRT’s deployment can impact a spectrum of rights beyond privacy, including dignity, equality, free expression and assembly; he warns that even error-free use can chill democratic freedoms and calls for transparency, accountability and strict oversight[17]. Simonitis (2021) analyses the uncertain status of FRT under US constitutional law, flagging threats to privacy, anonymity and free association, and suggests interim guidelines to restrain law enforcement use until legislators or courts act[18]. Vedavalli et al. (2021) examine Indian police adoption of FRT for criminal investigations, acknowledging potential benefits but warning of unchecked surveillance, bias and function creep; they call for explicit regulation and independent oversight to prevent abuse[19].

II. FACIAL RECOGNITION TECHNOLOGY: NATURE, FUNCTIONING AND RISKS

1. Nature of FRT as Biometric Technology

Facial Recognition is a biometric modality that captures unique, immutable facial templates for computational comparison. Identification mode seeks to detect and match unknown faces against a database without user interaction or awareness. Verification mode confirms whether the claimed identity matches the live image of the individual provided at the point of access. Live FRT captures faces in real time through CCTV or drone footage and cross-checks against enrolled databases automatically. Automated FRT requires no human intervention during detection, matching, or alert generation and often lacks accountability trails.
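To make the operational difference between the two modes concrete, the following is a minimal sketch in Python. It assumes hypothetical face-embedding vectors and an arbitrary similarity threshold; none of the names or values reflect any deployed system. Verification is a single 1:1 comparison initiated by the individual, whereas identification is a 1:N search over an enrolled database requiring no participation, or even awareness, from the person being scanned.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face-embedding vectors, in [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(probe: np.ndarray, enrolled: np.ndarray, threshold: float = 0.6) -> bool:
    """1:1 verification: does the live image match the claimed identity?"""
    return cosine_similarity(probe, enrolled) >= threshold

def identify(probe: np.ndarray, gallery: dict[str, np.ndarray],
             threshold: float = 0.6) -> str | None:
    """1:N identification: search the whole enrolled database for the best
    match above the threshold; the data subject plays no active part."""
    best_id, best_score = None, threshold
    for person_id, template in gallery.items():
        score = cosine_similarity(probe, template)
        if score >= best_score:
            best_id, best_score = person_id, score
    return best_id
```

The asymmetry is visible in the code itself: verification answers a question the individual poses at a point of access, while identification scans every enrolled person, which is why live, automated deployments raise distinct privacy concerns.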

2. Technical Functioning

Training datasets for FRT are created through mass image scraping, including social media, ID databases, or surveillance footage. Machine learning algorithms detect nodal points and convert facial geometry into mathematical vectors for biometric matching. High accuracy under laboratory conditions does not translate into real-world reliability, especially for non-white, non-male subjects. Cloud-based facial databases pose cybersecurity and leakage risks, especially without encryption or access logging. Biases baked into algorithmic systems propagate structural discrimination and undermine fairness in law enforcement outcomes.
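As a rough illustration of how facial geometry becomes a matchable “template”, the sketch below converts detected nodal points into a normalised vector of pairwise distances. This is a simplified, hypothetical stand-in: production systems use learned deep embeddings rather than hand-crafted geometry, but the principle of reducing a face to a mathematical vector is the same.

```python
import numpy as np

def landmarks_to_template(landmarks: np.ndarray) -> np.ndarray:
    """Turn (x, y) coordinates of detected facial landmarks into a
    normalised feature vector of pairwise distances between points."""
    n = len(landmarks)
    dists = np.array([np.linalg.norm(landmarks[i] - landmarks[j])
                      for i in range(n) for j in range(i + 1, n)])
    return dists / np.linalg.norm(dists)  # scale-invariant template

# Example: five hypothetical nodal points (eye corners, nose tip, mouth corners).
face = np.array([[30, 40], [70, 40], [50, 60], [38, 80], [62, 80]], dtype=float)
template = landmarks_to_template(face)  # a 10-dimensional vector for matching
```

Once a face is reduced to such a vector, it can be stored, copied, and searched at scale, which is precisely why leakage from unencrypted cloud databases is so consequential.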

3. Risks and Limitations

False positives implicate innocent individuals and can lead to unlawful arrests, harassment, or denial of services. False negatives exclude rightful claimants from services like welfare delivery, access control, or verification-based entitlements. Studies show facial recognition performs poorly for darker-skinned women, raising risks of algorithmic exclusion and prejudice[20]. Use of FRT to monitor protests or track individuals generates chilling effects and suppresses democratic participation. Without meaningful opt-out mechanisms or notice, mass FRT deployment in public spaces destroys anonymity and freedom of movement.
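These error types are measurable, and measuring them separately for each demographic group is how studies such as Gender Shades and the NIST FRVT quantify bias. The sketch below, assuming hypothetical similarity scores and ground-truth labels, computes the false-match rate (innocent people flagged) and the false-non-match rate (rightful claimants rejected) at a chosen decision threshold.

```python
import numpy as np

def error_rates(scores, same_person, threshold):
    """False-match rate (FMR) and false-non-match rate (FNMR) at a threshold.
    scores: similarity score of each comparison trial;
    same_person: whether each trial truly compared the same individual."""
    scores = np.asarray(scores, dtype=float)
    same = np.asarray(same_person, dtype=bool)
    fmr = float(np.mean(scores[~same] >= threshold))   # false positives
    fnmr = float(np.mean(scores[same] < threshold))    # false negatives
    return fmr, fnmr

# Hypothetical illustration: a threshold tuned on one group's score
# distribution can produce sharply higher error rates on another group.
group_a = error_rates([0.9, 0.8, 0.3, 0.2], [1, 1, 0, 0], threshold=0.6)
group_b = error_rates([0.7, 0.5, 0.5, 0.4], [1, 1, 0, 0], threshold=0.6)
```

At the same threshold, Group A here yields zero errors while Group B suffers a 50% false-non-match rate, a toy version of the demographic disparity the cited studies document.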

III. CONSTITUTIONAL AND LEGAL FRAMEWORK IN INDIA

1. Constitutional Protections

Article 21 safeguards the right to privacy as an essential part of personal liberty. Facial recognition without legal backing infringes the dignity, liberty, and informational autonomy of individuals[21]. Article 14 guarantees equal protection, which facial profiling often violates. Algorithmic bias against marginalised groups results in unequal treatment and arbitrary classifications[22]. Articles 19(1)(a) and 19(1)(b) protect freedom of speech and assembly, both threatened by live FRT. A chilling effect emerges when citizens self-censor for fear of surveillance at public demonstrations[23].

2. Puttaswamy Judgment and Proportionality Doctrine

The Supreme Court in Justice K.S. Puttaswamy v. Union of India held privacy to be intrinsic to Article 21[24]. Any invasion of privacy must satisfy the standards of legality, necessity, and proportionality. The legality prong mandates an explicit statutory basis, which many FRT uses currently lack. Necessity requires demonstration of indispensability and the absence of less intrusive alternatives. Proportionality ensures that the harm to privacy is not excessive relative to the State’s objective. Procedural safeguards such as audits, data minimisation, and redress mechanisms are essential. FRT, when used without proper legislative procedure, risks breaching the Puttaswamy proportionality test[25].

3. Statutory Framework

The Information Technology Act, 2000 is inadequate for biometric mass surveillance. Section 43A offers limited redress only in cases involving negligent handling of personal data[26]. The Digital Personal Data Protection Act, 2023 introduces obligations on data fiduciaries. However, government exemptions under Section 17 permit wide discretion without meaningful safeguards[27]. The Act lacks specificity in handling real-time biometric surveillance or live FRT systems. Consent requirements are circumvented for ‘necessary’ state functions without independent scrutiny. No clarity exists on FRT’s status as “sensitive personal data” under the IT Rules, 2011. Live deployments such as the National Automated Facial Recognition System operate without statutory anchoring[28].

4. Judicial Developments and Emerging Cases

The Supreme Court upheld Aadhaar’s constitutional validity in K.S. Puttaswamy v. Union of India (Aadhaar) but limited its scope[29]. In Aadhaar, biometric retention was questioned, with the dissent cautioning against a surveillance-state architecture. In Internet Freedom Foundation v. Union of India, the Delhi High Court sought transparency on the Delhi Police’s use of FRT[30]. No final binding precedent has yet clarified the constitutionality of FRT under current Indian statutes. Judicial reluctance persists in proactively addressing gaps in the emerging surveillance infrastructure. The absence of judicially enforced standards allows continued opacity in FRT deployment.

IV. COMPARATIVE PERSPECTIVES: GLOBAL STANDARDS AND LESSONS FOR INDIA

1. European Union

The General Data Protection Regulation (GDPR) classifies biometric data as a special category requiring explicit consent[31]. The Law Enforcement Directive governs the processing of personal data by competent law enforcement authorities. The European Data Protection Board’s (EDPB) 2023 guidelines prohibit indiscriminate biometric surveillance in public places[32]. The guidelines emphasise strict necessity and proportionality, even for national security measures. The UK Court of Appeal in Bridges v. South Wales Police held the use of live FRT unlawful[33]. The Court found a violation of Article 8 of the European Convention on Human Rights. The lack of clear guidance, independent oversight, and impact assessments rendered the use of FRT unlawful in Bridges.

2. United States

The Fourth Amendment protects citizens against unreasonable searches and seizures. FRT use without a judicial warrant raises probable-cause concerns under Fourth Amendment scrutiny. Cities like San Francisco and Portland have imposed legislative bans on government use of FRT[34]. Federal law lacks comprehensive data privacy or facial recognition regulation. Patchwork regulation leads to inconsistent standards and civil liberties violations. First Amendment issues arise from the chilling effect of protest surveillance on public assembly.

3. International Human Rights Framework

The European Union Agency for Fundamental Rights (FRA) highlights the risk of over-policing minorities[35]. UN Special Rapporteurs have raised concerns about bulk biometric data collection and privacy infringements. International standards demand legality, legitimacy, necessity, proportionality, and safeguards in the use of surveillance technology. Comparative studies recommend transparency, public consultations, and democratic oversight as baseline principles. India lacks an independent data protection authority with enforcement powers, unlike EU regulators. International trends favour moratoriums on FRT until robust legal frameworks exist.

V. APPLICATION OF PROPORTIONALITY, NECESSITY AND DUE PROCESS TO FRT IN INDIA

1. Necessity Test

The State must show that FRT is indispensable, not merely convenient, for law enforcement. FRT must not be permitted where less intrusive alternatives such as conventional CCTV or fingerprinting suffice. Necessity requires empirical justification, not speculative or broadly generalised assertions. The absence of data on success rates or accuracy in Indian FRT use undermines necessity claims.

2. Proportionality Analysis

The State’s surveillance objective must be balanced against the individual’s right to privacy. Bulk collection of biometric data without suspicion undermines proportionality. FRT operates without limitation on the time, purpose, or scope of data collection. The disproportionate nature of the intrusion becomes evident in mass public deployment of live FRT. Proportionality also demands independent review and transparency of algorithmic criteria. The lack of impact assessments or audit mechanisms increases the risk of arbitrariness.

3. Due Process Requirements

A legislative mandate is fundamental to legitimise invasive technologies like FRT. Executive notifications or departmental circulars do not suffice as “law” under Puttaswamy. An independent oversight body must authorise and review the use of facial recognition systems. Audit trails, data logs, and the right to challenge false matches are part of due process. Victims of false identification must have access to remedies and damages. Real-time surveillance must adhere to constitutional morality and democratic accountability.

VI. IMPACT ON VULNERABLE COMMUNITIES AND EQUALITY CONCERNS

1. Discriminatory Outcomes

Facial recognition datasets often underrepresent darker skin tones and non-male features[36]. The resultant misidentification disproportionately affects women, transgender persons, and ethnic minorities. Cases from the US reveal Black men falsely arrested on the basis of flawed FRT matches[37]. Bias in training datasets perpetuates systemic discrimination in algorithmic policing. In India, marginalised groups are more likely to face surveillance without consent or oversight.

2. Constitutional Lens under Article 14

Targeted deployment in low-income or protest-prone neighbourhoods creates structural inequality. Equal protection is breached when surveillance burdens fall unequally on certain communities. FRT presence during protests discourages the exercise of democratic rights and civic participation. The State must justify how selective deployment does not amount to discriminatory profiling.

3. Empirical Insights

The US National Institute of Standards and Technology found higher error rates for non-white faces[38]. The UK’s Big Brother Watch reported 81% inaccuracy in live FRT matches by police. Such figures call into question the reliability and legality of Indian police adoption in the absence of audit data. In India, the lack of transparency and of demographic impact studies remains a serious concern. Global lessons warn against FRT expansion without accountability, especially in diverse democracies.

VII. REGULATORY AND OVERSIGHT MECHANISMS FOR INDIA

The Information Technology Act, 2000 does not provide tailored norms for biometric surveillance or automated facial recognition deployments by State actors or private entities[39]. The Digital Personal Data Protection Act, 2023 regulates data fiduciaries but leaves critical gaps concerning real-time biometric surveillance and live facial recognition systems[40]. No provision of the DPDP Act mandates differential safeguards based on the sensitivity of the data or on deployment in public versus private spheres. Neither statute explicitly mentions live facial recognition, despite its distinct operational risks and constitutional implications. India lacks any unified statutory definition of facial recognition technology, and the law does not operationally distinguish identification from verification models.

Judicial oversight remains weak in matters of surveillance, especially where agencies act under executive direction without a legislative mandate[41]. There is no system of prior independent approval or judicial review of FRT usage, in violation of the constitutional safeguards laid down in Puttaswamy[42]. There is an urgent need to establish independent oversight institutions, such as FRT commissions with statutory authority and cross-functional expertise. Mandatory Data Protection Impact Assessments must precede any deployment of facial recognition projects by public or private entities. Such assessments should be made public, contain algorithmic fairness audits, and demonstrate proportionality in accordance with constitutional jurisprudence.

Users whose biometric data is collected must be clearly informed, and their consent must be purpose-specific and freely revocable. Provisions for grievance redressal against wrongful identification or algorithmic misclassification must be institutionalised by law. Users must have clear rights to access, correct, and delete their facial recognition data under a functional rights-based framework. Wrongful storage, sharing, or profiling without a legal basis must be made punishable with strong statutory penalties to deter surveillance misuse. The accountability of both data fiduciaries and government departments must be ensured through routine audits and transparent reporting obligations.

VIII. RECOMMENDATIONS AND THE WAY FORWARD

India must draft a dedicated law regulating facial recognition technology, distinct from the DPDP Act and general IT surveillance norms. Such a law must define facial recognition, distinguish between its uses, and provide clear boundaries on deployment, retention, and access. It must incorporate global standards from the EU’s GDPR and Law Enforcement Directive, along with guidance from the EDPB’s 2023 guidelines on FRT[43]. It should also reflect regulatory lessons from city-level bans in the US and community opposition to opaque deployments[44].

All future litigation on FRT must be adjudicated using the four-pronged test laid down in Justice K.S. Puttaswamy v. Union of India[45]. Each deployment must pass the tests of legality, necessity, proportionality, and procedural safeguards through judicial application. The Supreme Court must review existing deployments such as the National Automated Facial Recognition System. Schemes without parliamentary backing or audit mechanisms must be stayed pending constitutional scrutiny.

Data protection should be embedded into the design of FRT systems, including anonymisation, encryption, and minimal retention periods. Independent audits of the AI systems used in FRT must become mandatory, with publicly available reports on accuracy and bias metrics. Public consultations and stakeholder engagement with civil society, academia, and technologists must precede policy formation. A multi-stakeholder regulatory council must be constituted to monitor, evaluate, and periodically recommend updates to FRT regulation.

IX. CONCLUSION

Facial recognition intersects with several fundamental rights, including privacy, equality, expression, and due process. India’s constitutional jurisprudence demands that any restriction of these rights be backed by law and subject to necessity. Current legislative and regulatory frameworks inadequately address the specificity and intrusiveness of facial recognition. Comparative insights from the EU, the US, and UN bodies suggest a cautious, rights-based approach to this emerging technology. The risks of mass surveillance and wrongful targeting far outweigh efficiency benefits if FRT is deployed without legal safeguards. India must urgently enact statutory regulation, judicial mechanisms, and institutional safeguards tailored to facial recognition. Legal reforms must ensure that constitutional values are upheld even as technological innovation expands into biometric governance.

REFERENCES

1. European Union Agency for Fundamental Rights, Facial Recognition Technology: Fundamental Rights Considerations (FRA 2019).

2. Ameen Jauhar, ‘Indian Law Enforcement’s Ongoing Usage of Automated Facial Recognition Technology – Ethical Risks and Legal Challenges’ (Vidhi Centre for Legal Policy, 2021) https://vidhilegalpolicy.in/research/indian-law-enforcements-ongoing-usage-of-automated-facial-recognition-technology-ethical-risks-and-legal-challenges/ accessed 8 September 2025.

3. Venkitesh MJ, ‘Biometric Facial Recognition Technology Through Lens of Right to Privacy’ (2024) 6(3) International Journal of Legal Science and Innovation 828.

4. Justice KS Puttaswamy (Retd) v Union of India (2017) 10 SCC 1.

5. European Data Protection Board, ‘Guidelines 05/2022 on the Use of Facial Recognition Technology in the Area of Law Enforcement’ (Version 2.0, 26 April 2023) https://edpb.europa.eu accessed 8 September 2025.

6. UN Office of the High Commissioner for Human Rights, ‘The Right to Privacy in the Digital Age’ UN Doc A/HRC/48/31 (2021).

7. European Union Agency for Fundamental Rights, Facial Recognition Technology: Fundamental Rights Considerations in the Context of Law Enforcement (FRA Focus Paper, 2019).

8. European Parliament, ‘Resolution of 6 October 2021 on Artificial Intelligence in Criminal Law and Its Use by the Police and Judicial Authorities in Criminal Matters’ [2020/2016(INI)].

9. R (Bridges) v Chief Constable of South Wales Police [2020] EWCA Civ 1058.

10. Digital Personal Data Protection Act 2023 (India).

11. Irena Nesterova, ‘Mass Data Gathering and Surveillance: Fight Against Facial Recognition Technology in Globalised World’ (2020) SHS Web of Conferences 74, 03006.

12. Michael O’Flaherty, ‘Facial Recognition Technology and Fundamental Rights’ (2020) 6(2) European Data Protection Law Review 170.

13. Mark Simonitis, ‘Facial Recognition Technology and Constitution’ (2021) 2(2) Notre Dame Journal on Emerging Technologies 357.

14. Priya Vedavalli and others, ‘Facial Recognition Technology in Law Enforcement in India: Concerns and Solutions’ (Data Governance Network Working Paper 16, 2021) https://datagovernance.org/report/facial-recognition-technology-in-law-enforcement-in-india-concerns-and-solutions accessed 8 September 2025.

15. Joy Buolamwini and Timnit Gebru, ‘Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification’ (2018) 81 Proceedings of Machine Learning Research.

16. Anushka Jain, ‘The Chilling Effect of Facial Recognition Surveillance’ (Internet Freedom Foundation, 2021) https://internetfreedom.in/the-chilling-effect-of-facial-recognition-surveillance/ accessed 8 September 2025.

17. Information Technology Act 2000 (India) s 43A.

18. Digital Personal Data Protection Act 2023 (India) s 17.

19. Internet Freedom Foundation, ‘Project Panoptic: Surveillance Mapping in India’ (2022) https://internetfreedom.in/panoptic/ accessed 8 September 2025.

20. K.S. Puttaswamy v Union of India (Aadhaar) (2019) 1 SCC 1.

21. Internet Freedom Foundation v Union of India, W.P.(C) 1072/2020 (Delhi HC).

22. Regulation (EU) 2016/679 (General Data Protection Regulation), art 9.

23. Clare Garvie, The Perpetual Line-Up: Unregulated Police Face Recognition in America (Georgetown Law Center on Privacy & Technology, 2016) https://www.perpetuallineup.org/ accessed 8 September 2025.

24. Kashmir Hill, ‘Wrongfully Accused by an Algorithm’ The New York Times (24 June 2020) https://www.nytimes.com/2020/06/24/technology/facial-recognition-arrest.html accessed 8 September 2025.

25. Patrick Grother, Mei Ngan and Kayee Hanaoka, ‘Face Recognition Vendor Test (FRVT): Part 3 – Demographic Effects’ (National Institute of Standards and Technology, 2019) https://nvlpubs.nist.gov/nistpubs/ir/2019/NIST.IR.8280.pdf accessed 8 September 2025.

26. PUCL v Union of India (1997) 1 SCC 301.

27. Clare Garvie, ‘Garbage In, Garbage Out: Face Recognition on Flawed Data’ (Georgetown Law Center on Privacy & Technology, 2019) https://www.flaweddata.com/ accessed 8 September 2025.

Notes

[1] European Union Agency for Fundamental Rights, Facial Recognition Technology: Fundamental Rights Considerations (FRA, 2019) 3.

[2] Ameen Jauhar, ‘Indian Law Enforcement’s Ongoing Usage of Automated Facial Recognition Technology – Ethical Risks and Legal Challenges’ (Vidhi Centre, 2021).

[3] Venkitesh M J, ‘Biometric Facial Recognition Technology Through Lens of Right to Privacy’ (2024) 6(3) IJLSI 828.

[4] Justice K S Puttaswamy v Union of India (2017) 10 SCC 1.

[5] ibid [647] (Chandrachud J).

[6] European Data Protection Board, ‘Guidelines 05/2022 on Facial Recognition in Law Enforcement’ (2023) 5–6.

[7] FRA, Facial Recognition Technology: Fundamental Rights Considerations (n 1) 9.

[8] UN OHCHR, ‘The Right to Privacy in the Digital Age’ A/HRC/48/31 (2021).

[9] European Union Agency for Fundamental Rights, Facial Recognition Technology: Fundamental Rights Considerations in the Context of Law Enforcement (Focus Paper, 2019).

[10] European Data Protection Board, Guidelines 05/2022 on the Use of Facial Recognition Technology in the Area of Law Enforcement (Version 2.0, 26 April 2023).

[11] European Parliament resolution of 6 October 2021 on artificial intelligence in criminal law and its use by police and judicial authorities in criminal matters (2020/2016(INI)).

[12] R (Bridges) v Chief Constable of South Wales Police [2020] EWCA Civ 1058.

[13] Justice K.S. Puttaswamy (Retd.) v Union of India (2017) 10 SCC 1.

[14] Digital Personal Data Protection Act 2023 (India).

[15] M.J. Venkitesh, ‘Biometric Facial Recognition Technology through Lens of Right to Privacy’ (2024) 6(3) International Journal of Legal Science and Innovation 828.

[16] Irena Nesterova, ‘Mass Data Gathering and Surveillance: Fight Against Facial Recognition Technology in Globalised World’ (2020) SHS Web of Conferences 74, 03006.

[17] Michael O’Flaherty, ‘Facial Recognition Technology and Fundamental Rights’ (2020) 6(2) European Data Protection Law Review 170.

[18] Mark Simonitis, ‘Facial Recognition Technology and Constitution’ (2021) 2(2) Notre Dame Journal on Emerging Technologies 357.

[19] Priya Vedavalli et al, ‘Facial Recognition Technology in Law Enforcement in India: Concerns and Solutions’ (2021) Data Governance Network Working Paper 16.

[20] Joy Buolamwini and Timnit Gebru, ‘Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification’ (2018) 81 Proceedings of Machine Learning Research.

[21] Justice K.S. Puttaswamy v Union of India (2017) 10 SCC 1.

[22] ibid.

[23] Anushka Jain, ‘The Chilling Effect of Facial Recognition Surveillance’ (Internet Freedom Foundation, 2021).

[24] Justice K.S. Puttaswamy v Union of India (2017) 10 SCC 1.

[25] ibid.

[26] Information Technology Act 2000, s 43A.

[27] Digital Personal Data Protection Act 2023, s 17.

[28] Internet Freedom Foundation, ‘Project Panoptic: Surveillance Mapping in India’ (2022).

[29] K.S. Puttaswamy v Union of India (Aadhaar) (2019) 1 SCC 1.

[30] Internet Freedom Foundation v Union of India, W.P.(C) 1072/2020 (Delhi HC).

[31] Regulation (EU) 2016/679 (General Data Protection Regulation), art 9.

[32] EDPB, ‘Guidelines 05/2022 on the Use of Facial Recognition Technology in the Area of Law Enforcement’ (2023).

[33] Bridges v South Wales Police [2020] EWCA Civ 1058.

[34] Clare Garvie, ‘The Perpetual Line-Up’ (Georgetown Law, 2016).

[35] FRA, ‘Facial Recognition Technology: Fundamental Rights Considerations’ (2019).

[36] Joy Buolamwini and Timnit Gebru, ‘Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification’ (2018).

[37] Kashmir Hill, ‘Wrongfully Accused by an Algorithm’ New York Times (24 June 2020).

[38] Patrick Grother et al, ‘Face Recognition Vendor Test (FRVT)’ (NIST 2019).

[39] Information Technology Act 2000 (India).

[40] Digital Personal Data Protection Act 2023 (India).

[41] PUCL v Union of India (1997) 1 SCC 301.

[42] Justice K S Puttaswamy v Union of India (2017) 10 SCC 1.

[43] European Data Protection Board, ‘Guidelines 05/2022 on the Use of Facial Recognition Technology in the Area of Law Enforcement’ (Version 2.0, 2023).

[44] Clare Garvie, ‘Garbage In, Garbage Out: Face Recognition on Flawed Data’ (Georgetown Law Center on Privacy & Technology, 2019).

[45] Justice K S Puttaswamy v Union of India (2017) 10 SCC 1.
