By — Srisoniya Subramoniam
Abstract
The Digital Personal Data Protection Rules, 2025 provide a comprehensive framework for data protection and mark a significant step in Indian legal history. Recent developments in the digital sphere have heightened the need for such a framework, especially for children and persons with disabilities, who are among the most vulnerable and have the least authority over their own data. The Rules are normatively strong: they mandate parental consent and prohibit behavioural tracking and targeted advertising directed at children, thereby recognising the need for robust guardianship of both children and persons with disabilities. Yet the framework for these classes of persons raises serious concerns regarding autonomy, dignity, and practical enforceability. This article appraises the DPDP Rules’ special protections for vulnerable groups but argues that the framework favours a paternalistic, risk-avoidance model over a rights-based, empowerment-oriented model of data protection. It therefore examines the structural gaps and oversights in the design and application of the DPDP framework as it applies to the very groups the Rules seek to protect.
Introduction
Digital governance increasingly requires the law to mediate between protection and participation. Nowhere is this tension more visible than in the regulation of children’s and vulnerable persons’ data. As digital platforms become central to education, socialisation, healthcare, and civic engagement, data protection frameworks must strike a delicate balance between shielding vulnerable individuals from exploitation and preserving their autonomy and agency in digital spaces.
India’s Digital Personal Data Protection Rules, 2025 (hereinafter the ‘DPDP Rules’) are an appreciable effort to address this challenge, laying down special provisions to protect children and persons with disabilities in the digital age. These provisions are widely viewed as a progressive advancement in India’s privacy protection regime, reflecting a consolidation of global standards on surveillance, targeted advertising, and algorithmic and digital harm. A closer look, however, reveals that they are rooted less in empowerment and more in precautionary paternalism.
This article argues that while the DPDP Rules rightly recognise vulnerability as a relevant regulatory category, they conceptualise it in overly broad and static terms. In doing so, the Rules risk infantilising adolescents, over-securitising disability, and imposing compliance burdens that may paradoxically intensify data collection and surveillance. The discussion proceeds by analysing, first, the framework governing children’s data and, second, the regime applicable to persons with disabilities and assisted consent.
Children as Data Subjects: Protection Without Autonomy?
Children are examined first because data protection regimes across jurisdictions treat them as the primary subjects of concern, to be governed and managed. This additional care for one category of individuals results in extensive and restrictive regulation that is absent for others.
A. The Blanket Age Threshold and the Denial of Evolving Capacity
The DPDP Rules treat all individuals below the age of eighteen as incapable of providing valid consent for the processing of their personal data, mandating verifiable parental consent in every case. This may appear to be a simple administrative protocol, but it fails to account for how children’s capacities evolve over time. It arbitrarily categorises everyone under eighteen as incapable of consent, even though adolescents, who make up a large portion of that age group, already exercise independent agency online. While all minors require some guidance, it does not follow that every action they take must be governed. The provision in the DPDP Act collapses the distinction between a six-year-old child and a seventeen-year-old teenager; this one-size-fits-all conception of vulnerability is ill-suited to the current generation and requires serious reform.

The blanket threshold also contradicts existing international human rights frameworks, which recognise children’s decision-making capacity and its progression over time. Allowing a child a measure of freedom is vital to developing capability and independence, yet the DPDP Act, arguably reinforced by prevailing Indian social and cultural values, forecloses it. The result is a regulatory structure that prioritises formal protection over lived digital realities, potentially excluding teenagers from meaningful participation in online educational platforms, mental health resources, and peer communities. More importantly, this approach strips children of rights and portrays them as objects of parental control in the digital sphere.
B. Verifiable Parental Consent and the Privacy Paradox
The requirement of “verifiable parental consent” is presented as a safeguard against exploitative data practices, but the system has flaws that are easy to overlook. The Rules remain largely silent on the methods of verification, the technologies to be employed, and the data minimisation standards governing the consent process. This ambiguity risks protecting children’s privacy at the cost of additional data collection: intrusive identity verification, biometric authentication, additional phone numbers, emails or other logins, and document uploads. Each new piece of information added to a database creates a further privacy and security risk, so in attempting to shield children, the framework may inadvertently expand the surveillance apparatus. Moreover, such consent-centric models assume that parents are literate, cautious, and diligent. They disregard scenarios in which parents are neglectful, coercive, or simply unaware, making parental consent a procedural fiction rather than a meaningful safeguard.
C. Ban on Behavioural Tracking and Targeted Advertising
The DPDP Rules impose a stringent prohibition on behavioural tracking, profiling, and targeted advertising directed at children, which is one of their strongest features. It signals a no-tolerance policy against exploitative data-driven practices and business enterprises that target minors. Enforcement, however, remains ambiguous: imperfect age verification and opaque algorithmic systems make it difficult to ensure that children are not indirectly exposed to such targeting. Without robust oversight mechanisms, the prohibition risks becoming aspirational rather than effective.
Persons with Disabilities and Assisted Consent: Between Recognition and Reductionism
The structural inconsistencies affecting persons with disabilities mirror those affecting children: both categories of individuals are often forced to submit to the will and decisions of others.
A. Guardianship and the Problem of Substituted Decision-Making
The DPDP Rules seek to place persons with disabilities on par with other individuals by allowing consent to be provided on their behalf by legal guardians. But does this special provision truly benefit those it is intended for? While it progressively acknowledges that such persons may require support, it fails to provide that support suitably: it simply replaces the person’s own consent with substituted consent. The UN Convention on the Rights of Persons with Disabilities (UNCRPD) emphasises supported decision-making, distinguishing between assistance in making a decision and decisions being made on a person’s behalf. Unfortunately, the DPDP Rules in their current form offer little guidance on how consent can be facilitated without being overridden. This omission is particularly problematic in digital contexts, where consent is recurring rather than one-off. Finally, treating persons with disabilities as entirely and uniformly incapable of consent undermines their capacity and dignity, which is counterproductive to the very values the provisions espouse.
B. Accessibility and Grievance Redressal
The Rules mandate grievance redressal mechanisms but do not sufficiently provide for accessibility. Disabilities, including but not limited to cognitive, sensory, or psychological impairments, may restrict access to remedies. For the redressal mechanism to be effective in practice, a rights-based regime would be more apt, one requiring accessible consent notices, simplified withdrawal mechanisms, and assisted grievance processes. The absence of such measures suggests that vulnerability is acknowledged in theory but inadequately addressed in institutional design.
Conclusion
The DPDP Rules, 2025 are an overdue attempt to protect vulnerable groups in India’s rapidly changing digital landscape. However, the framework risks entrenching paternalism at the cost of participation and dignity. The provisions for both children and persons with disabilities treat vulnerability as a risk to be managed rather than a condition warranting empowerment and rights. Such a risk-centric approach is not the most apt foundation for India’s data protection framework. A truly progressive regime should move beyond age restrictions and substituted-consent models and enforce principles of agency, authority, and supported decision-making, along with accessibility-by-design. These recalibrations are necessary if the DPDP Rules are to move beyond merely insulating platforms from liability as part of a protectionist regime and genuinely empower those most at risk in the evolving digital age. The strength of India’s data protection framework will be measured not by the restrictions placed on vulnerable groups, but by the opportunity afforded to them to exercise their rights – safely, meaningfully, and autonomously – within the digital public space.
About the Author
Srisoniya Subramoniam is a third-year B.A. LL.B. (Hons.) student at Jindal Global Law School (JGLS) with a keen academic interest in legal scholarship, especially Intellectual Property Rights, Artificial Intelligence, International Law, and Criminal Litigation.
Image Source: https://share.google/XfDPJdzyqbsyjqvVa

