For startups in the ed-tech space and digital services sector catering to children, protecting personal data has never been more critical. Children are a vulnerable demographic in the digital ecosystem, often lacking awareness of privacy risks, which puts a heightened responsibility on businesses. India’s regulatory landscape has evolved to address these concerns, with the introduction of the Digital Personal Data Protection Act, 2023 (“DPDP Act” or “Act”) and the draft Digital Personal Data Protection Rules, 2025 (“Rules”). These regulations not only enforce strict measures for handling children’s data but also urge startups to rethink and redesign their data protection frameworks to align with the principles of safety, accountability, and transparency.

A key focus of the Act is the protection of children, defined as individuals under the age of 18. Where the Data Principal is a child, the definition also extends to the child’s parent or lawful guardian. Under the Act, before processing any personal data of a child, an entity must obtain the verifiable consent of the child’s parent or lawful guardian. The Act explicitly prohibits the processing of children’s personal data if it is likely to adversely affect their well-being.

However, this standard is broad and subject to interpretation, raising critical questions about what constitutes harm to a child’s well-being. Determining such effects may require a case-by-case evaluation, adding ambiguity for businesses. This lack of clarity underscores the need for startups to implement stringent mechanisms and robust safety protocols to mitigate risks and ensure compliance. Further, the Act prohibits entities from tracking or behaviourally monitoring children and from running targeted advertising directed at children. A breach of the obligations in respect of children’s personal data can attract a fine of up to Rs. 200 crore. This substantial penalty poses a serious risk to businesses, potentially wiping out a company’s entire share capital. It’s not a matter to be taken lightly!

In light of these developments, certain key questions need to be answered, the most significant being how parental consent is to be obtained. Many businesses view the consent mechanism as a hurdle to user onboarding. Moreover, when parental consent is required, it adds another layer of compliance that companies, eager to attract users, may be reluctant to adopt.

According to the draft Rules, entities are responsible for verifying that the individual claiming to be a parent is indeed an adult and can furnish “reliable” identity and age details. But what qualifies as “reliable” in this context? The Rules suggest that entities may rely on identity and age information already available to them, on details voluntarily provided by the individual, or on virtual tokens issued by government authorities (such as digital locker services). This means businesses must not only obtain parental consent but also implement mechanisms to verify that the person giving consent is an adult and has provided sufficient proof of identity. From a platform design perspective, a simple “tick-box” will not suffice, as the Rules require verification against reliable identity and age details. This calls for integrating a mechanism into the platform to both collect and verify such details.
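To make the design implication concrete, the sketch below shows one possible shape of such a flow in TypeScript: it accepts the three kinds of evidence the draft Rules contemplate (details already on record, details voluntarily supplied by the parent, or a government-issued virtual token) and produces a dated consent record. All type and function names, including the token check, are hypothetical placeholders rather than any actual API, and the 18-year threshold simply reflects the Act’s definition of a child.

```typescript
// Hypothetical sketch of a parental-consent verification flow under the draft
// DPDP Rules. All names are illustrative; the virtual-token check is a stub.

type IdentityEvidence =
  | { kind: "on-record"; userId: string }                             // details the platform already holds
  | { kind: "voluntary"; idDocumentRef: string; dateOfBirth: string } // details supplied by the parent
  | { kind: "virtual-token"; token: string };                         // e.g. a digital-locker style token

interface ConsentRecord {
  childAccountId: string;
  guardianEvidence: IdentityEvidence;
  verifiedAdult: boolean;
  consentGivenAt: string; // ISO timestamp, retained for audit purposes
}

// Placeholder: a real implementation would call the relevant government token service.
async function verifyVirtualToken(token: string): Promise<boolean> {
  return token.length > 0;
}

// A "child" under the Act is an individual below 18 years of age.
function isAdult(dateOfBirth: string): boolean {
  const cutoff = new Date();
  cutoff.setFullYear(cutoff.getFullYear() - 18);
  return new Date(dateOfBirth).getTime() <= cutoff.getTime();
}

async function recordParentalConsent(
  childAccountId: string,
  evidence: IdentityEvidence,
  dateOfBirthOnRecord: (userId: string) => string | undefined
): Promise<ConsentRecord> {
  let verifiedAdult = false;
  switch (evidence.kind) {
    case "on-record": {
      const dob = dateOfBirthOnRecord(evidence.userId);
      verifiedAdult = dob !== undefined && isAdult(dob);
      break;
    }
    case "voluntary":
      verifiedAdult = isAdult(evidence.dateOfBirth);
      break;
    case "virtual-token":
      verifiedAdult = await verifyVirtualToken(evidence.token);
      break;
  }
  if (!verifiedAdult) {
    throw new Error("Could not establish that the consenting person is an adult");
  }
  return {
    childAccountId,
    guardianEvidence: evidence,
    verifiedAdult,
    consentGivenAt: new Date().toISOString(),
  };
}
```

Whichever evidence route is taken, the platform ends up with an auditable record of who consented, on what basis their adulthood was established, and when.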

Interestingly, neither the Act nor the Rules require platforms to verify the age of their users, relying instead on self-declaration. Given that the law is still in its infancy, it is premature to predict how courts may interpret these provisions. Nonetheless, the requirement for entities to exercise “due diligence” suggests that startups must proactively implement measures to demonstrate their commitment to compliance and the protection of children’s data.
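Building on the same assumptions, a platform that relies on self-declared ages could still keep a record of each declaration and of the checks it triggered; a log of this kind is the sort of artefact a “due diligence” argument would lean on. The field names below are illustrative assumptions, not terms drawn from the Act or the Rules.

```typescript
// Hypothetical due-diligence log entry for a self-declared age. Field names
// are illustrative and not drawn from the Act or the draft Rules.

interface AgeDeclarationLogEntry {
  accountId: string;
  declaredDateOfBirth: string;  // as self-declared at sign-up
  treatedAsChild: boolean;      // whether the parental-consent flow was triggered
  checksPerformed: string[];    // human-readable trail of what the platform did
  recordedAt: string;           // ISO timestamp
}

function logAgeDeclaration(accountId: string, declaredDateOfBirth: string): AgeDeclarationLogEntry {
  const cutoff = new Date();
  cutoff.setFullYear(cutoff.getFullYear() - 18);
  const treatedAsChild = new Date(declaredDateOfBirth).getTime() > cutoff.getTime(); // under 18
  return {
    accountId,
    declaredDateOfBirth,
    treatedAsChild,
    checksPerformed: treatedAsChild
      ? ["self-declaration captured", "parental-consent flow triggered"]
      : ["self-declaration captured"],
    recordedAt: new Date().toISOString(),
  };
}
```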

The Rules propose exemptions for specific entities processing children’s data under certain conditions. These include clinical establishments, mental health institutions, and healthcare professionals processing data solely to provide health services aimed at safeguarding children’s health. Educational institutions, creches, and child day-care centres are permitted to undertake tracking and behavioural monitoring, but only for educational activities and for ensuring child safety. Similarly, transport providers entrusted with children’s transportation may process location data to enhance safety during travel.

Additionally, exemptions are provided for certain purposes in the interest of children. These include exercising powers, performing functions, or discharging duties under any law, delivering state-provided benefits or services, and creating email accounts for communication. Other exempted purposes include ensuring that children are not exposed to harmful information and verifying whether a Data Principal is a child, thereby enabling compliance with due-diligence requirements.

As digital platforms increasingly cater to children and their families, ensuring compliance with data protection laws is no longer optional but essential. Edtech companies and startups offering child-focused services bear the responsibility to create a safe digital environment by prioritizing data security and adopting robust compliance mechanisms. Beyond legal obligations, fostering trust among users is vital for business success in this sector. By embedding privacy-by-design principles, implementing stringent age verification processes, and maintaining transparency in data handling, startups can not only align with the law but also set a benchmark for ethical practices in the industry. The security of children’s data must remain a central focus for businesses, reflecting a commitment to safeguarding their rights and well-being in the digital age.
