Student Data Privacy in AI: What Indian Schools Must Demand from AI Vendors in 2025
India's Digital Personal Data Protection Act 2023 was enacted over a year ago. Yet most AI vendors selling to Indian schools have not updated their contracts, data practices, or consent mechanisms to comply with it. This is not a minor compliance gap: it is a significant institutional liability for every school that deploys a non-compliant AI tool. This guide sets out exactly what the law requires and exactly what to demand from every AI vendor.
What the DPDP Act Requires for Children's Data
India's Digital Personal Data Protection Act 2023 is not a general privacy law with a children's footnote. Section 9 of the Act creates specific, heightened requirements for the processing of children's personal data that go materially beyond what is required for adult data.
The core requirement is verifiable parental consent. Before any personal data of a child under 18 is processed, the data fiduciary must obtain consent from the child's parent or lawful guardian. In most school AI deployments the school is a data fiduciary, and the AI vendor acts either as its data processor or as a data fiduciary in its own right; either way, consent must be in place before processing begins. The word "verifiable" matters. A child clicking a consent box is not verifiable parental consent. The parent must consent independently, and the consent mechanism must be designed to verify that the consenting person is actually the parent or guardian.
The Act also prohibits tracking and behavioural advertising targeting children, prohibits processing that could cause detrimental effects on children's well-being, and gives parents the right to access and request deletion of their child's data at any time. Deletion requests must be honoured — not deferred to a quarterly data purge cycle.
Schools that deploy AI tools without ensuring these requirements are met are not simply using a non-compliant vendor. As data fiduciaries, they are themselves responsible for compliance. A school cannot outsource its DPDP obligations to the vendor; it must verify the vendor's compliance before deployment.
Eight Questions Every School Must Ask an AI Vendor
These questions should be asked in writing, with written responses documented before any deployment agreement is signed. A vendor that cannot answer them clearly is not ready for deployment in India's regulated environment.
- DPDP compliance and DPA availability. Are you compliant with India's Digital Personal Data Protection Act 2023, specifically Section 9 on children's data? Can you provide a signed Data Processing Agreement before any student accounts are created?
- Data storage location. Where exactly is student data stored? Is any of it processed outside of India, and if so, under what cross-border data transfer mechanism?
- Model training. Is student interaction data used to train or improve your AI models? Can schools and parents opt out? Is there a contractual commitment not to use children's data for commercial model training?
- Data deletion on contract termination. What happens to all student data when the school's contract ends? What is the documented deletion timeline and how is deletion confirmed?
- Parental consent mechanism. How exactly is parental consent obtained and verified? Can you show us the consent flow? How do you confirm the consenting person is the parent and not the child?
- Data access for parents. How can a parent access the data held about their child? How can a parent request deletion mid-contract? What is the response timeline?
- Third-party sub-processors. Which third-party services does your platform use that process student data? Are they named in your DPA? Are they bound by the same DPDP obligations?
- Breach notification. In the event of a data breach affecting student data, what is your notification timeline to the school and to affected parents?
What a Proper Data Processing Agreement Should Include
A DPA is not a terms of service document. It is a specific contract that governs the processing of personal data on behalf of another party. A proper DPA for a school AI deployment should include, at minimum:
- the categories of personal data processed and the purposes for which they are processed;
- a commitment to DPDP Act Section 9 compliance;
- data storage location and cross-border transfer restrictions;
- the list of approved sub-processors;
- the security measures in place to protect student data;
- the process for honouring parental data access and deletion requests;
- the process for notifying the school of a data breach;
- the data retention and deletion schedule on contract termination.
A DPA that does not cover all of these points is incomplete. A vendor that presents a general privacy policy as a substitute for a DPA is either not ready for institutional deployment or is hoping the school will not notice the difference.
The BYJU's Lesson on Data Risk
The collapse of BYJU's is the most important data governance lesson in Indian EdTech history, and it is still not being learned widely enough. When India's largest EdTech company entered insolvency proceedings, millions of Indian students and parents found that their data was effectively trapped in legal limbo — owned by a company in financial distress, with no clear process for data access, deletion, or transfer.
Schools that had integrated BYJU's into their processes without proper DPAs had no contractual mechanism to demand data return or deletion. Parents who wanted their children's data deleted had no recourse. The data simply sat in systems that were no longer being actively managed, with no accountability for its security or disposal.
This is not a hypothetical risk. It happened. And it will happen again with some current AI EdTech companies, because the sector has a high failure rate and most schools have not demanded the contractual protections that would shield them in exactly this scenario. The DPA clause covering data deletion on contract termination is not an administrative formality: it is the clause that protects your students if the AI company you chose closes next year.
How Kyloen Approaches DPDP Compliance
Kyloen was built with India's DPDP Act as a design requirement, not a compliance afterthought. All student data is stored within India. The parental consent flow is designed to meet DPDP Section 9 requirements: parents consent independently, through a verified process, before any child profile is activated. Student interaction data is not used for commercial AI model training.
Schools that deploy Kyloen receive a signed Data Processing Agreement before any accounts are created. The DPA covers all the elements listed above. Parents can access and delete their child's data through the parent dashboard. On contract termination, student data is deleted within a documented timeline, with a deletion confirmation provided to the school.
This is not a competitive claim — it is a baseline. Every AI platform used by Indian children should meet these standards. Schools that accept less are accepting unnecessary institutional risk.
Frequently Asked Questions
What does India's DPDP Act 2023 require specifically for children's data?
The DPDP Act requires verifiable parental consent before processing the personal data of any child, defined as a person under 18. It prohibits tracking and behavioural advertising targeting children, prohibits processing that could cause detrimental effects on a child's well-being, and gives parents the right to access and delete their child's data. Section 9 of the Act contains these child-specific provisions.
What is a Data Processing Agreement and why does every school need one?
A DPA is a legally binding contract between the school and the AI vendor documenting exactly how student data is processed, stored, secured, and deleted. Without a signed DPA, the school has no legal protection if data is mishandled. Under DPDP, the DPA is the primary mechanism through which the school discharges its data protection obligations when using a third-party vendor.
What happened to student data when BYJU's collapsed, and what does it mean for schools?
When BYJU's collapsed, millions of Indian students' data ended up in legal limbo — owned by a company in insolvency, with no clear deletion or access process. Schools without proper DPAs had no contractual recourse. The lesson: the DPA clause covering data deletion on contract termination is the clause that protects students if the AI company closes.
Do AI companies use student interaction data to train their models?
Some do, some do not. This must be asked in writing before deployment. Under DPDP, using children's personal data to train commercial AI models without parental consent is likely a violation. Schools must ask explicitly and get a written commitment — not just a reference to general terms of service.
How does Kyloen approach DPDP compliance for student data?
All Kyloen student data is stored within India. Parental consent is DPDP Section 9-compliant with independent parent verification. Student data is not used for commercial model training. Schools receive a signed DPA before deployment. Parents can access and delete data through the dashboard. Data is deleted within a documented timeline on contract termination.