India DPDP Act + RBI AI/ML Guidelines
Coverage of India's Digital Personal Data Protection Act 2023, RBI's framework for responsible AI/ML in financial services, and MeitY's advisory on AI governance. Essential for auditing AI systems in the Indian regulatory context.
4.1 — Digital Personal Data Protection Act 2023 (DPDP Act)
The DPDP Act 2023 received Presidential assent in August 2023. It establishes a comprehensive framework for processing digital personal data in India, based on the principles of consent, purpose limitation, data minimization, accuracy, storage limitation, and accountability.
The Act defines three key roles: Data Fiduciary (determines purpose and means of processing — equivalent to the GDPR 'controller'), Data Processor (processes on behalf of the fiduciary), and Data Principal (the individual whose data is processed — equivalent to the GDPR 'data subject').
Consent requirements: Processing requires free, specific, informed, unconditional, and unambiguous consent with clear affirmative action. Consent must be as easy to withdraw as to give. 'Legitimate uses' allow processing without consent in specific cases (government services, medical emergencies, employment).
Rights of Data Principals:
- Obtain information about what personal data is being processed and how.
- Request correction of inaccurate data or erasure of data no longer needed.
- File complaints with the Data Fiduciary, and escalate to the Data Protection Board of India (DPBI).
- Nominate another person to exercise these rights on the Data Principal's behalf (e.g., in case of death or incapacity).
Significant Data Fiduciaries (SDFs) are designated by the Central Government based on the volume and sensitivity of data processed. SDFs must: (1) appoint a Data Protection Officer (DPO) based in India, (2) conduct periodic Data Protection Impact Assessments (DPIAs), and (3) undergo independent audits. Know the three SDF obligations for the exam.
| Violation | Maximum Penalty |
|---|---|
| Non-compliance with general obligations | Up to ₹50 crore (~$6M) |
| Failure to protect against data breach | Up to ₹250 crore (~$30M) |
| Violation of children's data provisions | Up to ₹200 crore (~$24M) |
| Non-compliance by Data Processor | Up to ₹50 crore (~$6M) |
| Violation of additional SDF obligations | Up to ₹150 crore (~$18M) |
| Data Principal breach of duties | Up to ₹10,000 |
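The approximate USD figures in the table follow from simple unit conversion (1 crore = 10 million). A quick sanity check, assuming an illustrative exchange rate of about ₹83 per USD (rates vary over time):

```python
# Rough sanity check of the USD figures in the penalty table.
CRORE = 10_000_000        # 1 crore = 10 million rupees
INR_PER_USD = 83.0        # assumed illustrative rate; not authoritative

def crore_to_usd_millions(crore: float) -> float:
    """Convert a rupee amount in crore to approximate USD millions."""
    return crore * CRORE / INR_PER_USD / 1_000_000

print(round(crore_to_usd_millions(250), 1))  # breach penalty ≈ 30.1
print(round(crore_to_usd_millions(50), 1))   # general obligations ≈ 6.0
```

At this assumed rate, ₹250 crore is roughly $30M and ₹50 crore roughly $6M, matching the table's approximations.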
4.2 — DPDP Act and AI Systems
AI systems that process personal data fall squarely under the DPDP Act. This includes training data, inference inputs, and outputs that contain or derive personal information. Consent requirements apply to data collection for AI training.
The DPDP Act uses a 'blacklist' approach for cross-border transfers (allowed except to restricted countries), while GDPR uses a 'whitelist' approach (restricted except to adequate countries). This is a frequently tested distinction.
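The blacklist/whitelist distinction can be made concrete with two contrasting checks. The country sets below are purely illustrative placeholders, not the actual notified or adequacy lists:

```python
# Illustrative contrast of the two cross-border transfer models.
DPDP_RESTRICTED = {"CountryX"}            # blacklist: government-notified restricted countries (hypothetical)
GDPR_ADEQUATE = {"Japan", "Switzerland"}  # whitelist: adequacy decisions (illustrative subset)

def dpdp_transfer_allowed(destination: str) -> bool:
    # DPDP Act: transfer allowed unless the destination is specifically restricted.
    return destination not in DPDP_RESTRICTED

def gdpr_transfer_allowed(destination: str, has_safeguards: bool = False) -> bool:
    # GDPR: transfer restricted unless there is an adequacy decision
    # or appropriate safeguards (e.g., standard contractual clauses).
    return destination in GDPR_ADEQUATE or has_safeguards

print(dpdp_transfer_allowed("Brazil"))   # True — not on the restricted list
print(gdpr_transfer_allowed("Brazil"))   # False — no adequacy, no safeguards assumed
```

The default outcome is the exam point: DPDP permits by default and restricts by exception; GDPR restricts by default and permits by exception.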
Children's data: Processing of children's data (under 18) requires verifiable parental consent. Targeted advertising and tracking of children are prohibited. AI systems used in educational contexts must comply with these requirements.
India's children's age threshold is 18 — higher than GDPR's 16 (or 13 in some member states). Any AI system processing data of persons under 18 in India triggers enhanced consent requirements.
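The differing age thresholds can be sketched as a simple jurisdiction-aware gate (a hypothetical helper; GDPR's threshold is shown as the 16 default, though member states may set it as low as 13):

```python
def consent_requirement(age_years: int, jurisdiction: str = "IN") -> str:
    """Illustrative age gate: India's threshold is 18, GDPR's default is 16."""
    threshold = {"IN": 18, "EU": 16}[jurisdiction]
    return "verifiable parental consent" if age_years < threshold else "own consent"

print(consent_requirement(17, "IN"))  # verifiable parental consent
print(consent_requirement(17, "EU"))  # own consent
```

A 17-year-old illustrates the gap: parental consent is required in India but not under the GDPR default.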
4.3 — RBI Guidelines on AI/ML in Financial Services
The Reserve Bank of India has issued guidance on responsible use of AI/ML in financial services, covering credit scoring, fraud detection, customer service chatbots, and algorithmic trading. Banks and NBFCs must ensure AI systems are fair, transparent, and accountable.
Key RBI expectations for regulated entities:
- Governance: establish comprehensive governance covering all AI/ML models used in banking operations.
- Independent validation: high-impact AI models must be validated by an independent team not involved in development.
- Explainability: AI-driven credit decisions must provide specific, actionable rejection reasons — not opaque 'AI-decided' responses.
- Data localization: payment system data must be stored exclusively in India; AI systems processing payment data must ensure residency compliance.
- Outsourcing accountability: banks using third-party AI remain fully responsible; due diligence, contractual safeguards, and monitoring are mandatory.
Under RBI's data localization mandate, ALL payment system data must be stored exclusively in India. This applies to AI systems processing payment data, including those using cloud-hosted ML models. Non-compliance can result in loss of payment system authorization.
An NBFC uses a third-party ML model for loan underwriting. Under RBI guidelines, the NBFC must: (1) validate the model independently, (2) ensure rejection reasons are explainable to applicants, (3) verify the vendor stores data in India, and (4) maintain full documentation of the model's logic and limitations.
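The four obligations in the NBFC scenario can be framed as an audit checklist. This is a hypothetical sketch (the checklist item names are assumptions, not RBI terminology): the auditor records which items have satisfactory evidence and lists the gaps.

```python
# Hypothetical audit checklist for the third-party-model scenario above.
RBI_THIRD_PARTY_MODEL_CHECKS = (
    "independent_model_validation",
    "explainable_rejection_reasons",
    "vendor_data_stored_in_india",
    "model_documentation_maintained",
)

def audit_gaps(evidence: dict[str, bool]) -> list[str]:
    """Return checklist items lacking satisfactory evidence (missing items count as gaps)."""
    return [c for c in RBI_THIRD_PARTY_MODEL_CHECKS if not evidence.get(c, False)]

gaps = audit_gaps({
    "independent_model_validation": True,
    "explainable_rejection_reasons": True,
    "vendor_data_stored_in_india": False,   # e.g., payment data mirrored abroad
})
print(gaps)  # ['vendor_data_stored_in_india', 'model_documentation_maintained']
```

Treating missing evidence as a failure (the `.get(c, False)` default) reflects standard audit practice: the entity bears the burden of demonstrating compliance.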
4.4 — MeitY and Emerging Indian AI Governance
India's approach to AI regulation is evolving rapidly. Unlike the EU's comprehensive legislation approach, India currently favors sector-specific regulation combined with voluntary frameworks.
| Regulator | Sector | Key AI Focus |
|---|---|---|
| RBI | Banking & Finance | Model risk management, explainability, data localization |
| SEBI | Capital Markets | Algorithmic trading, AI-driven investment advice |
| IRDAI | Insurance | AI in underwriting, claims processing |
| MeitY | Cross-sector | General AI governance, platform approvals |
| NITI Aayog | Policy | Responsible AI principles (non-binding) |
India's regulatory approach is sector-specific (RBI for banking, SEBI for capital markets) rather than comprehensive like the EU AI Act. Expect questions comparing these two regulatory approaches.
Self-check (answers drawn from the sections above):

Q: What are the three key roles defined by the DPDP Act, and their GDPR equivalents?
A: Data Fiduciary (determines purpose/means — GDPR 'controller'), Data Processor (processes on behalf of the fiduciary — same term in GDPR), Data Principal (individual whose data is processed — GDPR 'data subject').

Q: What are the three additional obligations of a Significant Data Fiduciary?
A: Appoint a Data Protection Officer based in India, conduct periodic Data Protection Impact Assessments, and undergo independent audits.

Q: What do RBI guidelines require of AI-driven lending decisions?
A: AI-driven lending decisions must be explainable to customers with specific, actionable rejection reasons. Opaque 'AI-decided' responses are not acceptable. Independent model validation is required for high-impact models.

Q: How does the DPDP Act's cross-border transfer approach differ from GDPR's?
A: The DPDP Act uses a 'blacklist' approach — transfers are allowed to all countries except those specifically restricted by the government. GDPR uses a 'whitelist' approach — transfers are restricted unless the destination country has an adequacy decision or appropriate safeguards are in place.

Q: What is the maximum penalty under the DPDP Act, and for what violation?
A: Up to ₹250 crore (~$30M) per instance, for failure to take reasonable security safeguards to prevent a data breach.

Q: How does India's AI regulatory approach differ from the EU's?
A: India favors sector-specific regulation (RBI for banking, SEBI for capital markets, etc.) combined with voluntary frameworks, while the EU adopted a comprehensive, cross-sector legislative approach through the EU AI Act. India does not currently have a single comprehensive AI law.

Q: What special requirements apply to children's data under the DPDP Act?
A: Processing children's data (under 18 — a higher threshold than GDPR's 16) requires verifiable parental consent. Targeted advertising and behavioral tracking of children are prohibited. AI systems in educational contexts must comply.