DPDP Rules 2025 for engineers: what your product needs to change before May 2027
Six obligations, an 18-month window, and a priority order for getting compliant
The DPDP Rules 2025, notified in November of that year, did something the Act itself could not: they turned abstract obligations into engineering requirements with a deadline. Most B2B SaaS products handle personal data belonging to Indian users, which means six distinct engineering obligations and an 18-month window to meet them.
This is for engineers, not lawyers. It assumes you have read the compliance summaries and want to know which tickets to open.
Why the DPDP Rules 2025 changed the compliance calculus
The Digital Personal Data Protection Act was passed in August 2023. For most engineering teams, the response was: wait and see. The Act without Rules was largely aspirational — it stated rights and obligations but left the how undefined.
The Rules, published in November 2025, closed that gap. They specify how consent must be obtained, what erasure actually requires, and which security controls are mandatory. With them came a phased enforcement timeline:
| Phase | When | What activates |
|---|---|---|
| Immediate | November 2025 | Data Protection Board is operational. Complaints can be filed and adjudicated now. |
| 12 months | November 2026 | Consent manager registration. Significant data fiduciary obligations begin. |
| 18 months | May 2027 | Full functional compliance expected for all Data Fiduciaries. |
If your team has not started engineering for DPDP yet, you are not too late, but the comfortable window is narrowing. Each of the six obligations below requires some lead time for legal review, testing, and deployment.
Consent as a first-class data model
Valid consent under the DPDP Rules must be specific (tied to a named purpose), informed (stated in plain language), freely given (not bundled — you cannot make marketing consent a condition of using the product), and revocable at any time.
That last condition is the one most products get wrong. A checkbox at signup that writes a boolean to the users table does not satisfy revocability. You need to model consent as records, not as a flag.
A minimal schema:
```sql
CREATE TABLE consent_records (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    user_id UUID NOT NULL REFERENCES users(id),
    purpose TEXT NOT NULL,          -- 'marketing', 'analytics', 'product_usage'
    notice_version TEXT NOT NULL,   -- 'v1.0', 'v2.0'
    consented BOOLEAN NOT NULL,
    recorded_at TIMESTAMPTZ NOT NULL DEFAULT now(),
    revoked_at TIMESTAMPTZ
);

-- Active consent lookup
CREATE INDEX ON consent_records (user_id, purpose)
    WHERE revoked_at IS NULL;
```

The invariant to enforce: every query that touches personal data must be justifiable against an active consent record for that purpose. If you cannot answer "which consent covers this?" for any given data access, you have a gap.
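One way to enforce that invariant is a guard function that every personal-data read must pass through. A minimal sketch mirroring the `consent_records` schema above, using Python's `sqlite3` (types simplified from Postgres; the function names are illustrative, not a prescribed API):

```python
import sqlite3
import uuid
from datetime import datetime, timezone

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE consent_records (
        id TEXT PRIMARY KEY,
        user_id TEXT NOT NULL,
        purpose TEXT NOT NULL,
        notice_version TEXT NOT NULL,
        consented INTEGER NOT NULL,
        recorded_at TEXT NOT NULL,
        revoked_at TEXT
    )""")

def record_consent(user_id: str, purpose: str, notice_version: str) -> None:
    # Append a new record rather than flipping a flag: history is preserved.
    conn.execute(
        "INSERT INTO consent_records VALUES (?, ?, ?, ?, 1, ?, NULL)",
        (str(uuid.uuid4()), user_id, purpose, notice_version,
         datetime.now(timezone.utc).isoformat()),
    )

def revoke_consent(user_id: str, purpose: str) -> None:
    # Revocation stamps existing records; it never deletes the audit trail.
    conn.execute(
        "UPDATE consent_records SET revoked_at = ? "
        "WHERE user_id = ? AND purpose = ? AND revoked_at IS NULL",
        (datetime.now(timezone.utc).isoformat(), user_id, purpose),
    )

def has_active_consent(user_id: str, purpose: str) -> bool:
    # The guard every personal-data query should pass through.
    row = conn.execute(
        "SELECT 1 FROM consent_records "
        "WHERE user_id = ? AND purpose = ? AND consented = 1 "
        "AND revoked_at IS NULL LIMIT 1",
        (user_id, purpose),
    ).fetchone()
    return row is not None

record_consent("user-1", "marketing", "v1.0")
print(has_active_consent("user-1", "marketing"))  # True
revoke_consent("user-1", "marketing")
print(has_active_consent("user-1", "marketing"))  # False
```

The append-then-stamp pattern is what makes revocability auditable: you can always reconstruct what consent existed at any point in time.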
The Rules also require consent notices to be available in the languages of the Eighth Schedule to the Constitution if a user requests them. For most SaaS teams this is a localisation task, not an engineering blocker, but it belongs on the roadmap.
Data lifecycle: when the purpose is exhausted, the data goes
The Rules apply data minimisation strictly. Once the purpose for which data was collected is exhausted, personal data must be erased. That covers account closure, contract expiry, and any explicitly stated objective being met. The default retention cap, when you have not defined your own retention period, is three years.
Erasure here means the data is no longer recoverable. Soft-deletes, which flag a record with a deleted_at timestamp but leave the underlying rows in place, do not qualify.
Practically: map every table that holds personal data against the purpose that justifies it. Build a scheduled job that identifies records where the retention window has closed and erases them. Treat this as a data lifecycle event, not just account deletion.
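That scheduled job can start as a daily sweep that hard-deletes rows whose retention window has closed. A sketch under assumptions: the `user_profiles` table, the `purpose_exhausted_at` column, and the three-year fallback are placeholders to adapt to your own data map.

```python
import sqlite3
from datetime import datetime, timedelta, timezone

# Fallback retention cap when no purpose-specific period is defined.
RETENTION = timedelta(days=3 * 365)

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE user_profiles (
        user_id TEXT PRIMARY KEY,
        email TEXT NOT NULL,
        purpose_exhausted_at TEXT  -- set at account closure / contract expiry
    )""")

def erase_expired(now: datetime) -> int:
    """Hard-delete rows whose purpose was exhausted past the retention cap."""
    cutoff = (now - RETENTION).isoformat()
    cur = conn.execute(
        "DELETE FROM user_profiles "
        "WHERE purpose_exhausted_at IS NOT NULL "
        "AND purpose_exhausted_at < ?",
        (cutoff,),
    )
    return cur.rowcount  # log this count: erasure events should be auditable

now = datetime.now(timezone.utc)
conn.execute("INSERT INTO user_profiles VALUES ('u1', 'a@example.com', ?)",
             ((now - timedelta(days=4 * 365)).isoformat(),))
conn.execute("INSERT INTO user_profiles VALUES ('u2', 'b@example.com', NULL)")
print(erase_expired(now))  # 1 -- only the long-expired row is erased
```

Note the `DELETE`, not an `UPDATE` to a `deleted_at` flag: that is the difference between erasure and a soft-delete.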
The six mandatory security safeguards
Rule 8 of the DPDP Rules does not leave security controls to interpretation. Six safeguards are mandatory for every Data Fiduciary:
| Safeguard | What it requires | Where most teams have a gap |
|---|---|---|
| Encryption | Personal data encrypted at rest and in transit | Backup files stored without encryption — often the only real gap in an otherwise secure stack |
| Access control | Role-based access with least privilege applied | Engineers hold broad production DB access that should be scoped or read-only |
| Access logging | Logs retained, searchable, and reviewed | Logs exist but are never reviewed or retained past a few days |
| Data backup | Regular, tested backups | Backups run but restore has never been tested in production |
| Breach detection | Tooling to identify unauthorised access | No alerting on large data exports or anomalous query patterns on personal-data tables |
| Breach notification | Notify the Data Protection Board and affected users within 72 hours of identifying a breach | No runbook, no draft notification templates, no clarity on what constitutes a notifiable breach |
Breach notification is the safeguard most teams under-engineer. The 72-hour clock starts from when you identify the breach, not when it happened. You need a documented incident response process: who is the first call, what does the notification to the Data Protection Board say, what do user notifications say, and what distinguishes a notifiable breach from a minor operational incident.
Draft those templates now, while there is no incident. Getting sign-off from legal on breach notification language during an active incident is not a process; it is a crisis.
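Even the timing logic is worth codifying so nobody computes the deadline by hand mid-incident. A minimal sketch; the incident fields are illustrative, and the 72-hour window is the figure from the Rules as described above:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from typing import Optional

NOTIFICATION_WINDOW = timedelta(hours=72)  # clock starts at identification

@dataclass
class BreachIncident:
    description: str
    identified_at: datetime  # when *you* identified it, not when it occurred
    board_notified_at: Optional[datetime] = None  # filled when the Board is told

    @property
    def notification_deadline(self) -> datetime:
        return self.identified_at + NOTIFICATION_WINDOW

    def hours_remaining(self, now: datetime) -> float:
        return (self.notification_deadline - now).total_seconds() / 3600

incident = BreachIncident(
    description="Unauthorised export from consent_records",
    identified_at=datetime(2026, 3, 1, 9, 0, tzinfo=timezone.utc),
)
now = datetime(2026, 3, 2, 9, 0, tzinfo=timezone.utc)
print(round(incident.hours_remaining(now)))  # 48
```

Anchoring the clock to `identified_at` in the data model itself removes the most common ambiguity in incident timelines.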
Building the rights API
The DPDP Act grants data principals four rights: access (a summary of what personal data you hold and for which purposes), correction and erasure (the right to have data corrected or deleted), grievance redressal (a channel to raise complaints), and nomination (the ability to designate someone to exercise these rights on their behalf in case of death or incapacity).
Access and erasure both need an engineering surface. For most products, that means:
- A 'My Data' page that summarises held personal data by category. You do not need to expose raw database rows; a category-level summary is sufficient.
- An erasure request flow that goes beyond account deletion: actual personal data erasure with a defined SLA and a confirmation sent back to the user when complete.
- An internal queue for processing requests, even if you handle them manually at first. The queue gives you auditability.
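That internal queue needs very little to be useful: a request type, a status, and timestamps for auditability. A sketch with assumed status values and field names, again in `sqlite3` for brevity:

```python
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE rights_requests (
        id INTEGER PRIMARY KEY,
        user_id TEXT NOT NULL,
        kind TEXT NOT NULL CHECK (kind IN ('access', 'erasure', 'nomination')),
        status TEXT NOT NULL DEFAULT 'open'
            CHECK (status IN ('open', 'in_progress', 'done')),
        opened_at TEXT NOT NULL,
        closed_at TEXT  -- set when confirmation is sent back to the user
    )""")

def open_request(user_id: str, kind: str) -> int:
    cur = conn.execute(
        "INSERT INTO rights_requests (user_id, kind, opened_at) "
        "VALUES (?, ?, ?)",
        (user_id, kind, datetime.now(timezone.utc).isoformat()),
    )
    return cur.lastrowid

def close_request(request_id: int) -> None:
    conn.execute(
        "UPDATE rights_requests SET status = 'done', closed_at = ? "
        "WHERE id = ?",
        (datetime.now(timezone.utc).isoformat(), request_id),
    )

req = open_request("user-1", "erasure")
close_request(req)
status = conn.execute("SELECT status FROM rights_requests WHERE id = ?",
                      (req,)).fetchone()[0]
print(status)  # done
```

Even if a human processes every request at first, the `opened_at`/`closed_at` pair gives you the fulfilment timeline an auditor will ask for.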
Nomination is the least-built right in most products. You need a mechanism for users to record a nominee and a process for verifying and actioning their requests. For B2B SaaS whose users are employees, this is a lower-priority edge case, but the right exists and needs to be exercisable.
The Rules do not specify a hard SLA for erasure fulfilment. Build the pipeline now, automate it later. The principle matters more than the SLA in the first phase.
Parental consent: harder than it looks
If your product can be used by people under 18, or if you cannot verify that your users are adults, you have a parental consent obligation. The Rules require verifiable consent from a parent or guardian before processing data of a minor, and prohibit behavioural profiling and tracking for users under 18.
The Rules mention DigiLocker explicitly as one mechanism for verifying parent identity. DigiLocker is the document wallet in India's DPI stack; its API allows identity assertion without storing the underlying documents. If parental consent applies to your product, a DigiLocker integration belongs on the roadmap.
For most B2B SaaS whose users are employed adults, this obligation is not the highest priority. The practical path for products without meaningful under-18 exposure: add an age attestation step at registration — a confirmed statement that the user is 18 or older. This does not fully satisfy verifiable parental consent for a user who misrepresents their age, but it establishes a reasonable basis for asserting adult status. Confirm the appropriate threshold with your legal team given your specific user population.
What to build in the first 90 days
The 18-month window is sufficient, but only if engineering starts now. Here is a priority order.
Next 30 days: verify that your at-rest encryption is actually in place, not merely assumed. Confirm backup encryption. Confirm TLS 1.2 or higher is enforced on all endpoints that receive personal data. These are checks, not builds, but gaps here are the easiest DPDP failures to avoid.
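On the server side, enforcing the TLS floor is a one-line setting on the SSL context. A sketch with Python's `ssl` module (the certificate paths in the comment are placeholders):

```python
import ssl

ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # reject TLS 1.0/1.1 handshakes
# ctx.load_cert_chain("server.crt", "server.key")  # placeholder paths
print(ctx.minimum_version >= ssl.TLSVersion.TLSv1_2)  # True
```

If TLS terminates at a load balancer or reverse proxy rather than in application code, the equivalent check belongs in that layer's configuration instead.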
Next 60 days: build or upgrade the consent model to track purpose-specific records. Add a 'My Data' summary page, even if bare-bones. Write the breach notification runbook: who gets called, what the notifications say, where they go.
Next 6 months: build the erasure pipeline, including documented handling for backups. Implement access logging at the personal-data layer. Address parental consent if your product has any consumer-facing surface.
Before November 2026: assess whether you fall under significant data fiduciary criteria. High-volume processing, sensitive data categories, and national security relevance are the main factors. If you qualify, additional obligations apply: a Data Protection Officer and regular audits. Those tracks run parallel to the main compliance schedule.
The honest engineering assessment: these controls are not exotic. They reflect good-practice security hygiene that your product probably should have had already. The DPDP Rules make them mandatory for Indian personal data and give them enforcement teeth. If your security posture is already reasonable, compliance is mostly documentation, verification, and a few targeted engineering additions.
Start with the data map. The rest follows from it.
Related reading
Bootstrapped or venture-backed: the Indian SaaS calculus in 2026
India hosts the second-largest SaaS ecosystem outside the US. The raise-or-bootstrap question has a different answer in 2026 than it did in 2021. Here's the data behind the shift.
Open-source licensing for engineers: a corporate codebase guide
Legal is not reviewing every npm install — you are. Here is the practical check to run before adding a dependency, and the licence type that catches most SaaS teams off guard.
DPDP Act for engineers: what you actually have to change in your code
Most DPDP coverage is written for legal teams. This piece maps the Act's obligations to concrete engineering work: consent tables, data rights endpoints, deletion flows, and breach notification infrastructure.