From Safe Harbour to Command‑and‑Control: How India’s 2026 IT Rules Draft Turns Intermediary Compliance Into Real‑Time Obedience
Introduction
On March 30, 2026, the Ministry of Electronics and Information Technology (“MeitY”) circulated the draft Second Amendment Rules, 2026 (“Draft Amendment”) to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (“IT Rules, 2021”), proposing a consequential structural realignment of India’s intermediary liability regime.
These amendments represent a fundamental recalibration of how India regulates digital intermediaries, moving away from a rules-based compliance architecture toward a model where executive direction itself becomes the compliance standard. For platforms, publishers, and content creators, the shift carries material operational and legal consequences that demand an immediate strategic response.
The Existing Regime: Rules-based Safe Harbour With Defined Obligations
Section 79 of the IT Act grants intermediaries1 (defined broadly to include platforms, cloud providers, messaging services, and any entity that stores or transmits third-party digital content) conditional immunity from liability for such content. This safe harbour is not absolute: it is preserved only where the intermediary observes “due diligence” and does not conspire, abet, aid, or induce any unlawful act in relation to the content it hosts. Failure to meet the due diligence standard, or failure to act on actual knowledge of unlawful content, causes the immunity to collapse, exposing the intermediary to civil and criminal liability for content it did not itself create.
Rule 3 of the IT Rules, 2021 operationalises the due diligence obligation. It requires intermediaries to publish terms of service prohibiting enumerated categories of harmful content, establish accessible grievance redressal mechanisms, and appoint designated compliance officers. Significant Social Media Intermediaries2 (platforms with over fifty lakh registered users) must undertake additional obligations, including proactive content moderation and technical measures enabling traceability of message originators.
Critically, prior to the Draft Amendment, this due diligence framework was statute-bound. Government advisories, SOPs, and informal directions, issued with increasing frequency on subjects ranging from AI-generated deepfakes to electoral content, operated in a legally ambiguous parallel space. Platforms that complied did so on a risk-management basis, not because any provision explicitly made such compliance a condition of safe harbour. The Draft Amendment closes that gap.
What The Draft Amendment Actually Proposes To Change
1. Executive Direction Becomes Due Diligence Obligation
The proposed Rule 3(4) requires intermediaries to comply with “clarifications, advisories, orders, directions, standard operating procedures, codes of practice, or guidelines” issued by the Central Government in writing relating to implementation of due diligence requirements. Critically, failure to comply jeopardises safe harbour protection. The Draft Amendment requires that every such direction be issued in writing, specify its legal basis, define its scope and applicability, and remain consistent with the IT Act and IT Rules, 2021. These procedural requirements create a paper trail and anchor directions to legal authority, but they do not limit the substantive range of subjects on which directions may be issued, nor do they establish any independent adjudicatory check before a direction acquires binding force.
This is a material departure from comparable global frameworks. The European Union’s Digital Services Act3 (“DSA”) imposes layered obligations on very large online platforms but retains procedural separation between compliance obligations and liability exposure. India’s proposed model is more direct: safe harbour becomes a privilege conditionally granted upon real-time compliance with executive directions, rather than a general protection available to compliant platforms.
Operational implications: This transforms the compliance model from static to dynamic. Previously, intermediaries operated within a published normative framework. MeitY directions, if issued, were advisory or policy-driven; non-compliance carried reputational or political consequence, not safe harbour loss. Now, directions are binding real-time compliance obligations. Platforms must establish continuous monitoring systems for MeitY communications, rapid legal assessment protocols for each direction’s validity, and documented implementation procedures. A direction issued Monday with a Friday implementation deadline becomes a mandatory due diligence obligation by Tuesday.
2. Clarification of Data Retention as Cumulative Baseline
Rule 3(1)(g) and (h) previously required intermediaries to preserve removed content and user information for 180 days. The Draft Amendment clarifies that this period operates “without prejudice to any other requirement under the Act or any other law.” This language is deceptively simple but legally consequential.
The clarification establishes that IT Rules 2021 retention obligations are cumulative, not substitutive. Intermediaries cannot argue that IT Rules compliance satisfies their retention duties across all regimes. They simultaneously remain subject to retention mandates embedded in tax legislation, telecommunications regulations, sectoral rules, and the Digital Personal Data Protection Act, 2023 (“DPDPA”).
Operational implications: Platforms must now map retention obligations across multiple legal regimes, develop layered retention schedules for different data categories, and justify extended retention under conflicting principles. The DPDPA mandates data minimisation and deletion once purpose is served; the IT Rules 2021 mandate investigative retention without temporal bound. A platform must simultaneously delete personal data under DPDPA and retain it under IT Rules 2021 – a legal contradiction that creates litigation exposure and requires carefully documented exception frameworks. Organizations without mature data governance infrastructure face heightened compliance risk and potential enforcement action by both the Data Protection Board and government agencies.
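For illustration only (this is not legal advice, and the regimes, categories, and periods below are assumptions invented for the sketch, not actual statutory figures beyond the 180-day IT Rules baseline discussed above), the "cumulative, not substitutive" logic of a layered retention schedule can be modelled as taking the longest retention floor imposed by any applicable regime for each data category:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Mandate:
    """A single regime's retention floor for one data category."""
    regime: str       # hypothetical label for the source of the obligation
    category: str     # e.g. "removed_content", "transaction_records"
    min_days: int     # minimum retention period this regime imposes

# Hypothetical mapping exercise; a real one spans tax, telecom,
# sectoral, and DPDPA obligations identified by counsel.
MANDATES = [
    Mandate("IT Rules 2021, Rule 3(1)(g)-(h)", "removed_content", 180),
    Mandate("IT Rules 2021, Rule 3(1)(g)-(h)", "user_registration", 180),
    Mandate("Hypothetical tax statute", "transaction_records", 8 * 365),
]

def effective_retention_days(category: str) -> int:
    """Cumulative baseline: compliance with one regime does not
    substitute for another, so the binding period is the maximum
    floor across all regimes that touch this category."""
    floors = [m.min_days for m in MANDATES if m.category == category]
    return max(floors, default=0)
```

The sketch deliberately omits the harder problem the article identifies: DPDPA deletion duties cut the other way, so a real schedule must also record documented exceptions where retention overrides deletion.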
3. Part III of the IT Rules 2021 Expands to User-Generated News Content Across All Intermediaries
Part III of the IT Rules 2021 previously applied only to publishers of news and current affairs content4 and publishers of online curated content5. The Draft Amendment extends Part III’s blocking regime to any intermediary hosting user-generated news or current affairs content. This includes social media platforms, influencer networks, and content aggregators. The Inter-Departmental Committee (“IDC”) gains power to summon intermediaries and issue emergency blocking orders for such content.
Operational implications: This amendment has cascading consequences for content creators. An influencer operating a newsletter discussing political developments, a journalist posting analysis on social media, or a content creator running a podcast on current affairs may now fall within Part III’s regulatory ambit. These creators face potential IDC referrals, takedown orders, and blocking exposure previously limited to formal publishers. Platforms hosting such content assume quasi-publisher responsibility – they must identify and classify user-generated news, manage content moderation, and respond to IDC directions. Smaller platforms and creators lack resources for such compliance, creating barriers to entry and potential chilling effects on speech.
Role of Key Actors: Evolving Responsibilities
Intermediaries: No longer operators within a defined rules-based perimeter, but subjects of real-time executive compliance obligations. They must simultaneously manage platform policies, implement executive directions, navigate conflicting retention regimes, and assume quasi-publisher responsibility for user-generated content.
MeitY: Shifts from rule-maker to real-time director. The ministry becomes the primary compliance standard: its directions, advisories, and guidelines are enforceable due diligence obligations. This concentrates regulatory power but also creates accountability: directions must be legally grounded and consistent with the IT Act, making them vulnerable to constitutional challenge.
Content Creators and Influencers: Transition from largely unregulated space to Part III’s regulatory ambit if they publish news or current affairs. They face blocking exposure and IDC oversight without the institutional resources of traditional publishers.
Inter-Departmental Committee: Expands from complaints-handler to discretionary investigative body. The committee can now examine any “matter” referred by government, not just formal complaints under Part III. This broadens its reach substantially.
Conclusion
The Draft Amendment does more than update the IT Rules: it changes the basic bargain between the State and the intermediary. Section 79 safe harbour, which began life as a rules-based protection tied to published due-diligence standards, is reimagined as a privilege that must be continually earned through compliance with whatever directions the executive chooses to issue in real time.
On the surface, Rule 3(4) looks procedural (written directions, stated legal basis, consistency with the Act). In practice, however, it dissolves the line between advisory practice and binding law. Clarifications, SOPs and guidelines that intermediaries could once treat as policy signals will, if the draft stands, become the very conditions on which their immunity rests. That is a deliberate move away from legislation-led governance and towards administration-led governance, and it is where most of the constitutional argument is likely to converge.
The other changes pull in the same direction. Treating the 180-day retention rule as a floor rather than a ceiling hardens the tension with data-minimisation and storage-limitation norms under the DPDPA. Extending Part III to user-generated news content brings a vast tier of creators and small platforms into a blocking and oversight framework built for professional publishers, with obvious implications for compliance costs and willingness to speak.
For companies, the practical takeaway is simple. If they want to continue to enjoy safe harbour in India, they will need to build the muscle for direction-driven compliance: teams that can read and challenge instructions quickly, processes that can show what was done and why, and product choices that anticipate stricter expectations around retention and content control.
The consultation phase is therefore not merely a box-ticking opportunity. It is likely the last point at which industry, civil society and constitutional experts can push for sharper guardrails on executive discretion. Whether India ends up with a principled digital due diligence framework or a rolling instruction manual for online speech will depend on how vigorously stakeholders contest and shape it now, rather than arguing over it later in court.
****
Authored by Srishti Rathore, Associate, and contributed by Sindhuja Kashyap, Partner, King Stubb & Kasiva
This article is intended as a legal analysis for informational purposes only and does not constitute legal advice. Analysis is based on the draft Second Amendment Rules as circulated in March 2026 and may require revision upon final notification.


