Commentary
The past month has witnessed a remarkable development in India's digital governance landscape.
X Corp. (formerly Twitter) has challenged the Union Government's use of Section 79(3)(b) of the Information Technology Act, 2000 ('the IT Act') to block content it deems harmful. It has also sought protection from joining the Ministry of Home Affairs' Sahyog portal, terming it a 'censorship portal'.
Although X Corp. has previously resorted to legal manoeuvres to stall compliance with Indian law, this marks the first occasion since Elon Musk’s takeover that the company has taken an openly confrontational stance against the government.
While a change in X's approach was somewhat predictable, given the shift in the geopolitical climate after the US presidential election, what has been striking is the assertiveness with which X has rebuked the Indian central government. It has even gone so far as to insinuate that the government has been actively engaging in censorship.
Before delving into the merits of the charges made by X, it is important to understand what triggered the latest round of confrontation between the Centre and X.
X's decision to approach the courts stems from two distinct factors.
Firstly, since the election of Donald Trump to the Oval Office, Silicon Valley-based tech companies have intensified a coordinated pushback against any form of content moderation norms adopted across the globe. These companies have started to conflate harmful, hateful, and potentially illegal speech with political free speech to justify their noncompliance with the norms.
Musk-owned X, perhaps more than any other social media giant, wants to cultivate its image as a global free speech crusader by importing America’s absolutist idea of free speech to other countries. This policy approach of X has also led to a bitter standoff with the European regulators, who have one of the world’s strictest content moderation norms.
Secondly, X's growing inability to filter out harmful and illegal content after the Musk takeover has made it more vulnerable to legal scrutiny. A recent study by researchers at the University of California, Berkeley, revealed that eight months into the Musk takeover, X had witnessed a 50% increase in overt hate speech.
After Musk officially took over X in November 2022, he systematically dismantled the company’s content moderation teams that worked to shield users from hateful and abusive content.
The removal of guardrails in the name of protecting free speech has also affected other features and services offered by X, including Grok, the AI chatbot developed by Elon Musk's artificial intelligence company, xAI. Grok's latest iteration, Grok-3, has recently come under scrutiny from the Ministry of Electronics and Information Technology over its profane, vulgar, and abusive responses to user-generated prompts.
Legal Contentions Raised By X
X’s primary contention remains that the Central Government, to bypass the statutory safeguard under Section 69A, is issuing a ‘blocking order’ under Section 79(3)(b) of the Act. Such a move, according to X, creates a parallel system of censorship without any oversight.
While the legalities of the argument merit consideration, the preliminary question remains whether X, a foreign-headquartered company, can approach the High Court claiming infringement of the rights of users on its platform.
This question is of vital importance, as X enjoys the protections offered to intermediaries under Section 79 of the IT Act. The IT Act defines an 'intermediary' as an entity that, with respect to an electronic record and on behalf of another person, 'receives, stores or transmits that record or provides any service with respect to that record'.
An intermediary that is merely a host cannot contest the legality of content posted on its platform, and it certainly cannot approach the courts on behalf of those who post illegal content there. Indeed, X itself, in a case before the Delhi High Court, admitted that 'an intermediary cannot decide whether content on its platform is lawful or otherwise unless it is put to such'.
It is also a settled principle that when an intermediary initiates a transmission, selects the recipient of the transmission, or modifies the information contained in the transmission, it assumes a role akin to that of an editor. While an editor does enjoy the right to approach the courts if his speech is censored, he cannot claim blanket protection from the civil or criminal liabilities arising from the published content.
This fundamental difference between the roles of an intermediary and an editor is what makes X's stance before the Karnataka High Court paradoxical. On one hand, it takes up the shield of the safe harbour provision to protect itself from criminal charges arising out of the circulation of scores of illegal and harmful posts on its platform; on the other, it assumes the role of an editor to seek free-speech protection for user-generated content.
On the question of the legality of the use of Section 79(3)(b) of the IT Act, X Corp. has deliberately conflated the ‘blocking order’ issued under Section 69A with ‘takedown notices’ issued under Section 79(3)(b) of the IT Act.
While a blocking order can be issued only on the limited grounds provided under sub-section (1) of Section 69A, takedown notices can be issued by authorised agencies when intermediaries fail to perform their due diligence obligations under the IT (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021. By recasting 'takedown notices' as 'blocking orders', X aims to escape its due diligence obligations under the IT Rules, 2021.
The case before the Karnataka High Court is a sobering reminder of how foreign companies can use our cherished constitutional freedoms as a smokescreen for refusing compliance with domestic law.
Their refusal to comply not only challenges the sovereign function of the State but also puts millions of citizens in harm's way by exposing them to harmful content.