
Balancing Social Media Regulation with Free Speech

Facebook, one of the major social media platforms, has established the Oversight Board, an independent body that reviews its content moderation decisions.

The IT Rules, 2021

  • Regulating social media intermediaries (SMIs): Governments around the world are debating how to regulate social media intermediaries (SMIs).
  • Addressing concerns about SMIs limiting free speech: Governments need to update their regulatory frameworks to deal with new challenges, given the complexity of the issue, the significance of SMIs in influencing public discourse, the impact of their governance on the freedom of speech and expression, the volume of information they host, and the ongoing technological advancements that affect all of these.
  • To address these difficulties, India replaced its decade-old regulations on SMIs with the IT Rules, 2021, which were largely designed to impose requirements on SMIs in order to guarantee an open, safe, and trustworthy internet.

Recent amendments

  • The stated goals of the proposed revisions, released as a draft in June 2022, were threefold.
  • Safeguarding constitutional rights: To ensure that major IT companies do not violate the constitutional rights and interests of internet users.
  • Grievance resolution: To strengthen the Rules’ framework for resolving user complaints.
  • Avoiding undue compliance burdens: Early-stage Indian start-ups should not be adversely impacted by compliance with these Rules.
  • This resulted in a series of changes that may be broadly divided into two groups.
  • Additional obligations on SMIs: The first category adds further requirements on SMIs to ensure stronger protection of user interests.
  • Appeal procedure: The second category establishes an appellate mechanism, the Grievance Appellate Committees (GACs), for grievance redressal.

Social media: a double-edged sword

  • Social media platforms routinely moderate user content on their services. Users who break a platform’s terms and conditions may have their content removed or deprioritised, or their accounts suspended.
  • Government has too much power: The government’s current restrictions on internet speech are untenable in today’s online environment. Social media now has millions of users, and platforms have democratized public participation and shape public discourse.
  • Hate speech on the internet: As Internet usage grows, so do its potential drawbacks; there is now more harmful and unlawful content online.
  • Disinformation campaigns: Recent examples include hate speech against the Rohingya in Myanmar and disinformation spread on social media during the COVID-19 pandemic.

Striking a balance between regulation and free speech

  • Government takedown orders must follow due process: Government orders to remove content must be necessary and proportionate, and must also adhere to due process.
  • The European Union’s recent Digital Services Act (DSA), which governs intermediary liability in the EU, is a useful example: it requires government takedown orders to be reasonable and proportionate.
  • Platforms should be able to challenge government orders: The DSA gives intermediaries the opportunity to defend themselves and contest a government decision to censor content. Such procedures firmly protect online users’ right to free speech. Most significantly, an intermediary law must devolve important content moderation decisions to the platform level.
  • A co-regulatory model: Platforms should be responsible for moderating content in accordance with general government regulations. Implementing such a co-regulatory framework would accomplish three goals.
  • Platforms retain reasonable control over their terms of service: Co-regulation gives them the freedom to set evolving criteria for harmful content, doing away with the need for rigid government rules. This advances free expression online, because heavy-handed government control encourages private censorship, and the effect of private censorship is to suppress user speech.
  • Platforms are subject to the rule of law: As content moderators, platforms wield significant power over users’ freedom of speech. Whenever they remove content or address user complaints, platforms must adhere to due process and make proportionate judgements. To resolve user complaints properly, they must adopt procedures such as notice, hearings, and reasoned orders.
  • Algorithmic transparency: Openness about the algorithms that rank and moderate content can boost platform accountability.


The Grievance Appellate Committees (GACs) need to be reviewed because they give the government greater control over censorship. The statute expected to replace the IT Act is a proposed Digital India Act. This is the ideal time for the government to adopt a co-regulatory approach to governing online speech.
