
In Focus by The Hindu

What are the factors at play in content moderation?

Apr 17, 2025
45:04

Every day, millions of posts are made online — tweets, videos, memes, reels. Some content is violent, misleading, or even dangerous. 

This is where content moderation comes in. However, deciding what stays up and what comes down isn't as simple as it sounds.  

In fact, X has sued the Union government in the Karnataka High Court over the SAHYOG portal, which it calls a “censorship portal” that allows local police and different parts of the government to demand takedowns. The Karnataka High Court did not grant interim relief to X after the Centre informed the court that there was no reason for the social media platform to be apprehensive of any coercive action against it. The matter will be taken up on April 24.

Taking down content is actually quite common in India. In 2024, the government blocked around 28,000 URLs across various social media platforms. These URLs carried content linked to pro-Khalistan separatist movements, hate speech, and material considered a threat to national security and public order.

A recent report in The Hindu says that nearly a third of the 66 takedown notices sent to X by the Ministry of Home Affairs’ Indian Cyber Crime Coordination Centre (I4C) over the past year direct the platform to remove content about Union Ministers and Central government agencies.

This included content about PM Narendra Modi, Home Minister Amit Shah and his son Jay Shah, and Finance Minister Nirmala Sitharaman.  

Globally, too, platforms have come under criticism for content moderation, or the lack of it. Facebook’s role in amplifying hate speech during the Rohingya crisis in Myanmar is one such example. In the U.S., Twitter’s internal communications — revealed in the so-called “Twitter Files” — sparked a debate about political bias and backchannel moderation. Instagram users have repeatedly flagged an increase in graphic content.

Countries are responding to this challenge in very different ways. The European Union is pushing for algorithmic transparency and accountability with its Digital Services Act. The U.S. has taken a hands-off approach despite several controversies. In India, the government and law enforcement agencies flag content to be taken down.

So, who gets to decide what free speech looks like in the digital age? Is it the government, the platforms themselves, or the public? And how do we draw the line between harmful content and healthy debate?


Guest: Dr. Sangeeta Mahapatra, Research Fellow at the German Institute for Global and Area Studies

Host: Nivedita V

Edited by Sharmada Venkatasubramanian.
