The Information Technology Amendment Rules, 2023 (ITAR 2023), have recently come up for discussion in the Indian judiciary. The Bombay High Court has observed that these rules offer no active protection for criticism of the government through parody or satire. The rules form part of the broader legal framework under the Information Technology Act, 2000, which recognizes and governs electronic commerce in India.
What are the Information Technology Amendment Rules, 2023?
ITAR 2023 imposes certain obligations on intermediaries, which include platforms that host online games, social media firms such as Facebook, YouTube and Twitter, and internet service providers such as Airtel, Jio and Vodafone Idea. Intermediaries must ensure they do not host content relating to the Central Government that has been marked as “fake or misleading” by a fact-check unit that the IT Ministry may notify. Additionally, hosting platforms must not permit harmful, unauthorized online games or their advertisements, or circulate false information regarding the Indian government.
Significance of Self-Regulatory Bodies
The new rules require gaming platforms to register with a Self-Regulatory Body (SRB), which will determine whether a game is permissible. Platforms must also ensure that the online games they offer involve no gambling or betting elements and adhere to all legal norms, standards and safety measures, such as parental controls.
The Concept of Losing Safe Harbour
Under the new IT rules, if the designated fact-check unit marks a piece of information as fake, intermediaries will have to remove it. Failure to do so would cost them their ‘safe harbour’ – the legal provision that shields them from lawsuits over third-party content. In practice, social media sites will be required to take down offending posts, and internet service providers will have to block the URLs of such content.
Key IT Rules, 2021 for Social Media Platforms
The IT Rules, 2021 demand greater diligence from social media platforms over the content they host. Intermediaries must act within 24 hours of receiving complaints about content that infringes the safety and dignity of individuals. They must also educate users about their privacy policies, including the consequences of circulating copyrighted material or anything that is defamatory, racially or ethnically objectionable, paedophilic, threatening to the unity, integrity, defence, security or sovereignty of India, or in violation of any law in force.
Concerns Over the Amendment
Major concerns about the amendments include the lack of a clear definition of ‘fake news’, which leaves that determination to the government’s fact-check unit. The use of undefined terms, especially the phrase “any business”, allegedly gives the government open-ended power to decide what people can view, hear and read on the internet. The rules also provide no explicit guidelines on what qualifies as false or misleading information, or on the qualifications and procedures of the fact-check unit. A significant concern is the regulation’s potential conflict with the Supreme Court’s judgment in Shreya Singhal vs Union of India (2015), which held that a law limiting speech cannot be vague or over-broad.
Proposed Way Forward
In response to these concerns, it is suggested that the government and intermediaries leverage technology, such as algorithms and fact-checking websites, to combat misinformation and fake news. Intermediaries could also adopt self-regulatory measures, such as monitoring content and collaborating with fact-checking websites. Public awareness of the risks of censorship, together with the promotion of free speech, can be fostered through social media campaigns, workshops and public discussions.