Recently, Indian officials summoned Wikipedia representatives over misleading information published on a national cricketer’s page. The incident has sparked discussion about content moderation, the extent of government power to regulate online content, and the responsibilities of intermediaries like Wikipedia.
Understanding Content Moderation
Content moderation is the process of reviewing user-generated material to ensure it complies with a platform’s guidelines and rules for suitability before it is published. It plays a significant role in maintaining a safe online environment free of misinformation and harmful content.
Wikipedia: An Overview
Founded in 2001, Wikipedia is a free online encyclopedia built on an open, collaborative editing model. It is operated by the nonprofit Wikimedia Foundation and maintained by a community of volunteers who edit existing pages and create new ones. In effect, Wikipedia functions as an intermediary that hosts user-generated content.
Responsibility and Accountability for Content on Wikipedia
While most online content laws grant intermediaries immunity from liability for user-generated content, intermediaries are expected to exercise due diligence over their platforms. Although the Wikimedia Foundation does not own the content hosted on Wikipedia, it can “contribute, monitor or delete content” to ensure legal compliance. It could therefore be held accountable for illegal content hosted on Wikipedia.
Governmental Power over Online Content: The IT Act and Rules
The Information Technology Act, 2000 (IT Act) and the rules framed under it empower the government to regulate online content.
Section 69 of the IT Act authorizes the Central and State governments to intercept, monitor, or decrypt information generated, transmitted, received, or stored in any computer resource, while Section 69A empowers the Central Government to direct any agency or intermediary to block public access to certain information. Under the Act, ‘intermediaries’ include service providers of various kinds, search engines, online auction sites, online marketplaces, and cyber cafes.
Moreover, Section 79 of the IT Act provides intermediaries with a “safe harbour” that exempts them from liability for the third-party content they host, provided they observe due diligence requirements. However, the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 list categories of information that platforms must not host, including any content that is patently false, untrue, or misleading.
The Case of Wikimedia Foundation
Although it does not own the information hosted on Wikipedia, the Wikimedia Foundation can be held accountable under Indian law if it fails to remove illegal content after receiving “actual knowledge” of it. An intermediary acquires ‘actual knowledge’ when it is notified of offending content, and asked to remove it, through a court order or an order from the appropriate government agency.
Understanding Cyber Security Incident Reporting in Indian Law
Section 70B of the IT Act mandates certain entities, including service providers, intermediaries, data centres, and body corporates, to report cyber security incidents to the Indian Computer Emergency Response Team (CERT-In) within a reasonable time of their occurrence. This mandate reinforces the collective responsibility of all entities in maintaining cyber security and ensuring a safe online environment.