This article discusses the amendments proposed in 2018 to the Information Technology (Intermediaries Guidelines) Rules, 2011. Concerns have repeatedly been raised before the Apex Court about trolling, character assassination, fake news and invasion of privacy carried out in the name of public order and the freedom of speech and expression that the Constitution guarantees to citizens and intermediaries. To curb this misuse of social media platforms, the government initiated the amendment. The article examines why and how intermediaries should be made more liable for the content they allow to be published and transmitted, what kinds of content are prohibited, and how that liability is created. It discusses how these changes to media law help the Judiciary act as a catalyst in resolving social media disputes through progressive amendments, and identifies the flaws in the earlier framework. Concepts such as safe harbour protection and the chilling effect are examined. It also considers the role the government plays and how implementation must be the responsibility of the authorised body, and how law plays a vital role in bringing about the changes needed to curb the evils of society. Lastly, it looks at how the amendment affects society, intermediaries and the legal profession, as well as the lacunae that remain in the present amendment.
Liabilities regarding Social Media
It is widely evident that the increase in people's participation on the Internet and other social media platforms has brought trouble along with advancement and ease of communication. Anything that offers ease and accessibility also opens the gates for predators to misuse it for purposes that are not morally, ethically or legally correct. Acts such as publicly assassinating someone's character on platforms like Facebook or Instagram violate that person's rights and are legally forbidden. The question that arises is how, and on whom, liability should be fixed for such acts, committed daily by large numbers of people, intentionally or unintentionally. Should individuals alone be held liable, or does the host through which such acts took place also bear some responsibility? These hosts are the intermediaries that provide services enabling users to display their content online and share it with the world. Section 2(w) of the Information Technology Act, 2000 defines the term 'intermediary'. [1] Recently, similar concerns have been raised before the Apex Court. A bench of Justices Ashok Bhushan and K.M. Joseph observed that it is a matter of trial whether Google LLC (the parent company) or Google India Pvt. Ltd. was the actual intermediary that hosted the allegedly objectionable content against an asbestos sheet manufacturer. "We hold that Section 79 of the Act, prior to its substitution, did not protect an intermediary in regard to the offence under Section 499/500 (defamation) of the IPC," the bench said. [2] This landmark case sets a new example against the orthodox practice that had previously been followed.
The Information Technology (Amendment) Act, 2008 exempted intermediaries from liability for the wrongful acts that readily took place on their platforms. It contained no provisions to punish intermediaries for enabling social abuse, which is why there was a dire need for this amendment. The amended Rules now place a duty on intermediaries to observe due diligence and, among other things, to publish rules and regulations, a privacy policy and a user agreement for access; these rules must inform users not to host, publish, upload, display, modify or share any information that is harmful, harassing, derogatory, obscene, invasive of another's privacy, hateful, ethically objectionable or otherwise unlawful in any manner, as provided in Rule 3(2) of the amended Rules.[3] Rule 3(9) of the Amendment requires intermediaries to deploy technology-based automated tools or appropriate mechanisms, with appropriate controls, for proactively identifying and removing or disabling public access to unlawful information or content.[4]
There has been a sharp rise in the number of cases of mob violence and chaos caused by the spread of fake news and the absence of credible, authorised sources. The problem that still needs fixing, along with a workable mode of implementation, is how host companies or intermediaries are to keep a check on the information being posted or shared when its volume multiplies and grows exponentially every minute. Expecting intermediaries to act proactively is therefore somewhat unfair and impractical. Censoring content would require revised rules and checks at every point, demanding more time and money from intermediaries. This can harm small entrepreneurs who are not yet stable or financially fit enough to take up the task; by imposing additional costs, it would again skew competition and profits towards well-established players only. Another likely issue is excessive private censorship, possibly undertaken to grab attention. The ambiguity and vagueness of definitions such as 'unlawful content' will differ for every intermediary, causing further confusion because the standard is subjective in nature, and will invite broad and expansive interpretations. For example, newsworthy information could be blocked from publication over differences of opinion if it is treated as hate speech. Many such problems are likely to arise. These are lacunae that still need consideration and solution.
It would be helpful, and should be encouraged, for intermediaries to produce reports detailing the agencies that take down information or request takedowns, and briefly explaining what was unlawful in the posted content. Political criticism and dissent against what is wrong are an essential part of every democracy. This right should not be taken away from individuals under wrongful accusations of unlawful content, nor should they be suppressed from raising their voices. Hence, this amendment to the intermediary rules under the Information Technology Act did make a difference and offers a ray of hope for improving the current situation, but it still has room for improvement.
[1] Software Freedom Law Centre, Intermediaries, Users and the Law: Analysing Intermediary Liability and the IT Rules;
[2] Google India Private Ltd. v. M.S. Visakha Industries, Criminal Petition No. 7207 of 2009;
[3] The Information Technology [Intermediaries Guidelines (Amendment)] Rules, 2018;
[4] Comments on the Draft Intermediary Guidelines Rules, 2018, published by the Ministry of Electronics and IT, Government of India, 08-02-2019.