

Blacklisting Blacklists: The Information Technology (Amendment) Act, 2008 and ISP-driven content filtering

The changeover in India from the Information Technology Act, 2000 to that Act as amended by the Information Technology (Amendment) Act, 2008 throws up some interesting issues on ISP liability and ISP-driven content filtering. The entire tenor of the safe harbour provisions for ISPs has been changed by the amendments. Safe harbour provisions are the provisions of a statute that limit a party’s liability under the statute on the condition that the party’s acts were in good faith. Post amendment, ISPs in India no longer have an obligation of preventive due diligence with regard to illegal content made available by them. In fact, the safe harbour provisions are so stringent that even the use of voluntary measures of filtering objectionable content, such as blacklists, could cause an ISP to lose safe harbour protection. The resulting legal position, which makes pro-activity in the prevention of crime detrimental to the ISP itself, is untenable, writes Devika Singh.

Introduction

The winter of 2004 saw Avnish Bajaj, chief executive officer of Baazee.com, the Indian subsidiary of the US-based auction portal eBay, spending his days in Delhi’s Tihar prison. His liability stemmed from the fact that his website was allegedly selling a video CD containing obscene content prohibited under anti-pornography laws in India. He was therefore arrested under section 67 of India's Information Technology Act, which relates to the transmission of obscene material through electronic media. Avnish Bajaj had probably never even known of the listing, let alone seen the obscene material itself. Yet, he was a man marked for crimes not committed by him and was incarcerated along with seventy other undertrials accused of crimes ranging from pick-pocketing to rape and murder. In the aftermath of this incident, it became evident that Indian law, when determining the liability of internet service providers, did not conform to international jurisprudential trends and that its application could lead to many unintended consequences.

On the issue of internet service provider (“ISP”) liability, the world is divided between horizontal and vertical approaches. Certain countries believe in a horizontal approach to ISP liability, i.e., one liability regime applicable to any infringement regardless of the area of law, be it copyright, defamation or privacy rights. Others go by the vertical approach, where liability limitations for ISPs are defined according to the area of law in which the offence occurs.

As it stands, in part as a reaction to Avnish Bajaj’s prosecution, India shall now embrace the horizontal approach followed by the European Union’s Directive 2000/31/EC on certain legal aspects of electronic commerce in the Internal Market ("EU Directive"). The Information Technology Act, 2000 (“IT Act”) has been redrafted by the Information Technology (Amendment) Act, 2008 (“Amendment Act”), which received presidential assent on February 5, 2009. This changeover throws up some interesting issues on ISP liability, as the entire tenor of the safe harbour provisions for ISPs has changed with the Amendment Act.

Through the Amendment Act, the law has turned away from the precautionary or preventive approach expected of ISPs in terms of “preventive due diligence” under the IT Act. Preventive due diligence has no place under the Amendment Act, and in this author’s opinion, an exercise of preventive due diligence may in fact lead to a loss of the immunity available to ISPs. This paper expounds this issue and related dilemmas in the context of ISP-driven content filtering as the law on ISP liability moves from the IT Act to the Amendment Act.

ISPs and user-generated content

Since its inception, people have been able to use the internet to communicate and interact with other users. Early internet technology focused on the use of email, internet messenger applications, newsgroups, web forums and internet relay chat. This was followed by the consumption of content on the internet through directories and search engines. In addition to the use of specific communication applications, several forms of ISPs became involved in the online delivery of content (whether data, documents or audio-visual works) to users. Users also began to use platforms such as blogs and social networking sites, including MySpace and Facebook, to combine communication with other users with the production of their own “user-generated content.”

As more and more users got onto the internet bandwagon, increased interaction led to increased conflict. The increased availability of user-generated web content, particularly content of an objectionable, illegal or offensive nature, demanded measures of content management.

In response, industry, government and internet users developed measures to mitigate online risks through educational initiatives, legislative action and technological solutions. Network service providers, or ISPs, were identified as the optimal gatekeepers for user-generated content, and there followed a stress on ISP liability for user-generated content. However, as the commercial and legal ramifications of this bias came to be understood, legislatures started introducing safeguards for ISPs: conditions for immunity popularly known as “safe harbour” provisions.

Safe harbour provisions are the provisions of a statute that limit a party’s liability under the statute on the condition that the party’s acts were in good faith. Thus, safe harbour provisions protect legitimate or excusable violations. In the context of legislation governing the internet, safe harbour provisions are generally available to ISPs to limit their liability for third party acts on the internet. In India too, a safe harbour has been provided to ISPs for user-generated content under the IT Act.

Safe harbours in the IT Act

Under the IT Act, a safe harbour is available for ISPs under section 79. Section 79 hitherto exempted network service providers from liability for any third party information or data made available by them if the network service provider could prove that:
  • the offence or contravention was committed without his knowledge; or

  • he had exercised all due diligence to prevent the commission of such offence or contravention.

ISPs could therefore gain access to the safe harbour by satisfying the twin requirements of lack of knowledge (actual or constructive) and preventive due diligence. Preventive due diligence is understood to be an ISP’s duty of care to investigate whether the material it hosts or refers to is unlawful and, where it is found to be so, to block access to the same. Practically, preventive due diligence translates into documented due diligence, which includes compliance with the IT Act, related rules, notifications etc., and the deployment of technological measures of prevention.

Technological measures of prevention include content filtering. Content filters identify, select and block inappropriate web content based on lists of inappropriate keywords and phrases (“Blacklists”). Methods of compiling blacklists vary between vendors, with some using advanced software analysis techniques to scan the internet for inappropriate or illegal material and others relying entirely on human assessors. Given the dynamic nature of much internet content, blacklists must be constantly reviewed and updated. For the purposes of most blacklists, inappropriate content predominantly covers sexually explicit content, profanity and racist content.
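By way of illustration only, the sketch below shows the basic mechanics of such a blacklist filter, assuming a toy list of keywords and phrases; the entries and function names are hypothetical placeholders for the far larger, vendor-maintained and constantly updated lists described above.

```python
# A minimal sketch of keyword-and-phrase blacklist filtering (hypothetical
# entries; real vendor blacklists are far larger and constantly updated).
import re

BLACKLISTED_KEYWORDS = {"bannedword1", "bannedword2"}           # hypothetical
BLACKLISTED_PHRASES = ("an illegal phrase", "a banned phrase")  # hypothetical

def is_blocked(page_text: str) -> bool:
    """Return True if the page matches any blacklisted keyword or phrase."""
    text = page_text.lower()
    # Keyword check: split the page into word tokens and intersect with the list.
    tokens = set(re.findall(r"[a-z0-9']+", text))
    if tokens & BLACKLISTED_KEYWORDS:
        return True
    # Phrase check: simple substring matching on the lower-cased page.
    return any(phrase in text for phrase in BLACKLISTED_PHRASES)

print(is_blocked("A page quoting an illegal phrase verbatim."))  # True
print(is_blocked("An entirely innocuous page."))                 # False
```

Production filters also weigh context, handle obfuscated spellings and classify images, none of which this sketch attempts.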

Blacklist filters can be implemented at the ISP level, whereby the filter software is installed on a server within the ISP’s network and the filtering occurs upstream of the user. Blacklist filters have been recognized as a technically feasible method of monitoring for, and preventing the reappearance of, previously identified illegal content on an ISP’s network.
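Again purely as a sketch, the fragment below illustrates where such a filter sits in the request path: the decision runs on a server inside the ISP’s network, upstream of the subscriber, so blocked material never reaches the user. The blocked host and the pass/block function are hypothetical; actual deployments use mechanisms such as transparent proxies or DNS-level blocking rather than this simplified check.

```python
# A sketch of an ISP-level blacklist check running upstream of the user:
# a request is inspected inside the ISP's network before being forwarded.
# The blocked host below is a hypothetical placeholder.
from urllib.parse import urlparse

BLOCKED_HOSTS = {"blocked.example.com"}  # hypothetical, kept under review

def filter_request(url: str) -> str:
    """Decide, before forwarding, whether a requested URL is served or blocked."""
    host = urlparse(url).hostname or ""
    if host in BLOCKED_HOSTS:
        return "BLOCK: serve a block page; never forward to the origin server"
    return "PASS: forward the request to the origin server as normal"

print(filter_request("http://blocked.example.com/page"))    # BLOCK ...
print(filter_request("http://permitted.example.org/page"))  # PASS ...
```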

In fact, the world over, ISPs have come under increasing pressure from users and legislators (in the UK, Australia and Italy, amongst other countries) with regard to blocking illegal and offensive content. In some countries, such as China, ISPs are obligated by law to monitor and control transmitted content. In India too, the IT Act requires an ISP to show the exercise of due diligence in preventing the commission of an offence or contravention. Blacklist filters have therefore become popular with ISPs as a tool of documented due diligence. However, all this seems set to change with the amendments introduced to the IT Act by the Amendment Act.

Changes made by the IT (Amendment) Act, 2008

Under the Amendment Act, the amendments to Section 79 of the IT Act state that an ISP shall not be liable under any law for the time being in force for any third party information, data or communication link made available by him, except when:
  • the intermediary has conspired or abetted in the commission of the unlawful act; or

  • upon receiving actual knowledge, or on being notified by the appropriate Government or its agency that any information, data or communication link residing in or connected to a computer resource controlled by the intermediary is being used to commit the unlawful act, the intermediary fails to expeditiously remove or disable access to that material on that resource without vitiating the evidence in any manner.

Therefore, the twin requirements under the IT Act have been remoulded. The requirement of knowledge has now been expressly changed to the receipt of actual knowledge. Actual knowledge here may mean the receipt of information from a third party, but not knowledge derived from the ISP’s own inquiry into the content of the information. This has been combined with a “notice and take down” duty. Preventive due diligence has been done away with, and the ISP is only required to prove that it did not conspire in or abet the commission of the unlawful act. These changes seem advantageous to ISPs, as they set more lenient parameters for qualifying for the safe harbour. The fallout, however, is their profound impact on ISP-driven content filtering in India.
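Purely to make the new duty concrete, the hypothetical sketch below models a “notice and take down” response: upon actual knowledge or a government notification, the intermediary expeditiously disables access and, so as not to vitiate the evidence, preserves an archival copy before removal. The storage layout and function names are illustrative assumptions, not requirements spelt out in the Amendment Act.

```python
# A hypothetical model of a takedown that preserves evidence: archive a copy
# of the notified material first, then disable public access to it.
# Paths and names are illustrative assumptions, not drawn from the Act.
import shutil
from datetime import datetime, timezone
from pathlib import Path

EVIDENCE_DIR = Path("evidence_archive")  # hypothetical preservation store

def take_down(content_path: Path, notice_ref: str) -> None:
    """Preserve an evidentiary copy, then remove the material from public access."""
    EVIDENCE_DIR.mkdir(exist_ok=True)
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    # Copy first, so the takedown itself cannot vitiate the evidence.
    archived = EVIDENCE_DIR / f"{notice_ref}_{stamp}_{content_path.name}"
    shutil.copy2(content_path, archived)
    # Only then disable access by removing the publicly served file.
    content_path.unlink()

# Usage (assumes the notified file exists on the intermediary's resource):
# take_down(Path("public_html/notified_item.mp4"), notice_ref="GOVT-2009-001")
```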

Content filtering and the Amendment Act

Apart from requiring satisfaction of the new twin requirements explained above, the Amendment Act also concentrates on the functions and services of an intermediary when conferring the safe harbour. Under the Amendment Act, the safe harbour is only available to an ISP in the following circumstances:
  • where the function of the intermediary is limited to giving access to a communication network over which information made available by third parties is transmitted or temporarily stored; or

  • where the intermediary (i) does not initiate the transmission, (ii) does not select the receiver of the transmission, and (iii) does not select or modify the information contained in the transmission.
Borrowed from Article 12 of the EU Directive, these conditions are explained in the recitals to the Directive to mean that the ISP should be providing services of a passive or automatic nature, acting as a mere conduit for the information and not serving any editorial or controlling function in its creation or dissemination. The EU Directive places particular stress on the absence of modification of the information contained in the transmission as a condition precedent to availing of the safe harbour. It clarifies, however, that this requirement does not cover manipulations of a technical nature, as these do not alter the integrity of the information contained in the transmission. The recitals to the EU Directive also explain that these provisions should not preclude the development and effective operation, by different interested parties, of technical systems of protection and identification. Arguably, blacklist filtering is not precluded as a technical system of protection.

Unfortunately, such clarifications have been omitted from the Amendment Act. The practical effect of this legislative silence is that an Indian ISP is no longer required to take any preventive steps with regard to user-generated content that is objectionable or illegal, even when it is in a position to deploy technically feasible preventive measures such as blacklist filtering.

Going one step further, to avail of the safe harbour provisions, the Amendment Act requires that an intermediary must not select or modify information contained in a transmission. Since modification by technical means has not been addressed, and content filtering would amount to selecting and modifying information contained in a transmission, this suggests that even voluntary measures of self-regulation adopted by ISPs to prevent illegal or objectionable content on their networks may run afoul of the law and cause ISPs to lose the protection of the safe harbour provisions under the Amendment Act.

The changes to Section 79 are a disincentive to ISP pro-activity. To keep in line with the Amendment Act, an Indian ISP will have to deactivate any filtering systems it has in place. It may suspect the presence of objectionable content on its network or website, including infringing content, but until actual knowledge is received it will have to remain oblivious to such content and actually look the other way, as it cannot possibly take the risk of selecting or modifying content. This extreme position is hardly desirable in a commercial and legal context and must be reconsidered. Although ISPs should not be required to monitor their services, they should not be discouraged from doing so. In certain instances, due to its technological proximity to its users’ conduct, the ISP may be the party best situated to detect and eliminate illegal content.

Conclusion

Blacklisting blacklists cannot be the way forward. It is laudable that the Amendment Act attempts to protect intermediaries like Avnish Bajaj so comprehensively by emphasizing control as the turning point for their liability. However, the one-sidedness of these provisions overlooks the clarifications contained in the EU Directive and the detriment that this situation, in practice, works upon other stakeholders. This is glaringly unfair where it is technically feasible for the ISP to prevent the posting of certain user-generated content which is illegal, and where, in its discretion, the ISP wishes to be a good Samaritan. It is farcical to adopt a legal position that penalises pro-activity in the prevention of crime. A different path, balancing prevention and cure, needs to be found.

DEVIKA SINGH is a Senior Associate in the New Delhi office of Kochhar & Co., one of the leading and largest corporate law firms in India. Devika’s experience includes Major Projects (Infrastructure), Corporate/M&A, Employment and Labour, Information Technology and Dispute Resolution.
 