Renata Uitz of Central European University welcomes Rob Faris, research director of the Berkman Center and the OpenNet Initiative.
“A bunch of smart people invented the Internet,” says Faris, highlighting the wonderful ways in which the Internet brought millions of people together. “People began using the Internet for various other things too – porn, making fun of religion and national leaders…On the Internet we have the good, the ugly, and the illegal, and a whole lot of it.”
Faris asks us to pause and think about how we might draw a line between that which is offensive or very offensive and that which is illegal.
“Governments then caught on and started to think about how to rein in activity on the Internet,” says Faris. “The US Congress tried to block content on the Internet,” but the Supreme Court repeatedly struck those efforts down; as a result, the only large-scale filtering in the United States exists in schools and libraries. Elsewhere, Internet filtering is pervasive: Saudi Arabia held back on wider deployment of the Internet until it could install wide-scale filtering. Iran was an early adopter of filtering, and China also filters heavily. Singapore, on the other hand, set up an aggressive filter but then decided to block only a symbolic handful of sites.
ONI conducted its first survey of Internet filtering in 2007 and found that more than 30 countries were filtering the Internet. Faris then takes us through the various generations of filtering:
- First-generation controls: simple technical filtering.
- Second-generation controls: putting laws and regulation in place, such as requiring bloggers to register in order to use the Internet. Other approaches include takedowns, threats against bloggers and Internet users, and cyber attacks against websites and hosts.
- Third-generation controls often operate outside of government: intermediary censorship, or state-sponsored propaganda pushers like China’s 50 Cent Army and a Russian team of bloggers.
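Faris does not go into the mechanics, but the “simple filtering” of the first generation typically amounts to checking each requested hostname against a state-maintained blocklist at a national proxy or DNS resolver. A minimal sketch of that idea (the domain names and blocklist here are hypothetical, not from the talk):

```python
# Sketch of first-generation filtering: a hostname blocklist check,
# roughly what a national proxy or DNS resolver might perform.
# All domain names below are hypothetical examples.

BLOCKLIST = {"blocked.example.com", "banned.example.org"}

def is_blocked(hostname: str) -> bool:
    """Return True if the hostname or any parent domain is on the blocklist."""
    hostname = hostname.lower().rstrip(".")
    parts = hostname.split(".")
    # Check the full name and each parent domain, so subdomains inherit blocks.
    for i in range(len(parts)):
        if ".".join(parts[i:]) in BLOCKLIST:
            return True
    return False

print(is_blocked("news.blocked.example.com"))  # True: a parent domain is listed
print(is_blocked("open.example.net"))          # False: not listed
```

The crudeness of this approach (overblocking subdomains, no review of the list itself) is part of what drives the move to the second- and third-generation controls described above.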
Countries continue to expand and refine their filtering strategies: we’ve recently seen increases in filtering around elections and other events, such as in Bahrain and Kyrgyzstan. Filtering of search results is a new phenomenon as well, seen in China as well as Argentina. This second generation of controls allows more flexible control.
Intermediary censorship complicates matters further; sites are often outside a country’s jurisdiction, making it difficult to enlist their help. China has managed to work around this, for example by convincing Google in 2004 to filter results.
Things to keep our eyes on: increases in filtering in Western countries, with a focus on child pornography. “We’re still waiting to see what Australia is going to do.”
“None of the things I’ve mentioned fully address the issues of content regulation,” says Faris. Filtering lists are almost never open for public comment, and they are often maintained by private companies. In countries that have adopted strict controls, those controls are usually carried out by executive decree, with little involvement from the judiciary.
“One thing that every country has in common is that they struggle with the various elements of regulatory policy.” Governments weigh the political costs of carrying out filtering against the political costs of inaction. In some cases they err on the side of openness; in other places, on the side of caution.
Faris mentions that the role of intermediaries, such as social media sites, is increasing over time. As more speech takes place on these platforms, that speech is bound by terms of service; in some cases the TOS are more restrictive than government restrictions (in other places, less restrictive). In either case, the overlap is sometimes troubling.
“As the Internet is increasingly international, we increasingly need to look for international solutions,” concludes Faris. We will next hear from colleagues from Kyrgyzstan, Pakistan, and India.