Rob Faris concluded his speech by introducing three panelists: Tattu Mambetalieva of the Civil Initiative on Internet Policy (Kyrgyzstan) and the OpenNet Initiative; Sunil Abraham of the Centre for Internet and Society (India); and Shahzad Ahmad of Bytes for All (Pakistan).

Tattu (I’ll be referring to her by first name for the sheer fact that it’s easier to type) begins by speaking about the history of the Internet in Kyrgyzstan; the Internet began to flourish as ISPs began competing across the country.  Kyrgyzstan has a population of 5 million people; only 800,000 use the Internet.  “We’re moving in the right direction, but we have many new challenges,” says Tattu.  “The price for Internet is extremely high for most.”  Filtering policies in Kyrgyzstan are similar to those in Kazakhstan.  “In the past couple of years, we have seen attempts to make the Internet equal to mass media.”  The CIS countries all control the Internet in a number of ways.  “For instance, intelligence,” says Tattu.

Sunil Abraham speaks up to respond to Rob Faris’s comments: “In America, there’s only 1% of Muslims, in India only 1% of Christians.  Yet, we have a lot of national channels with Christian preachers, but in America, there are no national channels featuring Muslims.”  He wonders if Internet filters are biased and asks: Is Internet freedom a human rights issue or a trade issue?  “Access to knowledge is a precondition of freedom of expression,” Abraham says.  “It’s all fine to talk about free expression in the political realm, but as soon as it influences intellectual property, we all have a problem with it.”

“In India,” Abraham says, “There is a preference for male children, and both abortion of girls and infanticide are common” (he calls it “the daughter deficit”).  The Indian government has asked Google, Microsoft, etc., to block sex determination kits from advertising and related searches.  Civil society finds this hard to resist.

Abraham mentions the Savita Bhabhi comic, one of the first websites to be filtered in India.  Rather than block porn, the government went after the comic.

Shahzad Ahmad notes that European governments have a lot at stake in the day-to-day processes; why don’t these governments raise their voices when other governments place bans on freedom of expression or on activists organizing for their rights?  “Probably it doesn’t suit them at that time,” says Ahmad, “they’re serving their own purposes.”

The censorship regime in Pakistan is bizarre; the first ever blocking incident in Pakistan was in February 2006, when Google’s Blogger platform was blocked for about two months.  “The weird thing about it is that they always block content, but there is always a political reason behind it,” says Ahmad.  When the Pakistani government blocks YouTube or another platform, it’s always political.  The Pakistani government blocks at the domain level and the IP level; there are a number of mechanisms in place.

Rob Faris comments: “I think that Sunil’s points are well taken; that different countries frame these issues in very different ways, shaped by their own cultural history as well as their current legal framework, and that the U.S. probably spent more time talking about the freedom of religious and political speech and less time talking about global access….I think that we need to find international accommodation in many of these areas.”

Tattu again comments: Political content is often the only thing controlled, while the rest of the Web is more or less free.

Renata Uitz of CEU comments: There is a diversity of mechanisms as well as a variety of intents when it comes to blocking.  “It would be nice to talk about an account of these seemingly local divergences and idiosyncrasies: how are some of the large providers trying to meet local requirements and protect themselves legally?”

“It would be interesting to see how aware various local speakers are of the international implications of the content being filtered.  Who is most vulnerable to self-censorship?” Uitz asks.

Tattu answers: “The most vulnerable are the countries that don’t have their own systems or satellites and must access the Internet through another country.”  Kyrgyzstan’s Internet is filtered; citizens asked for less-filtered access and got it.  “Countries with overlapping Internet should agree on what to filter and what not to filter.”

Abraham mentions policies such as France’s three-strikes policy.  He also discusses how global Internet memes can influence filtering: a South African meme on Twitter, #whatdarkiesaid, was deemed offensive by Black American Twitter users, who complained to Twitter.  The hashtag was then blocked.

Abraham also mentions Facebook’s decision to delete photographs of breastfeeding mothers, something often considered offensive in the US but commonplace elsewhere.

Abraham suggests the user communities in various countries need to speak up.

Ahmad speaks: “A major issue is hate speech.  I don’t know how to control it, but it’s a big issue, particularly in Pakistan.”  He mentions the Draw Mohammed page controversy; in this case, Facebook didn’t take down the page, though in a number of other cases it has removed pages.  Ahmad notes that in Pakistan, this is a major issue.  “There shouldn’t be double standards; Facebook should’ve responded to this if they respond to other issues of hate speech.”

Ahmad also notes that, as a result of the Facebook page, Pakistan blocked the entire domain, as opposed to simply the page.  YouTube was later blocked as well.  “Who suffered?” he asks rhetorically: “All of the students, NGOs, and citizens who use Facebook and other social networking sites for work, school, and activism.”

Ahmad notes that double standards from companies create real problems for people on the ground.

Uitz asks: “How important do you think it is to make judges and decision-makers aware of the technical aspects of blocking content that might be considered vicious or harmful?” (note: Pakistan claimed inability to block only a Facebook page, thus overblocking the entire site; this can be a technical issue).  “Who is best positioned to inform judges and decision-makers?”

Ahmad speaks up: On 19 May, the Pakistani government issued a total ban on Facebook.  Pakistan has about 18 million Internet users.  When the case went to court, the court advised the government to take the lead from China and Saudi Arabia and implement more stringent filtering mechanisms.  The judiciary and lawyers don’t understand how such things function, says Ahmad.  Their lack of knowledge can create further problems.

Abraham: “This is a large and complex problem.”  Microsoft organizes trainings for Indian judges in cyberlaw.  Large Western interests are in the business of training judges, which can be problematic.

Tattu: “We have a case where Uzbekistan asked Yahoo! to hand over user data.”  In this case, the data was handed over and people were imprisoned.  If there had been a policy not to reveal data, people would be safer.  Nowadays, major players (such as Yahoo! and Yandex) need to create their own policies.  It’s difficult for such companies to understand the issues in each country, so the players themselves must set the rules.

The floor is now being opened to the audience; I’ll be tweeting over at @jilliancyork, back for another panel later.