Jillian C. York

Jillian C. York is a writer and activist.


Thoughts on Twitter’s Latest Move

Today, Twitter announced a new system that will allow the company to geolocationally block (or, to use their terms, “withhold”) specific tweets in specific countries. On the company blog, Twitter explained:

We haven’t yet used this ability, but if and when we are required to withhold a Tweet in a specific country, we will attempt to let the user know, and we will clearly mark when the content has been withheld. As part of that transparency, we’ve expanded our partnership with Chilling Effects to share this new page, http://chillingeffects.org/twitter, which makes it easier to find notices related to Twitter.

It’s been difficult to comment on the move given the extreme reaction by Twitter’s own community. Between the “I told you so” from conspiracy theorists who think this happened because of Saudi Prince Alwaleed’s stake in the company, and the #occupy crowd continuing to claim their hashtag was censored in Twitter’s trending topics, I wanted to avoid the subject entirely. But alas.

Let’s be clear: This is censorship. There’s no way around that. But Twitter is not above the law. Just about every company hosting user-generated content has, at one point or another, gotten an order or government request to take down content. Google lays out its orders in its Transparency Report. Other companies are less forthright. In any case, Twitter has two options in the event of a request: fail to comply, and risk being blocked by the government in question, or comply (read: censor). And if they have “boots on the ground”, so to speak, in the country in question? No choice.

In the event that a company chooses to comply with government requests and censor content, there are a number of mitigating steps the company can take.  The most important, of course, is transparency, something that Twitter has promised.  Google is also transparent in its content removal (Facebook? Not so much).  Twitter’s move to geolocate their censorship is also smart, given the alternative (censoring it worldwide, that is) – particularly since it appears a user can manually change his or her location.

I understand why people are angry, but this does not, in my view, represent a sea change in Twitter’s policies.  Twitter has previously taken down content–for DMCA requests, at least–and will no doubt continue to face requests in the future.  I believe that the company is doing its best in a tough situation…and I’ll be the first to raise hell if they screw up.

7 things you might soon be able to say on television

Via the Center for Democracy and Technology:

Today the Supreme Court will hear arguments in FCC v. FOX to determine whether regulation of “indecent” content on broadcast television violates the First Amendment. This case has been up to the Supreme Court before; in 2009, the Court held that the FCC’s decision to fine FOX for broadcasting profanity (called “fleeting expletives”) during live award shows (the 2002 and 2003 Billboard Music Awards) was not “arbitrary and capricious”, and so did not violate the Administrative Procedure Act that governs how federal agencies can make and change their policies. (CDT filed a brief in both the 2009 and 2011 cases.)

This time around, the Court is addressing a different question: whether FCC regulation of indecent (but not illegal) over-the-air content is consistent with the First Amendment. In the 1978 Pacifica case, the Court held that because broadcast media was “uniquely pervasive” in American culture, serving as the principal source for news and entertainment in a time before 500-channel cable packages, and acted as an uncontrollable “intruder” into the home, it was appropriate for the government to put some limits on what type of content could travel through the airwaves.

Regardless of whether this rationale made sense in 1978, it no longer applies in the media environment of 2012. As we argue in our coalition brief, the centrality of broadcast content has waned in the face of other content sources (including cable, video-on-demand, and the Internet). At the same time, parents have never had a greater ability to set their own limits and controls on the type of content they believe is most suitable for their families. As the Court recognized in Reno v. ACLU, user empowerment tools that give individuals the power to set their own content restrictions are a less restrictive means to achieve the goal of protecting children than broad government content regulations of constitutionally protected speech. As the Court considers the arguments it hears today, we urge it to consider the changed technological circumstances of the past three decades, and extend to broadcast content the same level of First Amendment protection afforded to other speech.

And with that, I give you this:

More on Internet Censorship in Libraries: ACLU vs. Salem Public Library

I haven’t set foot in a physical library for at least three years, so it’s somewhat amusing to me that I’m suddenly obsessed with the question of Internet censorship in libraries. And yet, it’s a vital discussion: As more of our resources go digital, ensuring that information in our libraries stays free and unfettered becomes increasingly important.

So, last week I posted about a debate in Los Angeles, as framed by the LA Times. Now, I’ll tackle a somewhat tangential issue: The use of commercial software by libraries, schools, and other government-funded entities and the implications of that usage.

Recently, the ACLU and the ACLU of Eastern Missouri filed suit against the Salem (MO) public library for unconstitutionally blocking access to websites discussing minority religions by improperly classifying them as “occult” or “criminal.” According to the ACLU:

Salem resident Anaka Hunter contacted the ACLU after she was unable to access websites pertaining to Native American religions or the Wiccan faith for her own research. After protesting to the library director, Glenda Wofford, portions of the sites were unblocked, but much remained censored. Wofford said she would only allow access to blocked sites if she felt patrons had a legitimate reason to view the content and further said that she had an obligation to report people who wanted to view these sites to the authorities.

Other sites blocked by the library’s Netsweeper software include the official webpage of the Wiccan church, the Wikipedia entry pertaining to Wicca, Astrology.com and The Encyclopedia on Death and Dying, which contains viewpoint-neutral discussions of various cultures’ and religions’ ideas of death and death rituals.

Let’s start here: CIPA, as I mentioned, requires the blocking of obscene content and content deemed ‘harmful to minors.’ The latter is problematic in its vagueness, particularly when dealing with libraries, where many of the patrons using the Internet are adults. Yes, CIPA allows a user aged 18+ to request that a site be unblocked, but that patron should not have to go to unreasonable lengths to make that happen. And if the allegations in the ACLU’s case are accurate, then Wofford seems to have gone far beyond the scope of her job in claiming that she was required to report the patron’s request.

But more problematic to me is the categorization of the sites in question. The ACLU case alleges that the Salem Public Library had classified sites about Wicca and Astrology as “occult” or even “criminal.” It’s unlikely, however, that Salem itself had anything to do with defining the former category (or even, perhaps, the latter). Rather, Salem likely bought its filtering software (in this case, from Canadian company Netsweeper) out of the box. If so, all the library had to do was choose which categories to block.

As you well know, this is a sensitive subject for me ever since Websense erroneously categorized this very site as pornography. I know how these things work: They’re part automated, part categorized by minimum wage staff. It’s a boring job and mistakes are bound to happen. But a whole occult category? What is this, 1956? But I digress…

What I find problematic is how much control these private companies–and particularly, though not only, Netsweeper–have over what we view. As Helmi Noman pointed out last year, Netsweeper’s categorization of Tumblr.com as pornographic (apparently 50% of the pages hosted on Tumblr are pornographic in nature!) resulted in the blogging platform being blocked in four countries (and, most likely, in the Salem Public Library too).

More alarming is that these filtering tools can easily be gamed. My blog was categorized by Websense as pornography because of one post with an outrageous amount of comment spam that included outlinks to porn sites. If I wanted to get your site blocked by the Salem Public Library, all I’d need to do is drop a bunch of porn on it, easy-peasy.

Again, I digress. And lest my line of commentary be perceived as too narrow, I stumbled upon a great post this afternoon by Jason Pitzl-Waters laying out the other implications of this case. Some of the comments are fascinating as well.


© 2018 Jillian C. York
