Jillian C. York

Jillian C. York is a writer and activist.

Month: October 2014

scars (first draft)

Where walls once stood I walk,
I walk with purpose past
near-broken homes, edges sharp with resentment,
nothing but words holding them ever so tenuously together.
I move through invisible structures of
concrete to my home, a home once inhabited
by a resister
or a poet
or more likely a poor family, asleep in my closet, in my kitchen,
the space between my table and chair.
I walk the line, the line drawn on the pavement to remind us that,
here, a wall once stood, a wall that split human from human,
a wall that could have split me from you.
I walk the line, I walk the line from here to Palestine,
a line made of
barbed wire and
hearts broken
a wall made of ideas,
of lost hope,
of pain.
I walk the line between anger and pity,
pity for the chances of birth, that separate
soldier from boy
white from black
one from another
me from you.
I walk a line that is well-trodden by
those who wring their hands and ask
“why can’t we all just get along?”
I am one of them,
born on the wrong side of the wall of empathy,
born on the side where to feel is a privilege,
to love is in isolation,
to hurt is another man’s game.
But our structural integrity is weak, we are
crumbling, pushed down by the thronged masses tired of our tyranny of
standing there
saying nothing.
Where walls still stand, I walk.
Pace with anger, but
hope too.
Knowing that these walls can’t stand forever, that
someday they’ll be torn down.

Remarks at the Humboldt Institute for Internet & Society’s Meeting on Internet Governance

These are, roughly, my remarks presented on a panel discussion of content governance at the Humboldt Institute, on October 10, 2014.

Who are the private actors involved in Internet governance? Internet service providers, software companies, content providers. I’m going to focus primarily on the latter, because of the impact I believe that they have on our public spheres and on governance.

As Marianne Franklin said earlier, these corporations derive their legitimacy from their property rights. These are private companies with their own sets of rights. As many are based in the United States, they are also imbued with their own speech rights. This matters, for reasons I’ll explain in a moment.

These are also truly global platforms. Centralised, free, and easy to sign up for, these sites attract a broad swath of the world’s public, who use them to engage in political and social debate, organise protests, and of course, chat with each other.

Social media companies are private, but the platforms they have created have taken on the role of the public sphere as described by Habermas: “Society engaged in critical public debate” — and are characterised by a feeling of inclusivity and freedom of expression and association. And yet the online social spaces standing in for the public sphere are private ones, owned by billionaires and shareholders. Nevertheless, we treat them as public spaces.

The way that content is regulated can be algorithmic, or it can be community-based; in the latter case, users of these sites utilize reporting mechanisms to police the speech of other members on their network, or on the site as a whole.

Scholars Kate Crawford and Tarleton Gillespie have studied the use of community reporting, or “flagging,” on social media platforms. In a recent paper, they describe the mechanism as “understood [by many sites] to be a genuinely important and necessary part of maintaining user-friendly spaces and learning from their community.” Flagging, they argue, “act[s] as a mechanism to elicit and distribute user labor—users as a volunteer corps of regulators.”

It is this mechanism that, a few weeks ago, enabled a single user to “troll” dozens of drag queens and other LGBTQ-identified individuals, reporting them for using “fake” names, a violation of Facebook’s rules. And why does that rule exist? Because Mark Zuckerberg doesn’t like the idea of people having more than one identity.

What this means for users, netizens, is that the spaces in which they engage in public discussion on a daily basis are not subject to the law, but rather to the whims of private companies owned and run by mostly-white, mostly-male, mostly-American billionaires. Seriously, take a look at the diversity reports released by these companies in the past few months. Facebook is 69% male, 57% white. Google is 70% male, 61% white. And so on.

I emphasize race and gender here because I think it matters. The rules that we’re subject to on these sites often appear to reflect the morals and values of these stakeholders: Nudity is banned, violence is okay. Hate speech against certain groups is unacceptable, but imagery depicting violence against women gets a pass. Pseudonyms are okay for celebrities, but not for transgender persons.

This is where multistakeholderism doesn’t matter. These companies have been hearing from civil society, and sometimes from governments, about their policies for years, but with little progress. Yes, we’ve seen some changes—such as a change to Google+’s policy on pseudonyms—but they didn’t come from the IGF; in fact, I sat on a panel at the 2012 IGF with a Googler who blatantly lied to me on the record about a specific censorship incident. Rather, those changes came from public pressure, from activism, from civil society.

On the other hand, it’s worth noting that while the multistakeholder process has little to no effect on the behavior of companies, corporations have a disproportionate voice in the process. They spend loads of money on lavish events held in big tents, they pay for multistakeholder “corporate social responsibility” organizations that spend more time putting out reports that don’t get read than actually fixing problems, and they tempt civil society actors and organizations with large sums of money.

Of course, we can’t talk about all of this without talking about the role of governments as well. While the rules that corporations enact on our speech (and our privacy) are problematic, even more problematic are the ways in which corporations capitulate to governments. We’re all familiar with the transparency reports that show legal orders received by these companies, but what about the ways in which extralegal pressure is applied? Remember the time the State Department called YouTube and asked them to remove a video that was insulting to Muslims? YouTube didn’t fully comply, but they did remove the video in two countries, prompting numerous other countries to submit legal orders. Earlier this year, Twitter removed content at the behest of what turned out to be an invalid Pakistani legal order. Just this week, EU leaders met with major companies like Google and Twitter to pressure them to enact proactive measures to censor terrorist content. And as we speak, a US government/corporate partnership is astroturfing in the Middle East with a new set of watered-down, cultural relativist “rights and principles” that they will try to parade around the Arab IGF next month.

I should disclose that I tend toward free speech absolutism and have repeatedly argued for corporations to pull back their regulation of speech to align with the law in the jurisdictions in which they are based. But that isn’t the point. The point here is that, increasingly, the multistakeholder process is completely irrelevant to our speech rights on the Internet.

I say this in the hope of provoking a good discussion, but also because I truly believe it. I’ve watched for years as the IGF has held discussions on issues of free speech that produce no outcomes. I’ve been a part of the Global Network Initiative, and my organization walked out last year – because of a lack of transparency, a lack of achievements, and the fact that both the GNI and most of its civil society and academic members receive significant Google funding, arguably a strong conflict of interest. I’ve heard from civil society organizations that are afraid to speak out against Google for fear of losing one of their main sources of funding. I don’t believe Google money inherently invalidates your argument, but I also don’t think we can dismiss the silencing effect that it has.

So when we talk about content, what we are really talking about is the spaces in which we are allowed, able to speak. And when we talk about content in the context of Internet governance and multistakeholderism, we are talking about control: By governments, yes, but also, by companies. And at the moment, we are in the midst of an era where corporations hold an extraordinary amount of power over our privacy, our right to association, and our speech.

We need a solution to this problem, and I won’t profess to have it, in full. But I do have a few recommendations:

First, we need to take a realistic look at where content control is happening. Many NGOs, and particularly the Global Network Initiative, are focused entirely on the ways in which government regulates speech, whether through laws and censorship or through legal orders to companies.

We also need to take a realistic look at how companies are behaving in our spaces. As I said before, Google funding (for example) does not automatically invalidate your argument, but I don’t think enough of us are asking how much their and other companies’ presence in our civil society spaces is impacting our advocacy.

Finally, we must hold corporations accountable for their behavior in multistakeholder processes. Are they shutting down conversations? Are they lying on panels? We cannot be afraid to call that out where we see it. And we must never let our funding silence us.

Facebook responds to the “real name” problem

As you know if you’ve ever read this blog, Facebook has a serious problem with users abusing its reporting mechanisms, particularly when it comes to the “real name” policy. After a recent incident (well-documented by many) involving LGBT performers, the company has finally responded. But the response, of course, is weak. Here’s the statement from Chris Cox, Facebook’s Chief Product Officer, with my comments interspersed below:

I want to apologize to the affected community of drag queens, drag kings, transgender, and extensive community of our friends, neighbors, and members of the LGBT community for the hardship that we’ve put you through in dealing with your Facebook accounts over the past few weeks.
In the two weeks since the real-name policy issues surfaced, we’ve had the chance to hear from many of you in these communities and understand the policy more clearly as you experience it. We’ve also come to understand how painful this has been. We owe you a better service and a better experience using Facebook, and we’re going to fix the way this policy gets handled so everyone affected here can go back to using Facebook as you were.

The way this happened took us off guard. An individual on Facebook decided to report several hundred of these accounts as fake. These reports were among the several hundred thousand fake name reports we process every single week, 99 percent of which are bad actors doing bad things: impersonation, bullying, trolling, domestic violence, scams, hate speech, and more — so we didn’t notice the pattern. The process we follow has been to ask the flagged accounts to verify they are using real names by submitting some form of ID — gym membership, library card, or piece of mail. We’ve had this policy for over 10 years, and until recently it’s done a good job of creating a safe community without inadvertently harming groups like what happened here.

False. As I’ve been documenting for nearly five years, this is a serious problem that affects users around the world.

Our policy has never been to require everyone on Facebook to use their legal name. The spirit of our policy is that everyone on Facebook uses the authentic name they use in real life. For Sister Roma, that’s Sister Roma. For Lil Miss Hot Mess, that’s Lil Miss Hot Mess. Part of what’s been so difficult about this conversation is that we support both of these individuals, and so many others affected by this, completely and utterly in how they use Facebook.

Also false. Facebook’s actual policy states: “What names are allowed on Facebook? … The name you use should be your real name as it would be listed on your credit card, driver’s license or student ID”. Users who are reported are forced to submit (insecurely, no less) this information.

We believe this is the right policy for Facebook for two reasons. First, it’s part of what made Facebook special in the first place, by differentiating the service from the rest of the internet where pseudonymity, anonymity, or often random names were the social norm. Second, it’s the primary mechanism we have to protect millions of people every day, all around the world, from real harm. The stories of mass impersonation, trolling, domestic abuse, and higher rates of bullying and intolerance are oftentimes the result of people hiding behind fake names, and it’s both terrifying and sad. Our ability to successfully protect against them with this policy has borne out the reality that this policy, on balance, and when applied carefully, is a very powerful force for good.

What can I say? Facebook may believe this to be true, but a great deal of the abuse on the site occurs under people’s real names, or names that look real. Either way, accounts only get reported when their names look “fake,” and that is the failure of their system.

All that said, we see through this event that there’s lots of room for improvement in the reporting and enforcement mechanisms, tools for understanding who’s real and who’s not, and the customer service for anyone who’s affected. These have not worked flawlessly and we need to fix that. With this input, we’re already underway building better tools for authenticating the Sister Romas of the world while not opening up Facebook to bad actors. And we’re taking measures to provide much more deliberate customer service to those accounts that get flagged so that we can manage these in a less abrupt and more thoughtful way. To everyone affected by this, thank you for working through this with us and helping us to improve the safety and authenticity of the Facebook experience for everyone.

Well good. Finally. I guess it took Americans being affected for them to care.

© 2018 Jillian C. York
