Call this my quarterly (or thereabouts) post on the issue.
Since I began tracking instances of Facebook deactivating user accounts and deleting content from existing profiles and groups, I’ve found that the practice has not waned at all; if anything, it has increased.
I first wrote about this in April, when I reported on Moroccan atheist activist Najat Kessler’s Facebook account being deleted for “using a fake name” (guess what? Najat Kessler is her real name). In the three months since, I’ve received countless e-mails from users, some of whose stories I’ve further publicized or talked about at events such as the Global Voices Summit and the Al Jazeera Global Media Forum. Rebecca MacKinnon has also been blogging about these issues, but aside from the two of us, it’s mostly been crickets…until now.
Late last week, things hit a fever pitch when Facebook deactivated a group calling for the boycott of oil giant BP. The media caught wind of the deactivation, likely because a) BP is a hot topic right now and b) the group had 750,000 members. In any case, CNN reported it, with a statement from Facebook that said, curiously, that the group was likely “disabled by our automated systems.”
Interestingly enough, in a comment on Rebecca’s blog, Facebook rep Barry Schnitt previously claimed that Facebook uses automation only to disable accounts of spammers. Then again, Barry Schnitt also claimed that Facebook offers an official appeals process (hint: no, they don’t).
The BP story isn’t the only one gaining traction in the blogosphere. In a recent blog post, Greg Butterfield outlines the various ways in which Facebook goes after progressive groups. Though I strongly disagree with that particular premise (there’s plenty of evidence that Facebook “goes after,” well, anybody), Butterfield’s post enumerates other recent examples of groups deactivated from the platform, such as:
- A group advocating for the release of FARC member Ricardo Palmera
- A PFLP solidarity group based in New Zealand
- The Boycott BP group
I can add various others:
- A performance artist’s group, because its name happened to contain the word “Hamas”
- A Hong Kong-based group set up in remembrance of the Tiananmen Square Massacre
- A group calling for the separation of mosque and state in the Arab world
- The personal accounts of activists Sabina England and Najat Kessler, for allegedly not using their real names (even though they were)
Meanwhile, Facebook allows groups like “Draw Mohammad Day” to stay up for weeks at a time. Mind you, I support all free speech and thus did not advocate for the removal of that group; however, a number of the comments on the page quite clearly violated Facebook’s TOS, even to my non-legal eyes.
There are numerous problems with Facebook’s processes, some of which I’ve noted before. I’ve thought through recommendations several times, and each time I’ve come up with something new.
First and foremost, Facebook’s appeals process needs some serious work. I understand that they’re a huge company that hosts thousands of groups on their platform, but that’s no excuse for not developing a streamlined appeals process that can accommodate their broad user base (read: multilingual). Users who attempt to appeal are currently met with an automated message telling them that they cannot appeal. Even if, in the end, some of these users are able to get their accounts back, the e-mail is misleading at best.
And how about a warning system? I’m all for clear rules, and for deleting users who can’t adhere to them, but in many cases users don’t realize they’re violating the TOS (the case of Rafik Dammak, which Rebecca wrote about, is a great example).
Facebook would also be wise to set up a human rights department, as Google and Yahoo before it have done. They could then build clear standards around the consideration of context when policing content on the site.
Of course, no site will get it 100% right, but some (read: YouTube) have come damn close. Based on various conversations and bits of intel, it doesn’t seem like Facebook is doing all that much to solve these problems (and if they are, they’re doing it very opaquely).