At an event a few weeks ago I had the chance to publicly ask Adam Conner of Facebook why, if their service is offered in 70+ languages, their terms of service are only available in 7 and, with that in mind, how they feel they can accurately apply their TOS to people who use the site in a foreign language.  Mostly avoiding my question, Conner responded by saying Facebook is working to improve their mechanisms but that, and I quote, “sometimes dolphins get caught in the tuna net.”

An awkward analogy, given that most tuna has been dolphin-safe for half of my lifetime (and, as my friend Rebekah Heacock pointed out, making it so was one of the first Internet-related campaigns).  Conner didn’t bother to address the rest of the issue, which is that a user in Iran, who does not speak English, cannot consent to the terms of service but can use the site in Persian.  Thus, if he’s deactivated for breaking the TOS, not only does he have no idea why, but in all likelihood neither do Facebook’s non-Persian-speaking staff, since Facebook’s system is based on user reporting with arguably little human oversight.

The issue of human oversight came to the forefront of the discussion today, as it was reported that a note on Sarah Palin’s Facebook wall was accidentally deleted.  According to Facebook spokesman Andrew Noyes, the note “did not violate [Facebook's] content standards but was removed by an automated system.”

Automated system again, Facebook?  But lo, I thought you evaluated all content?  Guess not.

Interestingly, Palin’s post was targeted for removal by a group claiming that it contained hate speech (ed. note: techPresident has more details on how that happened).  As most of my readers certainly know by now, this happens all the time: a group of people targets a Facebook group or person for removal by encouraging many others to report the content in question.  My hypothesis, which this story seems to confirm, is that once enough reports have been made, something within Facebook’s systems is triggered, causing the content to be removed.

This is not completely dissimilar from other systems; when enough reports are made on a piece of YouTube content, the content is sent into a queue (but not removed), to be viewed by Google’s human content reviewers.

What apparently distinguishes Facebook is that, rather than going into a review queue, content is simply removed.  Attempts to appeal the removal typically result in a refusal to reinstate the content, unless of course you’re Sarah Palin, or your story gets picked up by CNN.
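To make the contrast concrete, here is a minimal sketch of the two pipelines as I’ve described them.  Everything in it is hypothetical — the threshold, the function names, the policies — since neither company publishes its internals; it only illustrates the difference between queueing flagged content for human review and removing it automatically.

```python
# Hypothetical illustration of two report-handling pipelines.
# The threshold and policy names are assumptions, not real
# Facebook or YouTube internals.

REPORT_THRESHOLD = 10  # hypothetical number of reports that triggers action

def handle_reports(report_count: int, policy: str) -> str:
    """Decide what happens to content as user reports accumulate.

    policy="queue"  -> flagged content goes to human reviewers
                       (the YouTube-style approach)
    policy="remove" -> flagged content is taken down automatically
                       (the behavior I hypothesize for Facebook)
    """
    if report_count < REPORT_THRESHOLD:
        return "visible"
    if policy == "queue":
        return "queued for human review"
    return "removed"

# A coordinated reporting campaign pushes the count past the threshold:
print(handle_reports(25, policy="queue"))   # queued for human review
print(handle_reports(25, policy="remove"))  # removed
```

The point of the sketch is that the same burst of reports produces very different outcomes: under the first policy a human still looks at the content before anything happens to it; under the second, the mob’s reports are the whole decision.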

My advice to activists at this point: avoid Facebook at all costs.  They don’t care about you.