Sometimes an emoji is just an emoji. Sometimes it may be a threat.
And with only a few seconds to spare, Facebook moderators have to make the call — even if the text that accompanies the laughing yellow face is in an unfamiliar language.
To help with those decisions, Facebook has created a list of guidelines for what its two billion users should be allowed to say. The rules, which are regularly updated, are then given to its moderators.
For Facebook, the goal is clarity. But for the thousands of moderators across the world, faced with navigating this byzantine maze of rules as they monitor billions of posts per day in over 100 languages, clarity is hard to come by.
Facebook keeps its rulebooks, and even their existence, largely secret. But The New York Times acquired 1,400 pages from these guidelines and found problems not just in how the rules are drafted but in how the moderation itself is done.
Here are five takeaways from our story:
Facebook is experimenting on the fly.
The rules are discussed over breakfast every other Tuesday in a conference room in Menlo Park, Calif. — far from the social unrest that Facebook has been accused of accelerating.
Though the company does consult outside groups, the rules are set largely by young lawyers and engineers, most of whom have no experience in the regions of the world they are making decisions about.
The rules they create appear to be written for English speakers who at times rely on Google Translate. That suggests a lack of moderators with local language skills who might better understand local contexts.
Facebook employees say they have not yet figured out, definitively, what sorts of posts can lead to violence or political turmoil. The rulebooks are best guesses.
The rules contain biases, gaps and errors.
Some of the rules given to moderators are inaccurate, outdated or missing critical nuance.
One presentation, for example, refers to the Bosnian war criminal Ratko Mladic as a fugitive, though he was arrested in 2011.
Another appears to contain errors about Indian law, advising moderators that almost any criticism of religion should be flagged as probably illegal. In fact, criticizing religion is only illegal when it is intended to inflame violence, according to a legal scholar.
In Myanmar, a paperwork error allowed a prominent extremist group, accused of fomenting genocide, to stay on the platform for months.
The moderators feel overwhelmed.
Facebook outsources moderation to companies that hire the thousands of workers who enforce the rules. In some of these offices, moderators say they are expected to review many posts within eight to 10 seconds. The work can be so demanding that many moderators only last a few months.
The moderators say they have little incentive to contact Facebook when they run across flaws in the process. For its part, Facebook largely allows the companies that hire the moderators to police themselves.
Facebook is edging into countries’ politics.
Facebook is growing more assertive about barring groups and people, as well as types of speech, that it believes could lead to violence.
In countries where the line between extremism and mainstream politics is blurry, the social network’s power to ban some groups and not others means that it is, in essence, helping pick political winners and losers.
Sometimes it removes political parties, like Golden Dawn in Greece, as well as mainstream religious movements in Asia and the Middle East. This can be akin to Facebook shutting down one side in national debates, one expert argues.
Some interventions are more subtle. During elections in Pakistan, it told moderators to apply extra scrutiny to one party, but called another “benign.”
And its decisions often skew in favor of governments, which can fine or regulate Facebook.
Facebook is taking a bottom-line approach.
Even as Facebook tries to limit dangerous content on its platform, it is working to grow its audience in more countries.
That tension can sometimes be seen in the guidelines.
Moderators reviewing posts about Pakistan, for example, are warned against creating a “PR fire” by doing anything that might “have a negative impact on Facebook’s reputation or even put the company at legal risk.”
And by relying on outsourced workers to do most of the moderation, Facebook can keep costs down even as it sets rules for over two billion users.