Facebook’s secret rules and guidelines for deciding what its 2 billion users can post on the site are revealed for the first time in a Guardian investigation that will fuel the global debate about the role and ethics of the social media giant.
The Guardian has seen more than 100 internal training manuals, spreadsheets and flowcharts that give unprecedented insight into the blueprints Facebook has used to moderate issues such as violence, hate speech, terrorism, pornography, racism and self-harm.
The documents were supplied to Facebook moderators within the last year. The files tell them:
- Remarks such as “Someone shoot Trump” should be deleted, because as a head of state he is in a protected category. But it can be permissible to say: “To snap a bitch’s neck, make sure to apply all your pressure to the middle of her throat”, or “fuck off and die” because they are not regarded as credible threats.
- Videos of violent deaths, while marked as disturbing, do not always have to be deleted because they can help create awareness of issues such as mental illness.
- Some photos of non-sexual physical abuse and bullying of children do not have to be deleted or “actioned” unless there is a sadistic or celebratory element.
- Photos of animal abuse can be shared, with only extremely upsetting imagery to be marked as “disturbing”.
- All “handmade” art showing nudity and sexual activity is allowed, but digitally made art showing sexual activity is not.
- Videos of abortions are allowed, as long as there is no nudity.
- Facebook will allow people to livestream attempts to self-harm because it “doesn’t want to censor or punish people in distress”.
- Anyone with more than 100,000 followers on a social media platform is designated as a public figure – which denies them the full protections given to private individuals.
Other remarks that the documents permit include: “Little girl needs to keep to herself before daddy breaks her face,” and “I hope someone kills you.” Such threats are regarded as either generic or not credible.
Facebook’s leaked policies on subjects including violent death, images of non-sexual physical child abuse and animal cruelty show how the site tries to navigate a minefield.
The files say: “Videos of violent deaths are disturbing but can help create awareness. For videos, we think minors need protection and adults need a choice. We mark as ‘disturbing’ videos of the violent deaths of humans.”
Such footage should be “hidden from minors” but not automatically deleted because it can “be valuable in creating awareness for self-harm afflictions and mental illness or war crimes and other important issues”.
The files show Facebook has issued new guidelines on nudity after last year’s outcry when it removed an iconic Vietnam war photo because the girl in the picture was naked.
It now allows for “newsworthy exceptions” under its “terror of war” guidelines but draws the line at images of “child nudity in the context of the Holocaust”.
Facebook told the Guardian it was using software to intercept some graphic content before it got on the site, but that “we want people to be able to discuss global and current events … so the context in which a violent image is shared sometimes matters”.
Originally published on The Guardian