On 11/15/18, Mark Zuckerberg released a statement outlining how Facebook will approach content management going forward. Specifically, the post explained how the company has historically found and removed objectionable content, and what it plans to do next to tighten enforcement of its Community Standards policy.
You can read all about my take on this announcement here, but in the meantime, it’s important to know that Facebook has identified 18 categories of content that are banned from the platform. If you’d like to read more about each of these, you can find details on Facebook’s Community Standards page.
- Threats or coordination of violence against people or groups
- Organizations devoted to terrorist activity, hate, mass or serial murder, human trafficking, or organized violence or crime, along with their prominent members
- Promoting, advocating, or publicizing crime in order to glorify it
- Purchasing, selling, or trading non-medical drugs, including pharmaceuticals and marijuana
- Adult content, including nudity and sexually explicit material
- Bullying or harassment intended to inflict serious harm on its targets
- Hate speech
- Graphic and violent content that glorifies child abuse, death and dismemberment, or bodily injury
- Spam, false advertising, and fraud
- Misrepresenting yourself, which essentially amounts to identity theft
- Fake news
- Intellectual property violations
- Human trafficking
- Sexual exploitation of children
- Selling firearms, their component parts, or ammunition between users
- Content that identifies or targets victims or survivors of self-injury, including suicide attempts, self-mutilation, and eating disorders (Facebook’s resources for this group: https://www.facebook.com/safety/wellbeing/suicideprevention)
- Violating the privacy of vulnerable individuals
- Material that incites violence