Facebook’s removal of posts, banning of accounts, and similar enforcement actions have long been a source of frustration for many marketers. While the company had public rules outlining what people could and could not post, those rules were fairly vague, and it was nearly impossible to predict whether a post would be flagged and removed before publishing it. The published ‘community standards’ guides were largely unhelpful, especially since the full version was reserved for internal use.
Likely in response to recent pushback on its policies, Facebook is releasing to the public the full set of rules its moderation team uses internally. Alongside this release, Facebook will allow people to appeal decisions to remove individual pieces of content. Previously, appeals were only allowed when an entire account, group, or page was removed.
The document goes into considerable detail about how the company’s team of more than 7,500 moderators decides what content will be removed. For example, videos discussing medical drugs are permitted, but a post in which someone admits to taking non-medical drugs will be taken down.
A video showing the aftermath of a violent attack will likely be removed if it is sufficiently ‘disturbing’, but will likely be allowed if it is presented as information from a medical setting (for example, a doctor treating the injury). When such a video is allowed, it may be placed behind a warning screen that users must click through before viewing.
While this likely won’t eliminate every issue related to marketing on Facebook, it is certainly a good start. Letting people know precisely how moderators evaluate posts will help ensure everyone is on a level playing field. The option to appeal these decisions, in addition, will be a most welcome change.