Facebook has released a report on the enforcement of its community standards.

Facebook has released a new Community Standards Enforcement Report, covering Q4 2018 and Q1 2019, which shows how successful the company has been at detecting content that violates its community rules.


The report contains metrics showing which categories account for the most violations of Facebook's standards, how many removals are appealed and how many of those appeals are rejected, and it now also includes data on the regulated-goods standard.


Facebook has updated the list of violation categories, which currently includes nine metrics:
Adult nudity and sexual activity; Bullying and harassment; Child nudity and sexual exploitation of children; Fake accounts; Hate speech; Regulated goods: drugs and firearms; Spam; Terrorist propaganda; Violence and graphic content.


The most attention-grabbing figures in the report measure how successful Facebook is at detecting content that violates community policies: how many violations per category were detected in each period; what proportion of violating content Facebook found proactively, before users reported it; how many times users appealed removed content; and how much content Facebook restored after an appeal.


In some categories, Facebook has an impressive proactive detection rate of around 90–99.9%, including nudity and sexual content, spam, and terrorist propaganda. With bullying and harassment, however, Facebook identified only 14% of the 2.6 million cases. For hate speech, 65.4% of the 4 million cases were identified proactively, as were 69.9% of the 670,000 cases involving regulated goods.


Another widespread breach of the standards, where Facebook successfully detects 99.8% of the 2.2 billion cases, is the creation of fake accounts. Compared with the previous quarter, fake-account removals rose by about 1 billion.
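As a rough sanity check on the figures above, a short script can turn each category's proactive-detection rate and case total into an approximate absolute count. This is a back-of-the-envelope sketch using only the percentages and totals quoted in this article, not data from the report itself:

```python
# Approximate number of violations Facebook found proactively per category:
# proactive-detection rate × total actioned cases.
# Totals and rates are the ones quoted in the text above.
categories = {
    "Bullying and harassment": (2_600_000, 0.14),
    "Hate speech": (4_000_000, 0.654),
    "Regulated goods": (670_000, 0.699),
    "Fake accounts": (2_200_000_000, 0.998),
}

for name, (total, rate) in categories.items():
    detected = round(total * rate)
    print(f"{name}: ~{detected:,} of {total:,} detected proactively ({rate:.1%})")
```

For example, this puts proactively detected bullying-and-harassment cases at roughly 364,000 out of 2.6 million, which makes the gap to the near-perfect fake-account figure easy to see at a glance.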


Facebook also deals with spam, removing over 1.76 billion pieces of content.