What you need to know about Facebook’s latest Community Standards Enforcement Report
Facebook has released its Community Standards Enforcement Report for October through December 2020.
The Community Standards Enforcement Report tracks our progress and commitment to making Facebook and Instagram safe and inclusive. This quarterly report shares metrics on how we are doing at preventing and taking action on content that goes against our Community Standards, while protecting our community’s safety, privacy, dignity and authenticity.
The latest report shows progress in reducing the prevalence of violating content and provides greater transparency and accountability around content moderation across Facebook’s products. It includes metrics across 12 policies on Facebook and 10 policies on Instagram.
During the fourth quarter of 2020, on Facebook we took action on:
6.3 million pieces of bullying and harassment content, up from 3.5 million in Q3 due in part to updates in our technology to detect comments
6.4 million pieces of organized hate content, up from 4 million in Q3
26.9 million pieces of hate speech content, up from 22.1 million in Q3 due in part to updates in our technology in Arabic, Spanish and Portuguese
2.5 million pieces of suicide and self-injury content, up from 1.3 million in Q3, due to increased reviewer capacity
During the fourth quarter of 2020, on Instagram we took action on:
5 million pieces of bullying and harassment content, up from 2.6 million in Q3 due in part to updates in our technology to detect comments
308,000 pieces of organized hate content, up from 224,000 in Q3
6.6 million pieces of hate speech content, up from 6.5 million in Q3
3.4 million pieces of suicide and self-injury content, up from 1.3 million in Q3 due to increased reviewer capacity
“Our goal is to get better and more efficient at enforcing our Community Standards. We do this by increasing our use of Artificial Intelligence (AI), by prioritizing the content that could cause the most immediate, widespread, and real-world harm, and by coordinating and collaborating with outside experts,” said Kojo Boakye, Director of Public Policy, Africa.
Facebook plans to share additional metrics on Instagram and add new policy categories on Facebook. Efforts are also being made to have the metrics in these reports externally audited and to make the data more interactive so people can understand it better. We will continue to improve our technology and enforcement efforts to keep harmful content off our apps.
For more information, please see here.