Facebook has released its Community Standards Enforcement Report (CSER) for October through December 2020. The CSER tracks the company's progress and commitment to making Facebook and Instagram safe and inclusive.
The quarterly report shares metrics on how well Facebook is preventing and taking action on content that violates its community standards, while protecting the community's safety, privacy, dignity and authenticity.
The latest report shows positive strides in reducing the prevalence of violating content, and provides greater transparency and accountability around content moderation operations across Facebook's products. It includes metrics across 12 policies on Facebook and 10 policies on Instagram.
During the fourth quarter of 2020, Facebook took action on: 6.3 million pieces of bullying and harassment content, up from 3.5 million in Q3, due to updates in its technology for detecting comments; 6.4 million pieces of organized hate content, up from four million in Q3; 26.9 million pieces of hate speech content, up from 22.1 million in Q3, due to updates in its technology for Arabic, Spanish and Portuguese; and 2.5 million pieces of suicide and self-injury content, up from 1.3 million in Q3, due to increased reviewer capacity.
During the same period on Instagram, five million pieces of bullying and harassment content were detected, up from 2.6 million in Q3, along with 308,000 pieces of organized hate content, up from 224,000 in Q3.
On Instagram, action was also taken on 6.6 million pieces of hate speech content, up from 6.5 million in Q3, while 3.4 million pieces of suicide and self-injury content were detected, up from 1.3 million in Q3, due to increased reviewer capacity.