Facebook released its latest report Wednesday on the removal of harmful content from its services. Titled “Community Standards Enforcement Report,” it said the company removed more than 3.2 billion fake accounts between April and September. That is nearly double the more than 1.5 billion accounts the social giant removed during the same period in 2018.
Facebook also said it removed 11.4 million pieces of hate speech, compared with 5.4 million in the same six-month period in 2018. According to Facebook, the new enforcement report also covers Instagram. “We want Facebook and Instagram to be places where people can express themselves and have a voice. To create the conditions where people feel comfortable expressing themselves, we must also preserve our community’s sense of safety, privacy, dignity and authenticity. That’s why we have Community Standards, which define what is and isn’t allowed on Facebook and Instagram,” the company said.
The company said it made progress in detecting child nudity and sexual exploitation on Instagram. According to the report, the prevalence of content violating its adult nudity policies dropped in Q2 and Q3 2019, due to improvements to its proactive detection technology and adjustments to its methodology for measuring prevalence. The social giant removed more than 1.2 million pieces of such content between April and September.