Social media companies have long been chastised for their handling of online harassment.
On Tuesday, Facebook revealed for the first time how prevalent bullying and harassment are on its platform, reporting that such content was viewed between 14 and 15 times per 10,000 views of content on the site in the third quarter.
In its quarterly content moderation report, the company, which recently changed its name to Meta, also said that bullying and harassment content was viewed between 5 and 6 times per 10,000 views of content on Instagram.
The social media giant was thrust back into the spotlight after former employee and whistleblower Frances Haugen disclosed internal documents, including research and discussions about Instagram’s effects on teen mental health and whether Facebook’s platforms fuel division.
The documents, first reported by the Wall Street Journal, have prompted calls for greater transparency from Facebook, as well as questions about whether metrics such as prevalence give a complete picture of how the company addresses abuse.
Facebook said that its bullying and harassment statistics included only cases in which the company did not need additional information, such as a user report, to determine whether the content violated its policies.
The company said that 59.4 percent of the 9.2 million pieces of content removed from Facebook for violating its bullying and harassment policies were detected proactively.
In a blog post, Antigone Davis, the company’s global head of safety, and Amit Bhattacharyya, a director of product management, wrote: “Bullying and harassment is a unique difficulty and one of the most complicated issues to handle since context is crucial.”