Facebook Shuts 583 Million Fake Accounts, Showing Scale of Spam, Hate Speech and Violence in First Three Months of 2018

Facebook axed 583 million fake accounts in the first three months of 2018, the social media giant said Tuesday, detailing how it enforces “community standards” against sexual or violent images, terrorist propaganda or hate speech.

In its first quarterly Community Standards Enforcement Report, Facebook said the overwhelming majority of moderation action was against spam posts and fake accounts: it took action on 837 million pieces of spam and shut down a further 583 million fake accounts on the site in the three months. But Facebook also moderated 2.5 million pieces of hate speech, 1.9 million pieces of terrorist propaganda, 3.4 million pieces of graphic violence and 21 million pieces of content featuring adult nudity and sexual activity.

“This is the start of the journey and not the end of the journey and we’re trying to be as open as we can,” said Richard Allan, Facebook’s vice-president of public policy for Europe, the Middle East and Africa.

The amount of content moderated by Facebook is influenced by both the company’s ability to find and act on infringing material, and the sheer quantity of items posted by users. For instance, Alex Schultz, the company’s vice-president of data analytics, said the amount of content moderated for graphic violence almost tripled quarter-on-quarter.

One hypothesis for the increase, Schultz said, is that “in [the most recent quarter], some bad stuff happened in Syria. Often when there’s real bad stuff in the world, lots of that stuff makes it on to Facebook.” He emphasised that much of the moderation in those cases was “simply marking something as disturbing”.

Several categories of violating content outlined in Facebook’s moderation guidelines – including child sexual exploitation imagery, revenge porn, credible violence, suicidal posts, bullying, harassment, privacy breaches and copyright infringement – are not included in the report.

On child exploitation imagery, Schultz said that the company still needed to make decisions about how to categorise different grades of content, for example cartoon child exploitation images.

“We’re much more focused in this space on protecting the kids than figuring out exactly what categorisation we’re going to release in the external report,” he said.

Facebook also increased the amount of content taken down by using new AI-based tools to find and moderate content without needing individual users to flag it as suspicious. Those tools worked particularly well for content such as fake accounts and spam: the company said it used them to find 98.5% of the fake accounts it shut down, and “nearly 100%” of the spam.

Sarcasm Needs Human Touch

“It may take a human to understand and accurately interpret nuances like… self-referential comments or sarcasm,” the report said, noting that Facebook aims to “protect and respect both expression and personal safety”.

Facebook took action against 2.5 million pieces of hate speech content during the period, a 56 percent increase over October-December. But only 38 percent of it had been detected through Facebook’s own efforts; the rest was flagged by users.

The posts that keep Facebook’s reviewers busiest are those showing adult nudity or sexual activity, quite apart from child sexual exploitation imagery, which is not covered by the report.

Some 21 million such posts were handled in the period, a similar number to October-December 2017.

That was less than 0.1 percent of viewed content — which includes text, images, videos, links, live videos or comments on posts — Facebook said, adding it had dealt with nearly 96 percent of the cases before being alerted to them.

Facebook has come under fire for showing too much zeal on this front, such as removing images of artwork tolerated under its own rules.

In March, Facebook apologised for temporarily removing an advert featuring French artist Eugène Delacroix’s famous work “Liberty Leading the People” because it depicts a bare-breasted woman.

Facebook’s head of global policy management, Monika Bickert, said the group had kept its commitment to recruit 3,000 more staff, lifting the number dedicated to enforcing standards to 7,500 at the start of this year.
