Facebook reported more than 20 million child sexual abuse images in 2020, more than any other company

A file image showing a Facebook app on a phone screen.

Facebook reported more than 20 million child sexual abuse images on its platform in 2020, according to a new report by the National Center for Missing and Exploited Children (NCMEC).

According to the report released Wednesday, Facebook recorded 20,307,216 instances of child sexual exploitation across its platforms in 2020. The figures cover Instagram as well as the main Facebook site.

Insider first reported the figures in January, when Facebook confirmed the number. The full report has figures for other companies, and shows that Facebook made more than 35 times as many reports as the next company on the list, Google.

Facebook's platforms account for the vast majority of all child sexual abuse content flagged to the NCMEC. The 2020 figure represents a 31% increase on the roughly 16 million reports the platform made in 2019.

Facebook highlighted its proactive policies and use of technology to detect and remove child exploitation material in response to the increase. 

"Using industry-leading technology, over 99% of child exploitation content we remove from Facebook and Instagram is found and taken down before it's reported to us," said a spokesperson to Insider in January. 

Many other sites remove material only after it is found or flagged to them, and do not have proactive policies to search for it.

Following Facebook, the platforms with the most reports were:

  • Google with 546,704.
  • Snapchat with 144,095.
  • Microsoft with 96,776.
  • Twitter with 65,062.
  • TikTok with 22,692. 
  • Omegle (a video and text chat platform) with 20,265.

MindGeek, the company that owns porn websites including Pornhub, logged 13,229 reports. Last year a series of credit card companies severed ties with Pornhub after The New York Times revealed that the site was hosting child sexual exploitation videos.

Facebook said in a blog post ahead of the release of the NCMEC report that it was building new tools to track down child sexual abuse material, and that most of the material it identified was old material being shared or re-sent. 

"We found that more than 90 percent of this content was the same as or visually similar to previously reported content," said the post.

"And copies of just six videos were responsible for more than half of the child exploitative content we reported in that time period. While this data indicates that the number of pieces of content does not equal the number of victims, and that the same content, potentially slightly altered, is being shared repeatedly, one victim of this horrible crime is one too many." 

The NCMEC told Insider in January that COVID-19 lockdowns were likely among the factors behind the overall increase in the amount of material reported to them in 2020. 

Vulnerable children were less able to get help, and there was a new trend of abuse being livestreamed on demand, said the NCMEC at the time. 

The 160 companies signed up to the NCMEC's child sexual abuse reporting mechanism share the information voluntarily; law enforcement then uses it to investigate offenders.

There are no laws in the US compelling platforms to proactively search out child sexual abuse material. 

Read the original article on Business Insider