Researchers: Instagram algorithms enable, promote pedophile commerce

by WorldTribune Staff, June 7, 2023

Instagram not only connects pedophiles to each other, but its algorithms guide them to sellers of child pornography, according to a June 7 report by the Wall Street Journal.

The popular social-media site owned by Mark Zuckerberg’s Meta “helps connect and promote a vast network of accounts openly devoted to the commission and purchase of underage-sex content,” according to investigations by the Journal and researchers at Stanford University and the University of Massachusetts Amherst.

“Pedophiles have long used the Internet, but unlike the forums and file-transfer services that cater to people who have an interest in illicit content, Instagram doesn’t merely host these activities. Its algorithms promote them,” the report said.

Meta, which has more than 3 billion users across its apps, accounted for 85% of the child pornography reports filed to the National Center for Missing & Exploited Children (NCMEC), including some 5 million from Instagram.

“Meta’s automated screening for existing child exploitation content can’t detect new images or efforts to advertise their sale,” the Journal’s report said.

Law-enforcement agencies do not have the resources to investigate more than a small fraction of the tips NCMEC receives, investigators said. That means the social-media platforms themselves bear primary responsibility for preventing a community from forming and normalizing child sexual abuse.

Stanford researchers found that Meta has struggled with these efforts more than other platforms both because of weak enforcement and design features that promote content discovery of legal as well as illicit material.

The Stanford team found 128 accounts offering to sell child-sex-abuse material on Twitter, fewer than a third of the number they found on Instagram, which has a far larger overall user base than Twitter. Twitter also didn’t recommend such accounts to the same degree as Instagram, and it took them down far more quickly, the team found.

The sexualized accounts on Instagram “are brazen about their interest,” the report said.

The researchers found that Instagram enabled people to search explicit hashtags such as #pedowhore and #preteensex and connected them to accounts that used the terms to advertise child-sex material for sale. Such accounts often claim to be run by the children themselves and use overtly sexual handles incorporating words such as “little slut for you.”

“Instagram accounts offering to sell illicit sex material generally don’t publish it openly, instead posting ‘menus’ of content. Certain accounts invite buyers to commission specific acts. Some menus include prices for videos of children harming themselves and ‘imagery of the minor performing sexual acts with animals,’ researchers at the Stanford Internet Observatory found. At the right price, children are available for in-person ‘meet ups,’ ” the Journal’s report said.

Alex Stamos, the head of the Stanford Internet Observatory and Meta’s chief security officer until 2018, said that getting even obvious abuse under control would likely take a sustained effort.

“That a team of three academics with limited access could find such a huge network should set off alarms at Meta,” he said, noting that the company has far more effective tools to map its pedophile network than outsiders do. “I hope the company reinvests in human investigators,” he added.

The Journal also approached UMass’s Rescue Lab, which evaluated how pedophiles on Instagram fit into the larger ecosystem of online child exploitation. Using different methods, both research teams were able to quickly identify large-scale communities promoting criminal sex abuse.

The Stanford Internet Observatory used hashtags associated with underage sex to find 405 sellers of what researchers labeled “self-generated” child-sex material — accounts purportedly run by children themselves, some saying they were as young as 12. According to data gathered via Maltego, network-mapping software, 112 of those seller accounts collectively had 22,000 unique followers.

“Underage-sex-content creators and buyers are just a corner of a larger ecosystem devoted to sexualized child content. Other accounts in the pedophile community on Instagram aggregate pro-pedophilia memes, or discuss their access to children. Current and former Meta employees who have worked on Instagram child-safety initiatives estimate the number of accounts that exist primarily to follow such content is in the high hundreds of thousands, if not millions,” the report said.

