A new report uncovered a “vast pedophile network” on the popular image-posting platform Instagram, a stark reminder of social media’s risks to children.
Instagram, owned by Facebook parent Meta, is best known for photo sharing, but it was in the messaging beneath those photos that The Wall Street Journal discovered pedophiles trading in "sick" coded language.
According to the explosive report, those implicated have been busted using innocuous emojis as code to communicate within the pedophile network.
The emojis included the following:
"MAP" – an acronym for "minor-attracted person."
A cheese pizza emoji ("CP" are also the initials of "child pornography").
A reverse arrow next to an age (so "Age 31" in a bio is really meant to be read backward, as 13).
But this is just the surface; as the report outlines, the network runs deep.
In conjunction with researchers at Stanford University and the University of Massachusetts, Amherst, The Journal found that Instagram, through what appears to be malicious ignorance or perhaps something more sinister, allowed this community of pedophiles to thrive.
The investigation also discovered that all it takes is one visit to one of these pedophile profiles for the app’s algorithm to flood your feed with related suggestions.
In short, Instagram’s automation has allowed pedophiles to game its system.
The report also noted that pedophiles have long used the internet’s dark corners to satiate themselves; it was always a conscious choice and something the pedophile would have to seek out.
Now it’s Instagram, and its algorithms have eliminated the conscious part of it by automating and streamlining it all.
The Journal also added another sickening layer to the discovery:
“The researchers found that Instagram enabled people to search explicit hashtags such as #pedowhore and #preteensex and connected them to accounts that used the terms to advertise child-sex material for sale. Such accounts often claim to be run by the children themselves and use overtly sexual handles incorporating words such as ‘little slut for you.'”
The not-so-subtle hashtags can be easily found on Instagram.
Alex Stamos, head of the Stanford Internet Observatory and Meta’s chief security officer until 2018, said it was easy for people with “limited access” to dive deep into this network of pedophiles.
“That a team of three academics with limited access could find such a huge network should set off alarms at Meta,” Stamos told the Journal. “I hope the company reinvests in human investigators.”
Director of the UMass Rescue Lab, which researches and combats online child victimization, Brian Levine, warned that even if Instagram began restricting reach and access, the platform is almost a “gateway drug” to much worse corners of the internet.
“Instagram is an on-ramp to places on the internet where there’s more explicit child sexual abuse,” Levine said.
That makes Instagram’s dereliction of duty in removing the promotion of pedophilia a sore spot for him. When told that Meta and Instagram were working on safeguards against pedophilia, Levine responded:
“Pull the emergency brake,” he said. “Are the economic benefits worth the harm to these children?”
Meta said it was looking into the situation.
According to a company representative, an internal review of how Meta handled reports of child sex abuse found a number of issues, including a “software glitch” that prevented such reports from reaching the right people; the representative added that company staff weren’t correctly enforcing the rules even when a report did come through.