
Child sexual abuse sites go commercial on open web, using “refer-a-friend” schemes to spread CSAM

23 April 2026

Child sexual abuse material (CSAM) is shifting from hidden corners of the internet to fully commercial operations on the open web, with predators now using “refer-a-friend” marketing tactics to spread illegal content, according to a new report from Sky News citing the Internet Watch Foundation (IWF).

A new business model in plain sight
For years, images and videos of child abuse were buried in the dark web, intentionally hard to access. That’s changing. The IWF — the UK organisation tasked with finding and removing CSAM — says the number of commercial child abuse sites has doubled in the past year alone.

Some sites hide behind innocent-looking fronts. Others “are just sitting in the open, just a few clicks away from your social media feeds”. Criminals are no longer selling one or two clips. They’re pushing users to download and pay for terabytes of “category A” material — the most severe classification used by police.

“Refer-a-friend” goes viral — for abuse
Like any business, these sites need marketing. Their strategy: word of mouth.

Analysts at the IWF say predators are now using “refer-a-friend” schemes: “if you view the content and you want more, you can spread that link around your social media accounts, and then the more clicks that content gets”.

“That’s new. We never used to see that at all,” said Mabel, an anonymous IWF analyst and grandmother who is legally authorised to hunt and remove CSAM.

Nearly every refer-a-friend scheme was first reported to the IWF by members of the public, not trained analysts. That suggests ordinary internet users are stumbling onto extreme abuse material more often than before.

“I worry that my grandchildren will be presented with these sites in their feeds on their social media, not realise what they are and click on them,” Mabel said.

The human cost of moderation
Tech firms have long known that exposure to CSAM harms employees. Two years ago, more than 140 Meta moderators began legal action after being diagnosed with severe PTSD. TikTok faces similar legal action over moderator treatment.

As a result, many companies are turning to AI to filter extreme content and reduce the mental load on staff. The Metropolitan Police also announced last week it will explore AI to help analyse large volumes of CSAM.

Why the IWF says humans are still essential
Despite the rise of AI tools, the IWF insists human analysts remain irreplaceable. The group saw a 6% increase in CSAM online last year alone.

“Artificial intelligence tools are a supplement… They aren’t a replacement,” said IWF chief executive Kerry Smith.

She pointed to analysts’ “offline understanding” of abuse — how to spot indicators in images and videos that can help identify victims. “AI is a weapon that we could use to prevent online child sexual abuse… but it’s not a replacement for human intelligence and human insight”.

What you can do
The IWF urges anyone who encounters suspected CSAM not to share it further and to report it immediately via http://iwf.org.uk/report. In the UK, viewing, downloading, or distributing such material is illegal.

Source: Sky News, as reported by Science and Technology reporter Mickey Carroll.
