In India’s first, Kerala police armed with AI hunt down the worst sexual predators on darkweb

“CHEEZE PIZZA”, the encrypted forum’s name read. Sometime in 2025, the Kerala Police’s Counter Child Sexual Exploitation (CCSE) team stumbled upon the forum during routine cyber patrolling of the dark web. It was one of many forums where users traded links to child sexual abuse material (CSAM) and directed others to private Telegram groups where the content could be bought and accessed.

While digging into “CHEEZE PIZZA”, investigators spotted a visual of a child who “resembled a Keralite”. The user had posted multiple photos and videos of the same child, prompting the police to dig deeper. Months of digital tracking followed. Police befriended the suspect online and gained access to the suspect’s Facebook and Instagram credentials. The trail led them to a woman in Bengaluru, but the account’s location showed Thiruvananthapuram. The decisive clue came from the woman’s Facebook friend list: one of her friends was from Thiruvananthapuram.

Police traced the profile and then examined the display pictures the person had on Telegram groups used for CSAM circulation. A house seen in one of the images matched the Thiruvananthapuram location they had identified. When officers inspected the house, they found the child. It was the suspect’s niece.

She was rescued and counselled. The perpetrator was arrested and booked.

One part of the story was the persistence of the Kerala Police officers; the other was an artificial intelligence (AI) tool that made the breakthrough possible.

The tool sifted through enormous volumes of data scattered across platforms to narrow down the perpetrator. Without it, manual analysis would have taken far longer.

To combat the surge in online child sexual abuse cases, the Kerala Police’s Counter Child Sexual Exploitation Centre became the first in India to integrate AI into its investigations. The AI-driven investigation platform, called Katalyst, was first used on a pilot basis. The Kerala Police is now working on integrating the latest version of the tool, which was developed by New Zealand-based Kindred Tech and is powered by MongoDB’s data infrastructure.

“With a small team, you cannot manually go through all of it,” said Ankit Asokan, SP Cyber Crime, Kerala Police, explaining why Katalyst is crucial in handling vast volumes of digital evidence. “This problem was amplified by technology. We must use technology to fight back,” he added. While a stockpile of digital evidence is a challenge, another challenge in CSAM cases is that “there is no victim coming to us,” SP Asokan said.

Katalyst has given the Kerala Police a much-needed technological edge in identifying victims and stopping the abuse of minors, work that demands sifting through massive amounts of data and proactive intervention. During the 18-month pilot project (starting 2024) between the Kerala Police and Kindred Tech, investigators reported 96 arrests, the safeguarding of 20 children, and 18 international referrals, according to official figures from Kindred Tech and MongoDB.

“Our mission is to empower investigators with tools that surface insights faster and ensure every child’s case receives the urgent attention it requires,” Auckland-based Bree Atkinson, the CEO of Kindred Tech tells India Today Digital. Atkinson added that Katalyst enables officers to spend “less time buried in data and more time safeguarding children”.

But first, let’s take a step back and understand what CSAM is and what exactly happens in such cases. We will also look at how deep and dangerous this menace is, and how AI integration has given the Kerala Police a technological edge in cracking CSAM cases in the state.

WHAT IS CSAM? HOW BIG IS THE MENACE?

Child sexual abuse material, referred to as CSAM, comprises images, videos or digital content that depict the sexual abuse or exploitation of a child.

“Earlier, we called it online child sexual abuse. Now it is broader. It is technology-facilitated child sexual abuse,” Ankit Asokan, SP Cyber Crime, Kerala Police, told India Today Digital.

“There is a child. There is technology. And there are multiple ways content is created,” he added.

Traditionally, abuse material fell into two categories: content coerced or extorted from children, and abuse recorded by someone in a position of power or authority. Now, a third category is emerging: AI-generated or digitally manipulated content using identifiable features of real children.

Once created, the material moves rapidly. “It becomes an ecosystem,” Asokan says, describing a chain where content is transmitted, sold, consumed and re-shared across platforms.

The scale of CSAM in India is staggering. In 2024, Indian agencies received 2.2 million cyber-tip reports related to online child sexual abuse, according to the National Centre for Missing and Exploited Children (NCMEC), a US-based NGO. The same year, the Internet Watch Foundation recorded 2,91,273 reports of child sexual abuse material in the EU, a record, according to the Ministry of Women and Child Development.

“The menace is global,” SP Asokan stressed. The surge is linked to more children getting online, the rise of messaging apps, gaming platforms and the rapid spread of synthetic media. “In a normal crime, there is a victim pushing for justice. In these cases, there is no victim coming to us,” Asokan said.

In 2021 alone, the US’s National Centre for Missing and Exploited Children received 29.3 million reports containing 39.9 million images and 44.8 million videos, and alerted law enforcement to over 4,260 potential new child victims, according to a written submission by ECPAT International (a global network working to end child sexual exploitation) to the United Nations Office on Drugs and Crime (UNODC).

This, however, was the scale five years ago, and the numbers have only risen since.

HOW IS KERALA POLICE COMBATING CSAM WITH AI?

Kerala Police’s response has been anchored in technology partnerships and specialised training.

In 2024, the Counter Child Sexual Exploitation Centre (CCSE) began an 18-month pilot of Katalyst, an AI-driven digital investigation platform developed by Kindred Tech, a non-profit building tools for law enforcement.

During the pilot, the police launched Operation P-Hunt and arrested 96 people. In February, four people were arrested and 51 digital devices seized in Ernakulam district as part of the operation. Police conducted raids at 82 locations; 36 of them were in the city and 46 in rural areas, The Hindu newspaper reported.

The Kerala Police initiative has also received the state-level Technology Sabha Excellence Award.

Kerala Police have also worked with technologists from New Zealand through Cyber Dome initiatives and hackathons, building tools collaboratively rather than outsourcing investigations.

“Our mission is to empower investigators and law enforcement agencies with tools that surface insights faster and ensure every child’s case receives the urgent attention it requires… MongoDB helps us combine the existing data with the power of AI, so Katalyst can surface critical insights faster, allowing Indian investigators to spend less time buried in data and more time safeguarding children,” Auckland-based Peter Pilley, the founder of Pathfinder Labs, told India Today Digital.

“We are not outsourcing investigations,” Asokan clarified, adding, “These are police personnel trained in technology.”

HOW ARE CHILD SEX ABUSE MATERIALS GENERATED?

The mechanics of sexual abuse, especially in the context of children, have evolved. Coercion now often begins in gaming chats or social media direct messages. Offenders groom children, solicit images and then use threats to extract more. In parallel, AI tools can generate hyperrealistic images using fragments of a child’s face scraped from social media, explained SP Asokan.

“This is consumed by people. This is transmitted and sold by people. And again, people also generate it online,” Asokan told India Today Digital.

“And then there are categories like non-consensual image sharing, which is sometimes referred to as revenge porn, though we would not advocate using that term, where material or media is taken from somebody else and shared by a third party. One of the more harmful categories right now is sextortion, and specifically against young boys. And that’s happening on a number of platforms now,” Bree Atkinson, the CEO of Kindred Tech, told India Today Digital.

Unlike conventional crimes, CSAM investigations are largely proactive. “Everything runs in the background,” SP Asokan said. The child might not know the material exists. The consumer and creator could be individuals in different jurisdictions.

“There is no area on the internet or off it that is kind of a place where it’s not happening. One of the things that most astonishes people is that they believe it’s in a hidden area, out of sight, in a place so specific and secret that they’ll never see it. But it’s on the internet. It’s in the same places they are active right now,” said Peter Pilley, who, before getting into Pathfinder Labs, investigated child sexual abuse cases in New Zealand for 15 years.

HOW KERALA COPS ARE USING AI TO CRACK CSAM CASES

Manual forensic review of evidence in CSAM investigations often means combing through 200-250 (or more) GB of data per case.

Seized devices, cloud storage and other digital trails have to be keenly combed. A single cyber-tip (a digital alert) can run into 35 pages of metadata (data about data), extensive chat logs, platform identifiers and file hashes, making the manual investigation process slow and resource-intensive. In 2024 alone, Indian agencies received 2.2 million cyber-tip reports from the National Centre for Missing and Exploited Children, each averaging 35 pages of dense digital evidence.
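The file hashes mentioned above let investigators flag known abuse material without anyone re-viewing it: a file’s cryptographic digest is compared against a database of hashes of previously identified material, such as those maintained by NCMEC. A minimal sketch of that matching step, with placeholder hash values (the real hash lists and Katalyst’s internals are not public):

```python
import hashlib

# Placeholder set standing in for a known-CSAM hash database
# (this value is simply the SHA-256 of the bytes b"test").
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of(data: bytes) -> str:
    """Return the hex SHA-256 digest of a file's raw bytes."""
    return hashlib.sha256(data).hexdigest()

def flag_known(files: dict[str, bytes]) -> list[str]:
    """Return the names of files whose digest appears in the known-hash list."""
    return [name for name, data in files.items()
            if sha256_of(data) in KNOWN_HASHES]
```

Because hashing is deterministic, an exact match is conclusive for an identical file, though any re-encoding changes the digest, which is why perceptual (similarity-based) hashes are used alongside exact ones in practice.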

This is where AI helps the Kerala Police triage this vast pool of data.

This allows crucial leads to surface quickly and enables law enforcement to identify victims and perpetrators and intervene in ongoing abuse sooner.
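The triage described above can be pictured as a scoring-and-sorting step over the incoming queue of cyber-tips. The sketch below is purely illustrative: the field names and weights are assumptions for demonstration, not Katalyst’s actual scoring model.

```python
from dataclasses import dataclass

@dataclass
class CyberTip:
    tip_id: str
    file_count: int
    known_hash_hits: int     # files matching a known-abuse hash list
    live_contact_flag: bool  # evidence of ongoing offender-child contact

def priority(tip: CyberTip) -> float:
    """Assign an urgency score (weights here are arbitrary examples)."""
    score = tip.file_count * 0.1 + tip.known_hash_hits * 5.0
    if tip.live_contact_flag:
        score += 100.0  # signs of ongoing abuse outrank archival material
    return score

def triage(tips: list[CyberTip]) -> list[CyberTip]:
    """Order the queue so the most urgent tips surface first."""
    return sorted(tips, key=priority, reverse=True)
```

The point of such a model is not to replace investigators but to decide which of thousands of pending reports a small team reads first.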

Katalyst automates ingestion and prioritisation of cyber-tip reports, offers advanced case management tools, a secure media library, AI-driven categorisation of sensitive media and automated forensic support to the police. “Katalyst stores AI findings along with the raw data and helps investigators search for similar patterns to spot repeated abuse,” according to Peter Pilley of the Auckland-based Pathfinder Labs.

One of the most powerful techniques involves object tracing. A single abuse series might span years, filmed against the same backdrop. That’s where AI platforms like Katalyst can isolate a pattern. A design on a wall, a lamp, a paint colour, or a pattern on a curtain can be cut out as a sample from the rest of the image. That fragment can then be shared publicly for crowdsourced information gathering, without exposing explicit content.

Moreover, suppose “a child is in someone’s guardianship and is being captively abused… There is a curtain in the same room, and there are like 100 videos which have come out in one year. If investigators notice that it’s the same curtain, they can try to trace it. If you can find out where this curtain is, or who made it, or in which region of the nation is that design or block print from, then the police can study the pattern,” Asokan told India Today Digital. “We can triage it,” he said.

Citing this example of how Katalyst comes in handy, Asokan said, “It is definitely first of a kind [tool].”
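Matching the same backdrop object (the curtain, the lamp) across hundreds of videos is typically done with a perceptual hash: a compact fingerprint of a cropped fragment that stays stable under small changes in lighting or compression. A minimal average-hash sketch on downsampled grayscale crops, as an illustration of the idea rather than Katalyst’s actual algorithm:

```python
def average_hash(pixels: list[list[int]]) -> str:
    """Fingerprint a small grayscale crop (e.g. an 8x8 downsampled
    fragment): each bit is 1 if that pixel is brighter than the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

def hamming(a: str, b: str) -> int:
    """Count differing bits; a small distance suggests the same object."""
    return sum(x != y for x, y in zip(a, b))
```

Two crops of the same curtain from different videos would yield nearly identical bit strings, so sorting a media library by Hamming distance to a reference fragment surfaces candidate matches from the same room.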

Similarly, in the Thiruvananthapuram case, where the perpetrator exploited his niece, AI-assisted analysis of social media metadata and image comparison narrowed down the suspect’s location and identity.

The Kerala Police’s adoption of Katalyst is, at its core, about using technology to fight technology-enabled crime. It is also about rescuing children faster and holding offenders accountable in an ecosystem where victims often cannot come forward themselves.

“This problem was amplified by technology. We must use technology to fight back,” Asokan said, adding, “There is no one who will come and complain. So we have to act.”

– Ends

Published By:

Sushim Mukul

Published On:

Apr 26, 2026 13:15 IST
