Microsoft releases tool that can scan chats to detect pedophiles

Yolanda Curtis
January 12, 2020

Through collaboration with global partners and industry, we are leading a worldwide effort to keep children safe from abuse.

If a party is deemed likely to be grooming, Artemis flags the conversation and notifies a human moderator for further evaluation.

At Microsoft, we embrace a multi-stakeholder model for combating online child exploitation that includes survivors and their advocates, government, tech companies, and civil society working together.

Microsoft has created an automated system to detect sexual predators trying to groom children online. The company doesn't go into Artemis's technical details beyond saying that it is applied to historical chat text, a.k.a. chat logs, and rates each conversation according to its probability of containing child grooming.
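The workflow the article describes, scoring each chat log and escalating high-probability conversations to a human moderator, can be sketched roughly as below. This is a toy illustration, not Microsoft's system: the scoring function, keyword list, and threshold value are all hypothetical stand-ins, since Artemis's actual model is not public.

```python
from dataclasses import dataclass

# Hypothetical cutoff; Microsoft does not publish its real threshold.
GROOMING_THRESHOLD = 0.8

@dataclass
class Conversation:
    chat_id: str
    text: str

def grooming_score(conversation: Conversation) -> float:
    """Stand-in for Artemis's proprietary rating model.

    Here it is a crude keyword heuristic, used only to show the
    score-then-flag control flow, NOT the real detection technique.
    """
    suspicious = ("secret", "don't tell", "send photo")
    hits = sum(phrase in conversation.text.lower() for phrase in suspicious)
    return min(1.0, hits / len(suspicious))

def review_queue(conversations):
    """Return IDs of conversations that should go to a human moderator."""
    return [c.chat_id for c in conversations
            if grooming_score(c) >= GROOMING_THRESHOLD]
```

Note that the automated score only triages: anything above the threshold still goes to a human reviewer, mirroring the article's point that flagged conversations are checked by a moderator rather than acted on automatically.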

"Microsoft says in their post, "Project Artemis" is a significant step forward, but it is by no means a panacea". It would also provide child protection experts with more information on how pedophiles operate online.

The launch of this technology represents the culmination of months of hard work by those committed to keeping our children safe online. Adult predators use built-in chat functions on popular video games and private messaging apps to groom children and solicit nude photos, sometimes by posing as kids themselves. Ministers agreed to collaborate on designing a set of voluntary principles that will ensure online platforms have the systems needed to stop the viewing and sharing of child sexual abuse material, the grooming of children online, and the livestreaming of child sexual abuse.

The team was led by Dartmouth College Computer Science Professor Hany Farid, who previously worked with Microsoft to build PhotoDNA, a tool that's been used by 150 companies and organizations to find and report images of child sexual exploitation. Project Artemis has been in development since November 2018, and Microsoft hasn't been alone in the effort: The Meet Group, Roblox, Kik, Thorn and others have also given their assistance. Grooming involves deceptive or indirect techniques to lure children into a sense of security, only to abuse them later online.
