
Verified by Psychology Today

Using the Power of Technology to Fight Online Human Trafficking

How cutting-edge tools can mitigate global trafficking, exploitation, and abuse.

Key points

  • There has been a surge of online human trafficking, exploitation, and abuse in recent years.
  • Companies and organizations are teaming up to fight this with technology.
  • Current technologies can reduce these crimes, but future technologies may help prevent them.

This post was co-written by Mellissa Withers and Kim Berg.

As the trafficking industry has become digitalized, investigations have grown harder. Easy communication, photo-editing tools, disposable “burner” phones, code words, slang, and reused content make it difficult for law enforcement officers to connect the dots. Fortunately, digital technologies now exist that can help combat trafficking on a grand scale. Numerous artificial intelligence (AI), machine learning (ML), computer-generated imagery (CGI), and big-data analysis tools have been developed or are already in use in the anti-trafficking space, bringing much-needed reinforcements to the fight against online trafficking, abuse, and exploitation.

Help from the Greek Goddess of Hunting

In Greek mythology, Artemis is the goddess of hunting, and hunting for cases of online “grooming” is what Microsoft’s anti-grooming chat software, codenamed “Project Artemis,” was designed to do. Built in 2018 in collaboration with The Meet Group, Roblox, Kik, and Thorn, Project Artemis evaluates and “rates” characteristics of conversations to assign a probability that grooming is taking place; flagged conversations are sent to human moderators for review. It recognizes specific words and speech patterns using natural language processing (NLP), allowing a more accurate evaluation of nuanced conversations on chat platforms. This tool is a significant step forward in preventing grooming conversations from escalating on gaming and chat platforms and could be leveraged on other online hotspots.
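The rate-then-flag idea can be sketched in a few lines. Project Artemis's actual model, features, and thresholds are not public, so everything below — the phrases, the weights, and the cutoff — is purely illustrative of the general approach: score a conversation, then route anything above a threshold to a human moderator.

```python
# Hypothetical sketch of score-and-flag moderation. The patterns,
# weights, and threshold below are invented for illustration; the
# real system uses far richer NLP features.

RISK_PATTERNS = {
    "are your parents home": 0.4,
    "keep this a secret": 0.35,
    "how old are you": 0.2,
    "send me a photo": 0.3,
}
FLAG_THRESHOLD = 0.5

def grooming_risk(conversation: list[str]) -> float:
    """Return a 0-1 risk score for a list of chat messages."""
    score = 0.0
    for message in conversation:
        text = message.lower()
        for pattern, weight in RISK_PATTERNS.items():
            if pattern in text:
                score += weight
    return min(score, 1.0)

def needs_review(conversation: list[str]) -> bool:
    """Flag the conversation for a human moderator above the threshold."""
    return grooming_risk(conversation) >= FLAG_THRESHOLD
```

The key design point survives the simplification: the software never bans anyone on its own; it only prioritizes conversations for human review.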

Using Digital “Fingerprints” to Detect CSAM

In the past 15 years, reports of child sexual abuse material (CSAM) have increased 15,000%. Technology companies once relied exclusively on human content moderators to watch and assess reported material. Moderators often watch many hours of sexually explicit and abusive videos and photos, making difficult calls on the age of the individuals involved and whether the depicted behaviors are real or staged. Yet MindGeek (the company behind Pornhub) is estimated to employ only 80 moderators worldwide for the 1.36 million hours of content uploaded to the site annually. To make content moderation more efficient and effective, PhotoDNA and Safer use “hashing” and “matching” technology aimed at removing CSAM from the internet.

PhotoDNA was developed by Microsoft in 2009 and is used by law enforcement and by major services such as Bing, Gmail, Twitter, Facebook, and Reddit. It works by compressing a CSAM image or video, splitting it into parts, and converting it into a unique numerical code, a digital “fingerprint” called a “signature” or “hash.” This fingerprint can be matched against a database of known CSAM posted online. Because duplicates can be removed automatically, even if they have been resized, altered, or reuploaded, the time law enforcement and content moderators must spend reviewing content drops significantly.
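A toy version of hash-and-match makes the mechanism concrete. PhotoDNA's actual algorithm is proprietary; the sketch below uses a simple "average hash" over an 8x8 grayscale thumbnail, a well-known perceptual-hashing technique that, like the article describes, still matches after small edits because near-duplicate images differ in only a few bits.

```python
# Illustrative perceptual hash-and-match, NOT the PhotoDNA algorithm.
# An 8x8 grayscale image is reduced to 64 bits: one bit per pixel,
# set when the pixel is at least as bright as the image's mean.

def average_hash(pixels: list[list[int]]) -> int:
    """Compute a 64-bit perceptual hash from an 8x8 grayscale image."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_known(hash_value: int, known_hashes: set[int],
                  max_distance: int = 5) -> bool:
    """Match a new image's hash against a database of known hashes,
    tolerating the few flipped bits caused by small alterations."""
    return any(hamming_distance(hash_value, h) <= max_distance
               for h in known_hashes)
```

Because only the 64-bit fingerprints are stored and compared, the database never needs to hold the abusive images themselves, which is part of what makes this approach practical to share across companies and law enforcement.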

In addition, Safer is a CSAM-moderation tool launched by Thorn in 2019. Used by companies such as Imgur, Flickr, GoDaddy, and Vimeo, it has already removed 183,000 abuse images from the internet. Safer uses a “hashing” and “matching” process similar to PhotoDNA’s.

The inherent limitation of the current approach of both PhotoDNA and Safer is that an image or video of CSAM must first be detected before it can be run through the software: someone must report it, or a moderator must find it, before removal can begin. Adding machine-learning CSAM “classifiers” will allow these tools to detect, on their own, new images and videos that are unknown to moderators.
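The division of labor between hash matching and a classifier can be sketched as a routing rule. The function names and thresholds below are hypothetical, and the classifier is a stand-in stub; real deployments use trained deep-learning models whose scores feed a pipeline roughly like this.

```python
# Hypothetical moderation pipeline: hash matching catches known
# material exactly; a classifier score (here just a float passed in,
# standing in for a trained model) catches new, unseen material.

def moderate(image_hash: int, classifier_score: float,
             known_hashes: set[int], threshold: float = 0.8) -> str:
    """Route content: auto-remove known matches, queue high-scoring
    new content for human review, otherwise allow."""
    if image_hash in known_hashes:
        return "remove"   # exact match against the hash database
    if classifier_score >= threshold:
        return "review"   # likely new abuse material; needs a human
    return "allow"
```

Once a human confirms a "review" item, its hash joins the known database, so every classifier catch strengthens future hash matching.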

Shining the Light on Trafficking

Spotlight is the leading victim-identification tool in the anti-trafficking space and is used by over 10,000 investigators. It has helped solve 62,000 human trafficking cases, identify over 16,000 traffickers, and reduce investigation time by 67%, and it identifies an average of 10 juvenile victims per day. Spotlight uses big-data analytics to connect publicly available datasets from the online commercial sex marketplace and runs machine-learning algorithms on leads from law enforcement to identify and locate trafficking victims. With 150,000 escort ads posted online daily, Spotlight increases the efficiency and effectiveness of sex trafficking investigations.

Cracking Down on Transit Routes

Traffik Analysis Hub, created by IBM and Stop the Traffik, is the first global hub of trafficking data shared across sectors and governments. This collaborative database was born out of the idea that “if you can track the money, you can track the people.” The goal is to identify global hotspots of trafficking in order to understand routes, transit points, types of trafficking, and how they change over time. The Traffik Analysis Hub currently has over 250 registered analysts from organizations including Western Union, Barclays, the International Centre for Missing and Exploited Children, Standard Chartered, Mekong Club, Rotary, the Salvation Army, and Europol. These partnerships have led to over 890,000 criminal incidents being reported since launch and demonstrate a huge step forward for public-private partnerships in the anti-trafficking, exploitation, and abuse space.

“Sweetie” Is Bad News for Offenders

“Sweetie” is a computer-generated, virtual 10-year-old Filipina child run by Terre des Hommes, an international NGO of 10 national organizations, to fight webcam child sex tourism by engaging perpetrators in chatroom conversations. Sweetie replicates the movements, facial expressions, and vocal tones of a real child and responds live to chats and messages. While Sweetie 1.0 had to be manually controlled by law enforcement officials, Sweetie 2.0 used an automated chat function to track responses, identify offenders, and deter them with warnings about the legal consequences of engaging with real children. Sweetie 2.0 has helped expose 20,000 people trying to chat with her online, 1,000 of whom were adults from 71 countries trying to pay her for webcam sex. Sweetie 3.0 will be able to track suspects across online platforms and social media sites beyond the chatroom, aiming to locate where children are being exploited. Moreover, the introduction of advanced artificial intelligence, deep learning, and machine learning provides even more opportunities for prevention.

An important caveat is that all of these technologies operate in the “Grooming,” “Intervention,” or “Reporting” stages of trafficking and exploitation protection. They are largely focused on removing CSAM after it has been posted, prosecuting offenders, or finding current victims. None of them currently addresses the “Pre-Grooming” stage. This means that while these technologies are immensely helpful in mitigating trafficking, they do not prevent trafficking, exploitation, and abuse. Hopefully, future technologies will go further and aid in prevention.

Although the fight can seem like an endless uphill battle, with traffickers and abusers constantly adapting and outpacing law enforcement and NGOs, these tools have aided thousands of victims in the past 10 years. This success points to the need for advanced technologies like AI, ML, CGI, and big-data analysis, combined with partnerships between public and private organizations, to effect change. Social media companies like Facebook, Twitter, Instagram, Reddit, Discord, and Snapchat must join forces with anti-trafficking organizations and take responsibility for the extent of grooming, advertising, and CSAM present on their platforms. Thorn’s Technology Task Force, made up of 25 rotating technology giants such as Google, Facebook, Microsoft, and Amazon, is a prime example of the leadership necessary to bridge the gap between private and public responsibility. Together, we can combat child sex trafficking, exploitation, and abuse around the world.

Mellissa Withers is an associate professor of global health at the University of Southern California's Online Master of Public Health program. Kim Berg is a USC student in the World Bachelor of Business Program.
