While the COVID-19 pandemic has produced some silver linings for society, it has also allowed many of society’s darker elements to fester. One of these has been a proliferation of child sexual abuse material, or CSAM (what the public may know as “child pornography,” a term that experts no longer consider appropriate), child sex trafficking, and exploitation. While these criminal activities existed before the pandemic, the pandemic created conditions, such as the mass transition to remote work and children learning from home and spending more time online overall, that have led to a significant uptick. However, there is little awareness of the extent of the problem or of the ways in which popular digital platforms make it relatively easy for abuse, trafficking, and exploitation to occur. To reduce the grave harm being done to children and their families, concerned citizens first need to better understand the scope of the problem.

A range of criminal activities

Something to keep in mind is that an entire range of sexual crimes involving minors regularly takes place on the internet, all of which have increased during the pandemic. The first is CSAM itself: any visual depiction of sexually explicit conduct involving a minor, regardless of whether there is “consent” from the minor, because minors, from a developmental standpoint, are not in a position to make mature and informed decisions about consent. This includes the subset of self-generated child sexual abuse material (SG-CSAM), in which the explicit imagery appears to have been taken by the child themselves. While the self-generated aspect can lead people to categorize it as “sexting” or “sending nudes,” it very often involves coercion and dissemination on digital platforms without the minor’s consent. Beyond conventional photos and videos, webcamming and live-streaming are also subsets of both CSAM and SG-CSAM. Live-streaming intersects with child sexual abuse tourism in the sense that digital technology removes some of the time and travel barriers associated with traditional sex tourism. And as will be discussed later in this article, these online activities can often be connected with, and lead to, real-life human trafficking and pimping on certain platforms, such as Seeking Arrangements, that facilitate them.

Experts say children are abused for an average of two years before being rescued, though the trauma is long-lasting.

The statistics on CSAM and SG-CSAM are truly alarming. Over the last few years, there have been 82 million reports of CSAM to the CyberTipline and 19,100 identified victims. From 2019 to 2020 alone, there was a 28 percent increase in reports. During that same period there was also an astounding 97.5 percent surge in online enticement, in which an individual communicates with a child online with the intention of committing a sexual crime. As for SG-CSAM, almost 20 percent of girls and 10 percent of boys aged 13-17 have shared explicit images of themselves with someone, and 40 percent of these children felt it was normal to do so. Meanwhile, 20 percent of children aged 9-17 have seen their images shared without consent, and 38 percent say they have peers in school whose images have been shared or leaked without permission. These are just a small handful of the statistics available, and they do not convey the full extent of the trauma and suffering inflicted on victims: because digital images are endlessly reproducible, victims are re-victimized every time one of their photos is viewed.

While the general population is aware of the existence of CSAM, there is a wide range of activities involving other forms of abuse, exploitation, and dehumanization of children and minors that may be directly or indirectly linked to CSAM. One of these is “sextortion”: threatening to expose or distribute sexually explicit images in which a person appears. Children are frequent victims of online sextortion, since criminals never need to meet them in person to exploit them, and children are spending more time online than ever due to the pandemic. Twenty-five percent of sextortion victims were 13 or younger when they were first threatened, and over two-thirds of victims were girls younger than 16 when they were threatened. Sextortion is particularly insidious due to its relentless nature, with just under half of victims receiving daily threats online from their abusers. Even when victims block offenders, the offenders simply create new accounts, or find other ways, to contact them. One study found that 62 percent of victims eventually complied in the hope that they would be left alone. Unfortunately, what usually happened was that the threats became even more frequent.

Grooming and the multiplicity of available platforms

Where and how do these online criminal activities occur? The answer is extremely unsettling, to say the least, because the reality is that virtually none of the digital platforms popular with children, whether social media platforms such as Instagram, Snapchat, YouTube, TikTok, and Discord, or gaming platforms such as Minecraft and Fortnite (which are more popular with younger children), are as safe as they could be. All could benefit from more stringent and comprehensive safety guidelines.

Most of today’s popular digital platforms have design features that allow users to chat with and message each other. This makes possible the practice of “grooming”: interacting with a child over time to win that child’s trust, with the end goal of entrapping them through sextortion, CSAM, exploitation, or child sex trafficking. As with sextortion, digital platforms make it possible for criminals to groom children without ever meeting them in person. And as with both sextortion and CSAM, grooming spiked heavily during the pandemic, since just about any digital platform with messaging capabilities is a potential place for criminals to manipulate, exploit, and abuse minors.

There are also various pathways linking digital platforms with each other such that no platform is truly safe. For example, OnlyFans, which has recently been in the news, advertises to children on popular platforms such as TikTok, potentially luring vulnerable minors into situations where they may be targeted for grooming.

The pernicious link with child sex trafficking

Since all of the major popular digital platforms can be used for activities like grooming, sextortion, and the non-consensual production and dissemination of CSAM, they can all be exploited for various forms of child sex trafficking. Certain platforms, however, present especially ripe opportunities for trafficking due to their functional purpose, monetization models, and, sometimes, intentional design.

One such platform is Seeking Arrangements, a site that monetizes dating, essentially matching “sugar babies,” typically younger women in financial need, with “sugar daddies,” who are usually older, wealthy men. The site specifically markets to young college girls by offering free premium accounts to users who sign up with an .edu email address. There is even a dedicated “Sugar Baby University” website associated with Seeking Arrangements that deliberately targets female college students saddled with debt. In 2020, Seeking Arrangements reported a 74 percent increase in membership worldwide in just one year, and there have been indications that the site strategically sought to take advantage of vulnerable girls during the pandemic.

Then there is OnlyFans. Originally advertised as a social networking site where creators could monetize their content, it has become a full-fledged marketplace for sexual material, with a large portion of account holders producing adult content. Although many consensual sex workers praise OnlyFans for giving them full control over their content, the platform requires minimal verification when creating an account, and there are numerous testimonials of underage girls who were able to operate accounts for years without getting caught by using the ID of an older friend, family member, or stranger. OnlyFans also has a commission structure that encourages grooming and pimping: when a new account is opened via a referral link, the referrer collects 5 percent of that account’s earnings for its first year.

What concerned citizens can do

If readers find these statistics and this information disturbing, they should. It is indeed disturbing, all the more so for the way the human tragedy of the pandemic has been used as a strategic opportunity by criminals to exploit, abuse, and victimize children. In some cases, the digital platforms themselves have made it easier for this to occur. We need to do more.

First, we must hold digital platforms accountable and demand they do more to prevent their technology from being used for exploitation, human trafficking, and abuse. Verification and screening measures need to be more stringent, and monetization structures need to be monitored to better safeguard against human trafficking. Advertising, a major source of revenue for social platforms, also needs to be better regulated, as there are too many ways problematic advertising practices can be used to lure children into exploitative and abusive situations.

Reporting any suspected CSAM, grooming, exploitation, and trafficking is also key. In addition to reporting suspected activity to the CyberTipline, if a specific platform is suspected to have been used, people can report it to that platform as well. Platforms such as Discord rely on user reports to catch bad behavior, and that is not enough: until these platforms are held more accountable, ordinary users must bear the lion’s share of the burden. Law enforcement authorities are spread thin, underfunded, and working with insufficient resources, so only a fraction of cases involving suspected child sexual abuse material is ever investigated. More funding at both the federal and state levels is therefore needed to remedy this.

For more funding to be allocated, greater public awareness is needed, because only then will there be the public outcry required to pressure legislators into action. One reason these crimes persist is that they do not get enough attention, and a big reason for that is that we often avoid disturbing subject matter. Meanwhile, conspiracy theories such as QAnon continue to divert attention away from real-life horrors and make the work of child welfare groups more difficult. The more that everyday people have the courage to talk about the real issues and spread awareness (sharing articles such as this one is one way to go about it), the more the problem can be mitigated and the suffering of far too many children alleviated.

(Mellissa Withers, Ph.D., M.H.S., is an associate professor of global health in the University of Southern California's Online Master of Public Health program, as well as an associate professor at the USC Institute on Inequalities in Global Health at the Keck School of Medicine. She also directs the Global Health Program of the Association of Pacific Rim Universities, a non-profit network of 50 leading research universities in the region.
Kim Berg is a USC student in the World Bachelor of Business program.)