The Least Congress Can Do Is Hold Tech Giants Accountable For Aiding Child Sex Traffickers

The recent film “Sound of Freedom” rightly brought attention to the horrific global child sex trafficking industry. But the problem is much nearer to home than most people realize. Social media platforms like Facebook, Instagram, Twitter, and TikTok provide traffickers and eager predators unprecedented access to children in the United States. And this sexual exploitation can happen without children ever leaving their parents’ homes.  

Online platforms have made it far easier for strangers to gain access to children. According to the Houston-based organization United Against Human Trafficking, 55 percent of U.S. sex trafficking victims aged 7 to 11 are recruited through social media apps and websites. Social media makes it easy for perpetrators to find and communicate with their victims, and it provides a convenient venue for sexual content to be advertised, bought, and sold.

With social media, sex trafficking is happening in plain sight. After the trafficker or lone-acting pedophile establishes a relationship of trust with the child, they will either try to meet in real life or exploit the child online, often through the same platform. This means children may continue to lead seemingly normal lives — going to school, playing sports — all while being coerced to engage in sexual activity online.  

To address child sexual exploitation online, policymakers must hold Big Tech companies responsible.  

For many years, Section 230 of the Communications Decency Act (and its overly expansive interpretation) has given Big Tech sweeping immunity from any harm their platforms cause. These companies have been granted immunity both for the harmful content they distribute, like child pornography and other exploitative material, and, worse still, for the harms caused by their own product designs that help facilitate sex trafficking.  

Despite their legal immunity, the evidence of Big Tech’s culpability in child exploitation speaks for itself. According to investigations by The Wall Street Journal, Instagram helps connect and promote a vast network of accounts openly devoted to the commission and purchase of underage sex content. Forbes found that TikTok livestreams are full of viewers using the comments to urge young girls to perform acts that skirt the line of child pornography, rewarding those who oblige with TikTok gifts. One expert quoted by Forbes called it “the digital equivalent of going down the street to a strip club filled with 15-year-olds.” 

Facebook, too, has been used to sexually exploit children in at least 366 cases between January 2013 and December 2019. And Twitter, since rebranded as X, has had similar problems. While Twitter owner Elon Musk has said that removing child sexual content from the platform was “Priority #1,” the teams charged with monitoring and removing such content have been cut from 20 people to fewer than 10. 

Taken together, these investigative reports and lawsuits, spanning Facebook, Instagram, Twitter, and TikTok, show the platforms to be at fault in two main ways.  

First, both Meta and Twitter have admitted they did not do their due diligence in removing explicit content from their platforms or reviewing reports in a timely manner. In some instances, child pornography victims have had to wait years for Twitter to remove videos of their abuse. Similarly, Instagram allowed people to search explicit hashtags such as #pedowhore and #preteensex and connected them to accounts using those terms to advertise child-sex material for sale.  

The real underlying issue here is that while Section 230 empowers platforms to remove such “obscene” content under its Good Samaritan provision, there is no legal penalty if they choose not to remove it.  

Second, these platforms have been designed with algorithms that mine user data to recommend and promote content the user is likely to engage with. In some cases, Instagram and TikTok recommended the accounts of young girls to traffickers or surfaced accounts filled with child pornography as “suggested follows.” As The Wall Street Journal reported, Instagram doesn’t merely host pedophilic activity; its algorithms promote it. 

It’s one thing for Section 230 to protect tech giants from liability for opinions that individuals post; it’s another thing for them to get off scot-free for facilitating child exploitation both by designing algorithms that help traffickers find and connect with their victims and by failing to remove child sexual abuse material. 

Congress agreed. As a first step toward reforming Section 230, Congress passed the Allow States and Victims to Fight Online Sex Trafficking Act (known as FOSTA-SESTA) in 2018, which carves out an exception to Section 230 immunity for the enforcement of sex trafficking laws. This new law helped secure a historic victory for sex trafficking victims over Big Tech in Texas last year.  

This is a good start, but Congress needs to address the underlying issues enabling child exploitation on social media. Currently, victims have no means of legal recourse to ensure that a platform removes images of their exploitation and abuse. And companies still have complete immunity for their exploitative product design. Although it has become a political football, Section 230 does need to be reformed so that Big Tech companies can be held liable for their facilitation and willful distribution of child sexual abuse material on their sites. 

One bill taking steps in the right direction is the bipartisan EARN IT Act, which was voted out of the Senate Judiciary Committee for the third time this May (it has yet to receive a floor vote). The bill would amend Section 230 to remove platforms’ immunity from federal civil, state criminal, and state civil laws on child sexual abuse material (following the FOSTA-SESTA model for sex trafficking laws). Internet companies would then be treated like everyone else when it comes to combating child sexual exploitation.  

For the sake of America’s children, we must hold Big Tech accountable for child exploitation. 


Emma Waters is a visiting fellow at Independent Women’s Forum (iwf.org). Clare Morell is a senior policy analyst at the Ethics and Public Policy Center, where she directs EPPC’s Technology and Human Flourishing Project.

The Federalist
