Blackburn: Snapchat is a Child Predator’s Dream

July 11, 2019

WASHINGTON, D.C. – Today, Senator Marsha Blackburn (R-Tenn.) spoke on the Senate floor about the letter she sent to Snapchat on Monday urging Snap CEO Evan Spiegel to take action to protect children from sexual predators and from exposure to explicit content while using the platform.


REMARKS AS PREPARED

Thank you, Madam President. 

In 2017, ICE agents arrested Francisco Javier Soledad on charges of producing child pornography using the popular social media app Snapchat. 

He assumed a variety of false identities, posing first as a teenage boy and later as an adult woman, and coerced at least six underage children into sending him sexually explicit pictures and videos. 

When one victim attempted to block Soledad’s account, he threatened to post the victim’s video on social media unless he received more videos. Imagine this happening to a frightened child.

It wasn’t an isolated incident. 

Matthew Murphy, of Massachusetts, was recently charged with sexual exploitation of children after he posed as a teenage girl to extort nude pictures from a middle-school-aged boy, again via Snapchat. 

Federal investigators found evidence that Murphy had used his fake account to victimize other children in the area.

Before I continue, I want to strip away the couched terminology commonly used to describe this horrific abuse: 

Pedophiles used a popular social media app to trick underage kids into creating and distributing homemade pornography. 

If we’re going to talk about these things, we shouldn’t pull our punches.

Snapchat by its very nature is a child predator’s dream. 

Its auto-delete feature lets users ensure that their pictures and videos erase themselves after only a few seconds. 

Its public location-sharing feature allows anyone—even underage children—to share their location in real time. 

If left in public mode, the “Snap Map” will reveal their location and their Snap video feed to complete strangers.

Even underage users who haven't fallen prey are still exposed to provocative and age-inappropriate material via the app's "Discover" feature: recommendations generated by Snapchat itself, free from parental control or monitoring.

If you guessed that some of these channels specialize in porn and suggestive content, you'd be right.

That's why this week I sent a letter to Snap's executives asking how they plan to fight this predatory behavior and whether they will give parents more control over the content their kids are exposed to.

To their credit, Snap’s executives have already responded to me, and it’s my hope they take these questions seriously. 

Because I absolutely will. 

At this point I want to make it clear that Snapchat is not the only offender. 

Last month my friend and colleague Senator Blumenthal joined me in a letter to YouTube asking why the video service’s recommendation mechanism continued to push content involving kids in suggestive or exploitative situations. 

By “suggestive or exploitative,” I mean “featuring partially clothed children, children in bathing suits, and children dressing and undressing themselves.”

YouTube's recommendation system works by promoting videos similar to the one the user is already watching, which means that, by design, one vile video can lead to another, until the user is buried in smut that shouldn't even exist.

The comment sections on these videos turned into predator chat rooms, where users shared time stamps marking the most explicit moments. 

YouTube did disable comments on videos involving children, but its algorithms continue to push exploitative content via the recommendation feature.

The point of describing these things is not to throw individual companies and their tech under the bus, but it is crucial that we understand that even at home or at school, kids are vulnerable.

Even benign tech that doesn’t necessarily expose children to pornography can pose risks. 

In 2015, the Electronic Frontier Foundation filed a complaint with the Federal Trade Commission against Google, alleging that the tech giant's "Google for Education" program was exploiting minors' personal information and potentially exposing it to third parties. 

The Chromebooks issued to students were loaded with Google Sync, allowing for the collection and storage of students’ browsing history and passwords.

Program administrators were given complete access to a "cloud" system that allowed them to alter settings, exposing student data, both educational and personal, including physical location data, to Google's development team and to third-party websites. 

One wrong click would expose that student’s “virtual you.” 

In Tuesday’s Judiciary Committee hearing, I asked the founder and CEO of Protect Young Eyes, Christopher McKenna, what steps he would take to protect kids from online predators. 

His answer was simple: give parents the option to control content access, and don’t hide the tools necessary to do so. 

I am not suggesting a takeover or a ban.

I am not suggesting we drop a regulatory anvil.

What I am suggesting is that we should not have to ask the makers of popular digital services to stop catering to child predators.

They can choose to recognize that predators lurk in every corner of society, and change the age ratings on their apps.

Choose to make parents aware of what a simple click or tap might unlock right before their child’s eyes.

Choose to stop this horrific cycle of dehumanization and exploitation before it starts.

I yield the floor.