
Sen. Ted Cruz speaks during a briefing at the U.S. Capitol on May 9, 2024, in Washington, D.C. (Chip Somodevilla/Getty Images/TNS)

DALLAS — Sen. Ted Cruz hosted a field hearing Wednesday in Dallas to discuss deepfake revenge porn and his new legislation tackling the issue.

The bill, unveiled last week, would make it a federal crime to publish or threaten to publish nonconsensual intimate images, whether authentic or created artificially with AI or other computer tools. The TAKE IT DOWN Act would also require websites to remove the content within 48 hours of receiving a request from a victim. Cruz on Wednesday compared the requirement to existing copyright law and said enforcement through the Federal Trade Commission would work similarly.

Aledo High School student Elliston Berry and her mother, Anna McAdams, testified at the event, held at the University of North Texas at Dallas, about Berry's experience last fall. A male classmate took images of Berry and her friends from Instagram, used an artificial intelligence program to make them appear to be nude photos, and distributed them widely on Snapchat.

“I was 14 years old when I was violated all over social media, and I was just 14 years old when I feared my future was ruined,” Berry said. “My goal is to prevent any other student from undergoing this issue … and hopefully turn this horrible situation into something good.”

Advocates said the mandatory removal of content is a unique and essential aspect of this bill, because they fear tech companies won’t do it on their own. They cited examples of people making hundreds of requests and pointed to Berry’s photos being removed by Snapchat after Cruz’s office reached out to the company.

Snapchat did not comment on Berry's case but pointed to policies prohibiting sexual exploitation and deepfakes on its platform.

“It’s imperative that tech plays a role here, and if they won’t do it voluntarily, they should be required to do so,” said Stefan Turkheimer, vice president of public policy for the nonprofit Rape, Abuse & Incest National Network. “This is a problem that is not just facilitated by them, but created by them and the proliferation of these images is not possible without them.”

Andrea Powell, director of The Reclaim Coalition, which works to end online image-based sexual violence, said one of the top needs she sees among the victims she works with is having the images taken down.

“When we think about traditional offline sexual assault, which is also horrific, there is a post-traumatic stress factor,” she said. “But you can’t be ‘post’ if you’re always worried about the future of your violence being exposed online.”

©2024 The Dallas Morning News.

Distributed by Tribune Content Agency, LLC.
