Why Trump can’t be trusted with Congress’ new anti-deepfake bill

On today’s episode of Decoder, I’m talking to Verge policy editor Adi Robertson about the Take It Down Act, which is part of a long line of bills that would make it illegal to distribute non-consensual intimate imagery, or NCII. That’s a broad term that encompasses what people used to call revenge porn, but which now includes things like deepfaked nudes.
The bill was sponsored by Sen. Amy Klobuchar (D-MN) and Sen. Ted Cruz (R-TX), and it just passed the Senate. It would create criminal penalties for people who share NCII, including AI-generated imagery, and also force platforms to take it down within 48 hours of a report or face financial penalties.
NCII is a real and devastating problem on the internet — it ruins a lot of people’s lives, and AI is just making it worse. There are a lot of good reasons you’d want to pass a bill like this, but Adi just wrote a long piece arguing that giving the Trump administration new powers over speech in this way would be a mistake. Specifically, she wrote that it would be handing Trump a “weapon” with which to attack speech and speech platforms he doesn’t like.
At a high level, her argument is that Trump is much more likely to wield a law like this against his enemies — which means pretty much anyone he doesn’t personally like or agree with — and much more likely to shield the people and companies he considers friends from the consequences. And we know who his friends are: it’s Elon Musk, who now works as part of the Trump administration while at the same time running X, which is full of NCII.
Now, Adi and I have been covering online speech and how it’s regulated for about as long as The Verge has existed. Over the years, we have gone back and forth on where the lines should be drawn and who should draw them as many times as two people can. But that conversation has always presupposed a stable, rational system of policymaking based on the equal application of law.
Here in 2025, Trump has made it clear that he can and will selectively enforce the law, and that changes everything. Once you break the equal application of law, you break a lot of things — and there’s just no evidence Trump is interested in the equal application of law. You’ll hear us really wrestle with this here. The problem doesn’t go away just because the solutions are getting worse, or because the people entrusted with enforcing the law are getting more chaotic.
So in this episode, Adi and I really get into the details of the Take It Down Act, how it might be weaponized, and why we ultimately can’t trust anything the Trump administration says about protecting the victims of this abuse.
- The Take It Down Act isn’t a law, it’s a weapon | The Verge
- A bill combatting the spread of AI deepfakes just passed the Senate | The Verge
- Welcome to the era of gangster tech regulation | The Verge
- FTC workers are getting terminated | The Verge
- Bluesky deletes AI protest video of Trump sucking Musk’s toes | 404 Media
- Trump supports Take It Down Act so he can silence critics | EFF
- Scarlett Johansson calls for deepfake ban after AI video goes viral | The Verge
- The FCC is a weapon in Trump’s war on free speech | Decoder
- Trolls have flooded X with graphic Taylor Swift AI fakes | The Verge
- Teen girls confront an epidemic of deepfake nudes in schools | NYT
Questions or comments about this episode? Hit us up at decoder@theverge.com. We really do read every email!