© 2025 KALW 91.7 FM Bay Area
President Trump to sign bill addressing rise of revenge porn and deep fakes

STEVE INSKEEP, HOST:

In this country, President Trump holds a bill signing today. He is scheduled to sign the Take It Down Act into law. First Lady Melania Trump supported the bill aimed at revenge porn and deepfake images on the internet. This legislation includes harsher punishments for people who post images, and it forces online platforms to take down an image within 48 hours. In a time of bitter partisanship, this bill passed the House 409-2 and passed the Senate by unanimous consent, which means they didn't even formally vote. They just waved it through. So what does it do, and why is Nick Garcia concerned about it? He is the senior general counsel at the Washington, D.C. nonprofit Public Knowledge. It promotes freedom of expression, and he's in our studios to express himself. Good morning, sir.

NICK GARCIA: Hi. Thanks for having me on.

INSKEEP: Is it a little awkward to be opposed to legislation that almost everybody voted for?

GARCIA: Well, I wouldn't say it's awkward because this is an issue that we support strongly addressing. The distribution of nonconsensual intimate imagery is a real serious problem. And Public Knowledge and myself, we've always been strong advocates for addressing this problem.

INSKEEP: OK.

GARCIA: There's great legislation that does that. Unfortunately, this bill, it just doesn't strike the right balance between protecting people and also addressing that problem.

INSKEEP: OK. What we've been told it does is punish people who do this - revenge porn, deepfakes - and also makes internet providers take stuff down. What is wrong with that approach to this?

GARCIA: So it's a good approach, or it could be, at least. The problem with it is two things. One is that, as the bill is currently drafted, there's an interpretation that creates a problem for people's privacy rights when they use private messaging services and other services that are encrypted, because the obligations the bill creates may impose a monitoring obligation to break through that encryption in order to look at that content.

INSKEEP: Are you saying that this may apply to someone's private messages and not just to what they post on X or whatever?

GARCIA: Absolutely. So that's one of the big potential problems with the bill. The second big problem is that the notice and takedown system that it creates - which is something that we could be very for - doesn't have any of the guardrails that we know would be really important in order to protect free speech. So the problem there is that we know that other notice and takedown systems, like the Digital Millennium Copyright Act, they've been abused in the past to take down content inappropriately. Platforms, because they have to take things down in 48 hours under the Take It Down Act, they're going to have to be very aggressive about how they respond to any takedown notices, and that incentivizes them to take more things down, to not scrutinize things. And that, as a result, creates an overcensorship problem. And we know that this exists, and there are safeguards and guardrails that could be put on this piece of legislation, which civil society groups have been talking about, but none of that made it into the final bill.

INSKEEP: I feel that you're making some of the same objections that conservatives were making a few years ago when there were fact-checking mechanisms on social media and so forth. They said, basically, these corporations are preemptively taking down my speech, which should be perfectly fine or should be allowed. Do you feel you're making a very similar argument?

GARCIA: It's similar in some ways in that what we're always worried about when we're talking about content moderation online is about striking the balance between making sure that people have the opportunity to be heard. And in some sense, the problem of nonconsensual intimate imagery is that it drives people out of spaces, right?

INSKEEP: Sure.

GARCIA: So...

INSKEEP: So you don't want revenge porn. We should be clear on that. Yeah.

GARCIA: Absolutely. We need to do something to address this problem. The other problem, though, is that people like President Trump have said that they would love to use this bill to take down content that applies to them, even though it's very clearly not revenge porn or deepfake pornography. It's what they want to use the bill for, and the lack of safeguards that would allow them to do that, that's the problem.

INSKEEP: So someone could broaden the definition of what should be taken down and abuse this, is what you're saying?

GARCIA: They don't even need to broaden the definition. It's just that you do a takedown notice, and the platforms will take it down.

INSKEEP: OK. Nick Garcia, senior general counsel at Public Knowledge. Thanks so much.

GARCIA: Thank you.

Transcript provided by NPR, Copyright NPR.

NPR transcripts are created on a rush deadline by an NPR contractor. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.

Steve Inskeep
Steve Inskeep is a host of NPR's Morning Edition, as well as NPR's morning news podcast Up First.