
How effective will the Senate-passed bill, S. 4569, the Take It Down Act, which would criminalize the publication of non-consensual intimate imagery (NCII), be?

14.06.2025 08:20


Probably close to zero. For two reasons:

It will likely be held unconstitutional.

And it just doesn't… do much in terms of the criminalization. For most people who run into this problem, federal law enforcement is going to do nothing; the feds just don't have the resources.


To be clear, the bill is not about "non-consensual intimate imagery". At least, not what you might think that phrase means. It is a phrase defined in the bill, and I'll come back to it in a moment.

The TAKE IT DOWN Act is really about what its letters say:

"Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act"


The phrase "non-consensual intimate imagery" is targeted at using AI to create and publish lewd imagery of a real person. (Though there is a section in the act that covers intimate images that aren't deepfakes at all.)

But the part about deepfakes probably isn't going to fly. Not because the idea of deepfakes isn't troubling, but because of the way the law is actually written.


For example, it would become a criminal act to publish a photo like this, unredacted. You could publish the photo in a magazine. You could put it on a billboard. But you couldn't use an AI tool to generate the image; that would be a federal crime.

Eye-rolling, right?

So in effect, there's going to be a First Amendment problem with the scope of what is covered. The law has the potential to criminalize otherwise protected speech purely because it appears on a computer network, and that's going to be a big problem.


Don't get me wrong. Given what most of the bill covers, large sections of it will still be enforceable. But they're also a little vague, and that's going to cause some issues. People who create these kinds of images to harass, embarrass, or humiliate someone? I don't feel bad for them in the slightest, and this law would probably punish them successfully.

However, I'm a bit more worried about how the vagueness could be used offensively, to punish conduct that isn't really meant to be covered by this bill. I can think of a few strategic ways it could be used to threaten people over relatively benign conduct.

As for simply publishing intimate photos: yes, that becomes a federal crime, under certain conditions. There's also a takedown procedure, and I don't really see that as an issue. States have already made publishing intimate images a crime and a tort, but because computer networks fluidly cross state boundaries, enforcement gets complicated and messy, especially for what many victims really want, which is getting the offending images removed. A federal takedown procedure is probably a good idea to facilitate that.


What could the feds (Cruz is the sole sponsor, I see) have done instead? If they had focused more on the take-down notice provisions and on the rights of privacy and publicity, the bill would probably be stronger as a whole.