“It’s time to give victims their day in court and the tools they need to fight back,” says Sen. Dick Durbin as the DEFIANCE Act heads to the House.
The Senate unanimously passed a bipartisan bill to provide recourse to victims of porn deepfakes: sexually explicit, non-consensual images created with artificial intelligence.
The legislation, called the Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act, passed in Congress's upper chamber on Tuesday. It has been led by Sens. Dick Durbin (D-Ill.) and Lindsey Graham (R-S.C.), as well as Rep. Alexandria Ocasio-Cortez (D-N.Y.) in the House.
The legislation would amend the Violence Against Women Act (VAWA) to allow people to sue those who produce, distribute, or receive the deepfake pornography, if they “knew or recklessly disregarded” the fact that the victim did not consent to those images.
Things have already been moving toward "nobody" models. I think this will eventually have the consequence of nobodies becoming the new somebodies, as it will result in a lot of very well-developed nobodies and push the community toward furthering their development instead of the deepfake stuff. You'll eventually be watching Hollywood-quality feature films full of nobodies. There is some malicious potential with deepfakes, but the vast majority are simply people learning the tools. This will alter that learning target, and the memes.
As always, I read the bill expecting to be deeply disappointed, but was pleasantly surprised with this one. It's not going to solve the issue, but I don't really know of anything they can do to solve it. My guess is this will mostly be effective at going after large-scale abuses (such as websites dedicated to deepfake porn, or general-purpose deepfake sites with no safeguards in place).
My first impressions on specific parts of the bill:
The bill is written as an amendment to the 2022 appropriations act. This isn't that strange, but I haven't actually cross-referenced that, so I might be misunderstanding some subtlety.
The definition of digital forgery is broad in terms of the means. Basically anything done on a computer counts, not just AI. In contrast, it is narrow in the result, requiring that:
when viewed as a whole by a reasonable person, is indistinguishable from an authentic visual depiction of the individual.
There is a lot of objectionable material that is not covered by this. Personally, I would like to see a broader test, but I can't think of any that I would be comfortable with.
The depiction also needs to be relevant to interstate or foreign commerce. Their hands are tied by the Constitution on this one. Unless Wickard v. Filburn is overturned, though, me producing a deepfake for personal use reduces my interstate porn consumption, so it qualifies. By explicitly incorporating the constitutional test, the law will survive any change to what qualifies as interstate commerce.
The mens rea required is a "person who knows or recklessly disregards that the identifiable individual has not consented to such disclosure." No complaints about this standard.
This is grounds for civil suits only; nothing criminal. Makes sense, as criminal law would normally be a state issue and, as mentioned earlier, this seems mostly targeted at large-scale operations, which can be deterred with enough civil litigation.
Max damages are:
$150k
Unless it can be linked to an actual or attempted sexual assault, stalking, or harassment, in which case it increases to $250k
Or you can sue for actual damages (including any profits made as a result of the deepfake)
Plaintiffs can use a pseudonym, and all personally identifiable information is to be redacted or filed under seal. Intimate images turned over in discovery remain in the custody of the court.
10-year statute of limitations, starting when the plaintiff could reasonably have learned about the images, or when they turn 18.
States remain free to create their own laws that are "at least as protective of the rights of a victim".
My guess is the "at least as protective" portion is there because a state suit would prevent a federal suit under this law, as there is an explicit bar on duplicative recovery, but I have not dug into the referenced law to see what that covers.
How close does the deepfake need to be to the original? What we saw with DALL-E 2 was that each person whose face was restricted made a hole in the latent space of all faces. After enough celebrities' faces were restricted, there were so many latent-space holes that the algorithm couldn't make faces at all, since every producible face was a certain "distance" away from a restriction.
Sure, you can make LoRA training on non-consenting people illegal and also make particular prompts illegal, but there is enough raw data and vague prompts to mold a generation into something that looks like it was done the illegal way without breaking either of those two restrictions.
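The "hole in latent space" point can be made concrete with a toy simulation. This is a minimal, hypothetical sketch, not how DALL-E 2 or any real generator actually works: it assumes a filter that rejects a generated face whenever its embedding is within a similarity threshold of any restricted identity, and shows that as the restricted list grows, almost every candidate face ends up blocked. The embeddings, dimensions, and thresholds are made up for illustration.

```python
# Hypothetical sketch: a generator that refuses any output whose face
# embedding is "too close" to a restricted identity. Each restricted
# person removes a region of face space; with enough restrictions,
# nearly every producible face falls inside some forbidden region.
# All values here are synthetic and purely illustrative.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Standard cosine similarity between two embedding vectors.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def is_blocked(candidate: np.ndarray, restricted: list[np.ndarray],
               threshold: float) -> bool:
    # Block the candidate face if it is too similar to ANY restricted face.
    return any(cosine_similarity(candidate, r) >= threshold for r in restricted)

rng = np.random.default_rng(0)
dim = 128  # toy embedding dimension

for n_restricted in (10, 100, 1000):
    restricted = [rng.normal(size=dim) for _ in range(n_restricted)]
    candidates = [rng.normal(size=dim) for _ in range(200)]
    blocked = sum(is_blocked(c, restricted, threshold=0.25) for c in candidates)
    print(f"{n_restricted:>5} restricted faces -> {blocked}/200 candidates blocked")
```

With random embeddings, the fraction of blocked candidates climbs sharply as the restricted list grows, which is the same dynamic described above: enough individual restrictions and the filter effectively forbids faces in general.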
OK, first of all, politicians need to stop it with these acronyms for every law they want to pass. It's getting ridiculous. Just give the damned law a regular-ass name. It doesn't have to be all special and catchy-sounding, damn it.
Second, I’m really surprised to hear of anything passing the senate unanimously, other than a bill expressing the love of silly acronyms. And weak campaign finance laws.
Anyway, I’m glad that at least something is being done to address this, but I just know someone in the House is gonna fuck this up.
To improve rights to relief for individuals affected by non-consensual activities involving intimate digital forgeries, and for other purposes.
Congress finds that:
(1) Digital forgeries, often called deepfakes, are synthetic images and videos that look realistic. The technology to create digital forgeries is now ubiquitous and easy to use. Hundreds of apps are available that can quickly generate digital forgeries without the need for any technical expertise.
(2) Digital forgeries can be wholly fictitious but can also manipulate images of real people to depict sexually intimate conduct that did not occur. For example, some digital forgeries will paste the face of an individual onto the body of a real or fictitious individual who is nude or who is engaging in sexual activity. Another example is a photograph of an individual that is manipulated to digitally remove the clothing of the individual so that the person appears to be nude.
“(3) DIGITAL FORGERY.—
“(A) IN GENERAL.—The term ‘digital forgery’ means any intimate visual depiction of an identifiable individual created through the use of software, machine learning, artificial intelligence, or any other computer-generated or technological means, including by adapting, modifying, manipulating, or altering an authentic visual depiction, that, when viewed as a whole by a reasonable person, is indistinguishable from an authentic visual depiction of the individual.
It seems like the bill is being pitched as protecting women who have fake nudes passed around their school, but the text of the bill seems more aimed at the Taylor Swift case.
1. The bill only applies where there is an "intent to distribute."
2. The bill talks about damages being calculated based on the profit of the defendant.
The bill also states that labeling the image as AI-generated, or relying on the context of publication, does not let you avoid running afoul of this law. That seems at odds with the First Amendment.
Question from an outsider:
Do all bills in the states have to have a fancy acronym?
It looks like the Senate is the first step, is that right? Next is the House? It's the opposite where I am.
Is this just an "AI porn bill" because that's the most common way of doing it these days? I would expect that the product is what's being sanctioned, not the method.