“It’s time to give victims their day in court and the tools they need to fight back,” says Sen. Dick Durbin as the DEFIANCE Act heads to the House.
THE SENATE UNANIMOUSLY passed a bipartisan bill to provide recourse to victims of porn deepfakes, that is, sexually explicit, non-consensual images created with artificial intelligence.
The legislation, called the Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act, passed in Congress' upper chamber on Tuesday. The bill has been led by Sens. Dick Durbin (D-Ill.) and Lindsey Graham (R-S.C.), as well as Rep. Alexandria Ocasio-Cortez (D-N.Y.) in the House.
The legislation would amend the Violence Against Women Act (VAWA) to allow people to sue those who produce, distribute, or receive deepfake pornography if they "knew or recklessly disregarded" the fact that the victim did not consent to those images.
How close does the deepfake need to be to the original? What we saw with DALLE2 was that each person whose face was restricted made a hole in the latent space of all faces. After enough celebrities' faces were restricted, there were so many latent-space holes that the algorithm couldn't make faces at all, since every producible face fell within a certain "distance" of some restriction.
Sure, you can make LoRA training on non-consenting people illegal and also make particular prompts illegal, but there is enough raw data, and prompts can be vague enough, to mold a generation into something that looks like it was made the illegal way without breaking either of those two restrictions.
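The "holes in latent space" claim above can be sketched as a toy coverage problem. This is a 1-D stand-in, purely illustrative: real generative-model latent spaces are high-dimensional, the radius and point counts here are made up, and DALLE2's actual filtering is not public.

```python
import numpy as np

# Toy model of the commenter's argument: each restricted face punches
# an exclusion "hole" of a fixed radius into a shared latent space.
rng = np.random.default_rng(0)

latent_space = np.linspace(0.0, 1.0, 1001)  # stand-in for all producible faces
radius = 0.06                               # hypothetical exclusion distance

def legal_fraction(n_restricted):
    """Fraction of the space farther than `radius` from every restricted point."""
    restricted = rng.uniform(0.0, 1.0, size=n_restricted)
    dists = np.abs(latent_space[:, None] - restricted[None, :])
    return float(np.mean(dists.min(axis=1) > radius))

for n in (1, 5, 10, 50):
    print(n, round(legal_fraction(n), 3))
```

Even in this crude sketch, a few dozen restricted points cover most of the interval, which is the commenter's point: stack up enough per-person exclusions and almost nothing producible remains legal.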
Doubt it. A reasonable person will generally be able to tell if you're obviously taking the piss with the law. Feel free to try it and let us know how you get on, though.
But that is not what the bill says. The reasonable person is not evaluating my intent; they're evaluating whether the video is "indistinguishable from an authentic visual depiction of the individual", and in this case it would be very distinguishable, since the individual does not have said face tattoo.
Defamation is not parody. Fake porn of someone is absolutely defamation.
I can't legally make a "parody" of you in which you're a pedophile.
Edit: Since there seems to be some confusion, I am not calling them a pedophile; I'm saying I can't make some sort of fake of them as a pedophile and call it a parody.
I'm literally doing the opposite of calling you a pedophile. I'm saying it would be illegal to call you a pedophile and claim it's a parody. That's not an excuse for defamation.
And I said that because I am assuming you are not a pedophile.
The word "authentic" in "an authentic visual depiction" might refer to provenance.
If someone authorises me to make a pornographic depiction of them, surely that's not illegal. It's authentic.
So it's not a question of whether the depiction appears to be AI generated, it's really about whether a reasonable person would conclude that the image is a depiction of a specific person.
That means tattoos, extra limbs, third boobs, et cetera won't sidestep this law.
There are billions of people. Find the right one, and a "reasonable person" could not tell the difference.
Imagine a law that said you cannot give your baby a name that starts with the same letter as someone else's name. After 26 names, all future names would be illegal. The law would essentially make naming babies illegal.
The "alphabet" in this case is the set of distinct visual depictions of all people. As long as the enumeration "reasonable people" use is coarse enough, it essentially makes any generation illegal. Conversely, if "reasonable people" distinguish faces finely enough, it makes avoiding prosecution trivial by adding minor conflicting details.
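The naming analogy is easy to make concrete. A toy sketch, where the specific names and the one-restriction-per-initial rule are just the analogy's assumptions:

```python
import string

# Toy model of the naming analogy: each existing name "restricts" its
# first letter, and a new name is illegal if its initial collides.
restricted_initials = set()

def can_name(name):
    return name[0].upper() not in restricted_initials

for existing in ["Alice", "Bob", "Carol"]:   # hypothetical existing names
    restricted_initials.add(existing[0].upper())

print(can_name("Dave"))   # True: 'D' is not yet taken
print(can_name("Anna"))   # False: collides with "Alice"

# Once one name per letter exists, every future name is illegal:
restricted_initials = set(string.ascii_uppercase)
print(any(can_name(c + "name") for c in string.ascii_uppercase))  # False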
"The right one" according to whom? There are two sides to a court case. The opposition can find all kinds of ways to show that a person is not "reasonable" if they can't recognize a very good simulation of someone's face, just as they can show that someone who is shortsighted didn't see the car crash the way they said they did.
And what would be lost? I might be missing something, but what is the benefit of being able to make fake people's faces that outweighs the damage it can do to people's lives and the chaos wreaked on society by deepfakes etc.?
Does making it harder for animators and illustrators to make a living outweigh the reality that every woman on earth now has to fear someone making revenge porn with her likeness?
Maybe if society were more reasonable and responsible in its attitudes toward sexuality, then deepfakes and sexual crimes would naturally not be significant issues.
If everyone had a box that gave them the sexual gratification they needed, then they could go about the rest of their day without injecting sexual wants into normal activities and relationships.
Adverts and marketing might have to use facts instead of sex to sell products.