The new bill is the latest in a wave of AI-related legislation.
A bipartisan group of senators introduced a new bill to make it easier to authenticate and detect artificial intelligence-generated content and protect journalists and artists from having their work gobbled up by AI models without their permission.
The Content Origin Protection and Integrity from Edited and Deepfaked Media Act (COPIED Act) would direct the National Institute of Standards and Technology (NIST) to create standards and guidelines that help prove the origin of content and detect synthetic content, such as through watermarking. It also directs the agency to create security measures to prevent tampering, and it requires AI tools for creative or journalistic content to let users attach provenance information to their work and prohibits that information from being removed. Under the bill, such content also could not be used to train AI models.
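For a concrete sense of what "content provenance information" plus tamper protection could look like in practice, here is a minimal sketch of a signed provenance manifest. The field names, the HMAC signing scheme, and the `attach_provenance`/`verify_provenance` helpers are all illustrative assumptions for this sketch, not the actual format NIST or the C2PA standard would specify:

```python
# Illustrative sketch only: a tamper-evident provenance manifest.
# Real provenance standards (e.g. C2PA) use asymmetric signatures
# and certificate chains; a shared HMAC key is used here purely
# to keep the example self-contained.
import hashlib
import hmac
import json

SIGNING_KEY = b"demo-key"  # assumption: in practice, a proper key pair

def attach_provenance(content: bytes, creator: str, tool: str) -> dict:
    """Bundle a content hash with origin info, signed so edits are detectable."""
    manifest = {
        "content_sha256": hashlib.sha256(content).hexdigest(),
        "creator": creator,
        "generator_tool": tool,
    }
    payload = json.dumps(manifest, sort_keys=True).encode()
    manifest["signature"] = hmac.new(SIGNING_KEY, payload, "sha256").hexdigest()
    return manifest

def verify_provenance(content: bytes, manifest: dict) -> bool:
    """Return True only if neither the content nor the manifest was altered."""
    claimed = dict(manifest)
    sig = claimed.pop("signature", "")
    payload = json.dumps(claimed, sort_keys=True).encode()
    expected = hmac.new(SIGNING_KEY, payload, "sha256").hexdigest()
    return (hmac.compare_digest(sig, expected)
            and claimed["content_sha256"] == hashlib.sha256(content).hexdigest())
```

The point of the sketch: any change to the content or to the attached origin labels breaks verification, which is the property the bill's anti-tampering provision is after.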
Content owners, including broadcasters, artists, and newspapers, could sue companies they believe used their materials without permission or tampered with authentication markers. State attorneys general and the Federal Trade Commission could also enforce the bill, which its backers say prohibits anyone from “removing, disabling, or tampering with content provenance information” outside of an exception for some security research purposes.
(A copy of the bill is in the article; here is the important part, imo:
Prohibits the use of “covered content” (digital representations of copyrighted works) with content provenance to either train an AI- /algorithm-based system or create synthetic content without the express, informed consent and adherence to the terms of use of such content, including compensation)
This is essentially regulatory capture. The article is very lax on calling it what it is.
A few things to consider:
Laws can't be applied retroactively, so this would essentially close the door behind OpenAI, Google, and Microsoft. OpenAI with Sora, in conjunction with the big Hollywood companies, will be the only ones able to do proper video generation.
Individuals will not be getting paid; data brokers will.
They can easily pay pennies to a third-world artist to build them a dataset copying a style. Styles are not copyrightable.
The open source scene is completely dead in the water, and so is fine-tuning for individuals.
Edit: This isn't entirely true; there is more leeway for non-commercial models, see comments below.
AI isn't going away, all this does is force us and the economy into a subscription model.
Companies like Disney, Getty and Adobe reap everything.
In a perfect world, this bill would aim to make all models copyleft instead, but sadly, no one is lobbying for that in Washington, and money talks.
This is a brutally dystopian law. Forget the AI angle and turn on your brain.
Any information will get a label saying who owns it and what can be done with it. Tampering with these labels becomes a crime. This is the infrastructure for the complete control of the flow of all information.
They did it. They're passing the worst version of the AI law.
That's the end for open source AI! If this passes, all AI will be closed source, and only from giant tech companies. I'm sure they will find a way to steal your stuff "legally".
I don't like AI, but I hate intellectual property, and the people who want to restrict AI don't seem to understand the implications that has. I am OK with copying, as I think copyright is a load of bollocks. But they aren't even reproducing the content verbatim, are they? They're 'taking inspiration', if you will, transforming it into something completely different. Seems like fair use to me. It's just that people hate AI, and hate the companies behind it (and don't get me wrong, rightfully so), but that shouldn't make us all stop thinking critically about intellectual property laws.
If this passes, this would have the perverse effect of making China (and maybe to a lesser extent the Middle East) the leading suppliers of open source / open weight AI models...
If you put something on the Internet, you are giving up ownership of it. This is reality, and companies taking advantage of this for AI have already proven it to be true.
You are not going to be able to put the cat back in the bag. The whole concept of ownership over art, ideas, and our very culture was always ridiculous.
It is past time to do away with the joke of a legal framework we call IP law. It is merely a tool for monied interests to extract more obscene profit from our culture at this point.
There is only one way forward and that is sweeping privacy protections. No more data collection, no more targeted advertising, no more dark patterns. The problem is corporations are not going to let that happen without a fight.
I posted this in a thread, but I'm gonna make it a parent comment for those who support this bill.
Consider YouTube Poop; I'm serious. Every clip in them is sourced from preexisting audio and video, then mixed or distorted in a comedic format. You could make an AI that produces YouTube Poops using those same clips and other "poops" as training data. What it outputs might be of lower quality (less funny), but in a technical sense it would be made in an identical fashion. And, to the chagrin of Disney, Nintendo, and Viacom, these are considered legally distinct works, because I don't watch Frying Nemo in place of Finding Nemo. So why would it be any different when an AI makes it?
Ladies and gentlemen of the jury, before you stands 8-year-old Billy Smith. He stands accused of training on copyrighted material. We actually have live video of him looking at and reading books from the library. He has trained on the contents of over 100 books this year.
We ask you to enforce the maximum penalty and send his parents to prison.
Doesn't this infringe on fair use? E.g., if I'm making a parody of something and I mimic the original, even by using a portion of the original's text word for word.
Everyone is so obsessed with having a monopoly over everything, it's not what is best for 8 billion people.
Introducing Chat-Stupid! It's just like Chat-GPT, but it wishes for any conversation with humans so it can legally learn... don't disclose company secrets or it will legally learn those too.
Obviously you can just use other software, but PS is the main choice for image editing. What they need to do is put legislation in place, and that will make the biggest players implement this form of DRM.
Senate Majority Leader Chuck Schumer (D-NY) led an effort to create an AI roadmap for the chamber, but made clear that new laws would be worked out in individual committees.
“The capacity of AI to produce stunningly accurate digital representations of performers poses a real and present threat to the economic and reputational well-being and self-determination of our members,” SAG-AFTRA national executive director and chief negotiator Duncan Crabtree-Ireland said in a statement.
“We need a fully transparent and accountable supply chain for generative Artificial Intelligence and the content it creates in order to protect everyone’s basic right to control the use of their face, voice, and persona.”
This is actually pretty cool for small artists, but how would it handle things like iFunny and such adding watermarks to shit they don't own in the first place?
Sure would be fun to expand things to include a section that doesn't let normal people make art of copyrighted material, or for this to become an excuse to mess with fair use.