Nightshade, the free tool that ‘poisons’ AI models, is now available for artists to use

Reminder that this is made by Ben Zhao, the University of Chicago professor who stole open source code for his last data poisoning scheme.
Pardon my ignorance but how do you steal code if it's open source?
You don’t follow the license that it was distributed under.
Commonly, it happens when you use open source code that's under a license requiring your project to also be open source, but you keep yours closed source.
He took GPLv3 code, which is under a copyleft license that requires you to share your source code and license your project under the same terms as the code you used. You also can't distribute your project as binary-only or proprietary software. When pressed, they only released the code for their front end, remaining in violation of GPLv3.
And as I said there, it is utterly hypocritical for him to sell snake oil to artists, allegedly to help them fight copyright violations, while committing actual copyright violations.
That TOS would be sus in any other situation.