University of Chicago researchers release Nightshade to the public, a tool intended to "poison" images in order to ruin generative models trained on them.
nightshade.cs.uchicago.edu
4 comments
Reminder that this is made by Ben Zhao, the University of Chicago professor who stole open source code for his last data poisoning scheme.
Thank you for the background.
If this technology is so great, why doesn't the site show any before/after examples, let alone demonstrate that it does what he claims?
Because he can't.