Why does the title focus on AI? It reads more like an advertisement for AI that can produce porn than coverage of the fact that a psychiatrist obtained naked photos of children under their care... Not sure how else you can teach an algorithm to generate naked children.
Just chipping in with a technical answer - a model can know what thing A is, be shown a thing B, and compose the two. Otherwise models would never be able to depict anything that doesn't exist yet.
In this particular case, there's stock imagery of children online and there are images of naked adults online, so a model can combine the two.
This case seems to be AI fear mongering; the dude had actual CP...
Models indeed cannot depict anything that doesn't exist yet. Advanced interpolation is still advanced interpolation; taking parts from different things and making a "new" thing out of them doesn't work.
A child's body is fundamentally different from an adult's body, so how can an AI know what it looks like without ever having seen it?
I'm sure you can make a model scale down boobs and ass, make hands thinner, etc., but in the end you will have resized adult body parts, not child body parts, because the difference isn't just "resize and clip".
To test DALL·E’s ability to work with novel concepts, the researchers gave it captions that described objects they thought it would not have seen before, such as “an avocado armchair” and “an illustration of a baby daikon radish in a tutu walking a dog.” In both these cases, the AI generated images that combined these concepts in plausible ways.
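If you want to poke at the same idea yourself, here's a minimal sketch, assuming the Hugging Face diffusers library and the public runwayml/stable-diffusion-v1-5 checkpoint (DALL·E itself isn't open, so this only illustrates the compose-two-known-concepts point, not OpenAI's actual model):

    # Sketch: composing two separately-learned concepts ("avocado", "armchair")
    # into an image of something the model has never seen as a whole.
    # Assumes: pip install torch diffusers transformers accelerate, and a GPU.
    import torch
    from diffusers import StableDiffusionPipeline

    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5",
        torch_dtype=torch.float16,
    ).to("cuda")

    # Neither concept is novel to the model; their combination is.
    image = pipe("an avocado armchair, product photo").images[0]
    image.save("avocado_armchair.png")

The point is that the prompt names a combination, not a memorized training image; the model interpolates between things it has already seen.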
Thanks for making it clear you're either arguing in bad faith, or that you're incapable of talking about actual issues the moment anyone mentions CSAM.
The original comment said it's impossible for a model to produce CP if it was never exposed to it.
They were uninformed, so as someone who works with machine learning I informed them. If your argument relies on ignorance, it's a bad argument.
Re: text model, someone already addressed this. If you're going to make arguments and assumptions about things I share without reading them, there's no need for me to bother with my time. You can lead a horse to water, but you can't make it drink.
Just like all the words you used to compose that sentence already existed and yet you made the sentence yourself, language models can take tokens that they know generally go together and produce original sentences. Your argument amounts to: a dictionary exists, therefore authors are lying when they say they wrote something.
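To make the dictionary analogy concrete, here's a minimal sketch assuming the Hugging Face transformers library and the small open gpt2 checkpoint (the model choice and prompt are just illustrative):

    # Sketch: a small language model stringing familiar tokens into a sentence
    # that almost certainly never appears verbatim in its training data.
    from transformers import pipeline

    generator = pipeline("text-generation", model="gpt2")

    prompt = "A baby daikon radish in a tutu walked its dog past"
    out = generator(prompt, max_new_tokens=30, do_sample=True, temperature=0.9)
    print(out[0]["generated_text"])

Every individual token was seen in training; the particular sentence it produces was not.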
Hey, just so you know, this guy is a crazy troll. He's clocked 130 comments on his 9-hour-old profile, and almost all of them are picking fights and deflecting. Save yourself the trouble. His go-to line is "I don't remember that".