Sex offender banned from using AI tools in landmark UK case

A sex offender convicted of making more than 1,000 indecent images of children has been banned from using any “AI creating tools” for the next five years in the first known case of its kind.
Anthony Dover, 48, was ordered by a UK court “not to use, visit or access” artificial intelligence generation tools without the prior permission of police as a condition of a sexual harm prevention order imposed in February.
The ban prohibits him from using tools such as text-to-image generators, which can make lifelike pictures based on a written command, and “nudifying” websites used to make explicit “deepfakes”.
Dover, who was given a community order and £200 fine, has also been explicitly ordered not to use Stable Diffusion software, which has reportedly been exploited by paedophiles to create hyper-realistic child sexual abuse material, according to records from a sentencing hearing at Poole magistrates court.
As a UK citizen, I'm ashamed of my government.
I am firmly against child abusers, but AI images don't harm anyone and are a safe and harmless way for pedophiles to fulfil their urges, which they cannot control.
Where does the training data come from to create indecent images of children?
It doesn't need CSAM data for training; it just needs to know what a boob looks like and what a child looks like. I run some SDXL-based models at home, and I've observed that this is harder to avoid than you'd think. There are keywords in porn that blur the lines across datasets ("teen", "petite", "young", "small", etc.). The word "girl" in particular, I've found, gives you a small chance of inadvertently creating the undesirable if you add it to basically any porn prompt. You have to be really careful and use words like "woman", "adult", etc. instead to convince your image model not to make things that look like children. If you've ever wondered why internet-based porn generators run on super heavy guardrails, this is why.
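To make the "super heavy guardrails" point concrete, here is a minimal, hypothetical sketch of the kind of keyword-based prompt filter a hosted generator might run before a prompt ever reaches the image model. The term lists, function name, and policy are illustrative assumptions on my part, not any real provider's implementation:

    # Hypothetical sketch of a keyword-level prompt guardrail, not any real
    # service's filter. It rejects prompts containing age-ambiguous terms and
    # requires the subject to be stated explicitly as an adult.
    import re

    # Terms the comment above identifies as blurring the lines across datasets.
    BLOCKED_TERMS = {"teen", "petite", "young", "small", "girl", "boy"}
    # Terms that make the intended subject unambiguous.
    REQUIRED_ADULT_TERMS = {"woman", "man", "adult"}

    def check_prompt(prompt: str) -> tuple[bool, str]:
        """Return (allowed, reason) for a candidate prompt."""
        words = set(re.findall(r"[a-z]+", prompt.lower()))
        hits = words & BLOCKED_TERMS
        if hits:
            return False, f"blocked terms present: {sorted(hits)}"
        if not words & REQUIRED_ADULT_TERMS:
            return False, "prompt does not explicitly specify an adult subject"
        return True, "ok"

    if __name__ == "__main__":
        print(check_prompt("portrait of a young girl on a beach"))
        # -> (False, "blocked terms present: ['girl', 'young']")
        print(check_prompt("portrait of an adult woman on a beach"))
        # -> (True, "ok")

Real services presumably combine this sort of crude word-level gate with classifier-based checks on the generated images themselves, but even the word-level gate shows why prompts have to be phrased so carefully.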
The whole point of diffusion models is that you can generate new concepts by combining what was learned from the training data. A model trained on any NSFW images can combine those concepts with any of its non-NSFW concepts. Of course, that's not to say there hasn't been CSAM in training data, because there objectively has been in the past, but there doesn't need to be any to generate it.
AI is able to fill in the last field of a table like "old / young" vs "clothed / naked" when given three of the four fields.
Please reiterate your statement, but using the "goose chase meme" format instead.
From a few months ago
https://cyber.fsi.stanford.edu/news/investigation-finds-ai-image-generation-models-trained-child-abuse
I don't know what the right answer is, but we provide substitutes for drug addicts to help them overcome their addictions. Methadone and nicotine patches come to mind.
Is it completely inconceivable that a similar tool would help with harmful sexual desires?
Current mental health treatment for pedophiles includes acceptance of their desires as normal, just not something to act on IRL.
That approach does not prohibit fictional materials involving children, nor can it make someone uninterested in children.
By stripping away safe outlets, we risk these people increasingly turning to real CSAM, which is far more harmful.
This sort of problem-solving acumen is how HIV became so widespread in Africa. Have you considered instead trying competence?