Deepfake porn is destroying real lives in South Korea | CNN
Let's be perfectly honest here: especially for pictures, this has been possible for at least 20 years.
What is ruining people's lives is the obsession a conservative society has with demonizing nudity and sex.
One small problem: accessibility.
It's like giving away machine guns with some free ammunition in the hope of attracting "new enthusiasts who will use them at the shooting range", only for them to end up in the hands of crazy people committing horrific violence.
55 years: https://www.imdb.com/title/tt0720250/
Let's not pretend that lowering the barrier to making fake nudes hasn't changed the situation either.
In a way it has improved it. It makes it more plausible to claim that any nudes that do show up of you are fake.
If there were no shame in being nude, then there wouldn't be a problem... Imagine finding out that a guy you know has hundreds of portraits of you. Portraits you were never in! Absolutely not terrifying.
Prude shaming is exactly the same as slut shaming. Implicit in your comment is the notion that women ought to be perfectly fine with deepfake porn of themselves being created and viewed by anyone, but that society has made them prudish through the demonization of nudity and sex.
Women are people and have rights and feelings that should not be ignored simply to serve the male sex drive, and a great many women do not want to be deepfake porn stars. If you disagree then say so. Don't hide behind the notion that society has robbed women of their desire to serve your fantasy through the demonization of sex. Sexual liberation includes the right to not have or be involved in sexual activity if one chooses. Prude shaming, like yours, is designed to remove that right.
No wonder 4B took off in Korea.
Succinct, says the quiet part out loud, and it has me exploring my own assumptions.
Thanks for being a part of this community.
Society subjects people to a lot of things that they aren't "perfectly fine with", but only some of those things have wider implications for reputation, career chances, ostracism, and so on. That is the main problem with this kind of technology.
Your assumption that people need deepfake technology to fantasize sexually about people who have no interest in being on the receiving end of that is at best naive and at worst arguing in bad faith.
I agree in principle that society demonizes nudity and sex, and that this has got to change. Society needs to change in order to fix this and many other issues related to sex and nudity.
As long as it affects a person's reputation and standing, this is a problem. Any person can harm someone with this technology, and as a society we cannot accept that.
Most people could not make a decent fake sex tape of any person in the world with low effort before. Now they can.
Should creating deepfakes for personal consumption be legal or illegal? Distribution is the real problem; the rest is fantasizing with tools, more or less. Some people will understandably not like it if they find out other people fantasize about them, but punishing that comes close to thought crime. What is acceptable? Is a stick drawing with names too much? What if I am really good at realistic drawings? What if I draw many images in a book and make a physical animation out of it? Is the limit anything outside my head? What if I draw one politician fellating another and distribute it as art/satire?
The short-term solution is to ban deepfakes; the long-term solution is probably something else, but I am not sure what. There is not inherently any actual abuse in a deepfake, and there is no actual sex either, so it's a reputation/honor and disgust thing. Those things still matter a lot in societies, so we can't ignore them either.
Also, people fixate on and idolize them, making it worse.
Religion would like to have a word
It's just like piracy: when it becomes too easy is when companies and governments begin the crackdown.
Theoretically possible since Photoshop, but you had to be pretty fucking good at it; now even teens are making deepfakes of their teachers and classmates.
In a statement to CNN, Telegram said the company “has a zero-tolerance policy for illegal pornography” and uses “a combination of human moderation, AI and machine learning tools and reports from users and trusted organizations to combat illegal pornography and other abuses of the platform.”
They have had machine learning algorithms for identifying nudity in pictures for decades now. Tech companies also have the best facial recognition software ever. The company could combine both technologies to instantly stop uploads of content containing the faces of previous victims.
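A minimal sketch of what that combination might look like, assuming a hypothetical classify_nudity model standing in for the platform's own NSFW classifier, and using the open-source face_recognition library to match against a registry of face embeddings of previously targeted people (all names and thresholds here are illustrative, not any platform's actual pipeline):

    import face_recognition  # pip install face_recognition

    def classify_nudity(image) -> float:
        """Hypothetical stand-in for the platform's NSFW classifier.
        Returns the estimated probability that the image is explicit."""
        raise NotImplementedError("plug in your own model here")

    def build_victim_registry(photo_paths):
        """Precompute face embeddings for previously targeted people."""
        registry = []
        for path in photo_paths:
            image = face_recognition.load_image_file(path)
            registry.extend(face_recognition.face_encodings(image))
        return registry

    def should_block_upload(upload_path, victim_registry, nsfw_threshold=0.8):
        """Block only if the image is explicit AND matches a protected face."""
        image = face_recognition.load_image_file(upload_path)
        if classify_nudity(image) < nsfw_threshold:
            return False  # not explicit; a face match alone is no reason to block
        for encoding in face_recognition.face_encodings(image):
            # compare_faces returns one boolean per embedding in the registry
            if any(face_recognition.compare_faces(victim_registry, encoding)):
                return True
        return False

In practice both models produce false positives and negatives, so a real deployment would route borderline hits to human review rather than block automatically. But the point stands: both building blocks have existed for years.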
The incel chuds out there redistributing this material and shaming the targets are something I wish the mainstream would deal with. It's disgusting that we're regressing as a society by allowing this bullying to continue.
“Incel chuds” lol
This problem has existed since people got good enough with MS Paint. Then it got an order of magnitude bigger with Photoshop, and it is now simply on the next level once more with AI generation.
It's bad, and it's also not something new: it's not caused by AI but by humans misusing a tool. Again.
I think you mean, Em Spaint.
Aren't we bringing about an era where you can't trust what you see or hear, unless it comes from a source you trust?
Essentially, aren't we just reverting to the 1800s, when news came from newspapers of reputation and hearsay came from elsewhere?
It's worse. We are reverting to the age of the Lügenpresse ("lying press"), and the hearsay now comes in short-form video formats.
Many people simply do not care (or are not even aware) whether a source is trusted if the message aligns with their own bias or is presented as a new "fact". Trust is irrelevant, unfortunately.
Basically, except the newspapers of today no longer care about reputation. They only care about clicks, the bottom line, and speed. Accuracy is no longer a primary focus.
No, because in the 1800s you could argue there was a thing called journalism. Nowadays, the tension between clicks and news means there isn't going to be trustworthy news, because it's brought to you by Amazon AWS.
There's a fetish genre you know about but don't want to know about, and yet someone out there is providing it. That's why normalcy is okay.
The combination of Rule 34, robot sex toys, and VR is a huge mistake that should be noped out of if you think your child is heading in that direction.
It's so disingenuous to pretend this is the same as editing a photo in MS Paint.
You missed the point. It's about this simply being the next step in an old and longstanding problem.
I'm quite simply fed up with people treating AI as the origin of, and scapegoat for, so many things they dislike that already existed.
Well, making deepfakes is certainly ruining Johnny Somali’s life
Couldn’t happen to a nicer person.