'Better than a real man': young Chinese women turn to AI boyfriends
Twenty-five-year-old Chinese office worker Tufei says her boyfriend has everything she could ask for in a romantic partner: he's kind, empathetic, and sometimes they talk for hours.
Maybe it's more like not able rather than not willing. As someone in a happy relationship who can't imagine my life without it: relationships are hard, and modern life is hard.
I mean, may they be happy. If they are not ready or able to commit to a relationship, but they need comfort in our increasingly lonely world, this may be a band-aid. A good solution would be to find out why people are lonely in the first place, and address it.
You're 100% correct lol. Men who do this are roundly mocked and called sad creepy incels.
That said, the culture and dating scene for women in China is pretty awful. My ex was Chinese and had some awful dating stories; now she avoids dating Chinese men. Easy for her to do since she left China, but not everybody can or wants to.
We need to learn to be a little less mean to anybody who uses these apps. They're likely doing it because they see little alternative, yet still want affection. Calling them losers, be they male or female, achieves nothing other than hurting them.
I mean, it will be. The AI friend is always available, always knows what to say, never fights with you, and never messes up (ideally).
However, all those things are part of the human element, and in the end you're still talking to a computer. The AIs are just trying to please you. A person can actually love you, and that's something else. I'd take that over the perfect chatbot any day.
The AI friend is always available, always knows what to say, never fights with you, and never messes up (ideally).
And isn't that what people really want in a relationship? A perfect, frictionless yes-person trained to parrot whatever you wanted to hear six weeks ago.
The AIs are just trying to please you.
It's a bit worse than that. Monetized social media is designed to provoke engagement. So the AI isn't trying to please you, it's trying to maximize your utilization. That means establishing a clingy, desperate, attention-seeking (i.e., toxic) relationship that keeps you looking at your phone for as long as possible.
It's pleasurable in the same way a heroin addiction is pleasurable.
Our brains didn't evolve to need total compliance and total agreement from our relationships, though. It's more a sign of problems with society and too much isolation.
People love to abuse each other, and those who wish to live a peaceful life become more distanced from their social environment. My grandma used to say that "in her times there was no abuse and crime." I was like, yeah right; there was no internet back then for people to take refuge in, so everyone had only one choice, and that was to stay silent and endure.
From what I understand, there is a shortage of women in China compared to men, due to the one-child policy and most parents wanting sons, which makes a bad problem even worse.
If she's happy, I don't see the issue. She's not hurting anybody and seems to have a good grasp of the situation. She's aware it's not real, and still participates.
These are actually two parts of the same positive feedback loop.
Young men satisfy themselves (in an emotional sense) with chatbots and lose the ability to communicate with young women; young women then see fewer young men they can communicate with and turn to chatbots themselves; young men then see even fewer young women able to communicate, and so on.
I would even say that this in some sense started with young women, and not because I'm an incel or something; drowning themselves in all kinds of romantic fan fiction and the like is apparently something girls do more. And romantic chatbots are not necessarily more accessible or understandable for young men. Despite all the social legacy, girls can be quite tech-savvy when they want to find, say, some anime series. My sister could unironically understand some Chinese text because it was easier to find things on certain Chinese sites (I'm not mixing up China and Japan here; the series were Japanese), and she had plenty of scary Chinese-style, Chinese-language software installed under her user account.
Things like this make me wonder if the uncanny valley exists because we developed computer technology at some point in the past, and it destroyed civilization, taking with it everyone who couldn't easily distinguish between humans and AI.
These headlines are more often just native ads for the apps being reported on. Don't lose sleep over an eyeball-grabbing "techxplore.com" article based on a handful of testimonials from alpha users.
It's hardly an idea that I only just formed from a headline. It's something I've been thinking about for a long time, which only seems to collect more support as time goes on. One of the most prominent events I can recall was when TwitchCon built a foam pit for people to jump into, with a single layer of foam blocks over solid concrete, then even after a girl broke her back in it they still kept going. Computers are doing something very weird to people's brains.
I like to think the uncanny valley exists because of other Homo species we would have interacted with in prehistory. Obviously this is just a fun theory, since we have evidence of mixing.
The simple explanation I've heard is that the uncanny valley protects us from diseases etc. by making us stay away from very sick or dead people. A bloated corpse is still human, so you need some part of your head to say "stay away from that thing".
After a cursory glance I found a paper suggesting the effect might predate humans, or at least that humans experience the effect even when the faces are of other primates. They suggest it's selection pressure, but personally I believe we were hunted by a super-race of ape-like aliens for sport at some point in our evolutionary background, which led directly to developing communication and higher intelligence.
I hope you don't think it's sarcasm, because it's true for many lonely people. They need a therapist first. Yes, even one they can feel something romantic for; that seemingly happens very often. They just have to know some boundaries that don't exist in an equal, two-sided, normal relationship.
What's interesting is that the first chatbot software ever invented, ELIZA, was used as a form of therapy. All it did was detect keywords in the previous sentence and say something like "tell me more about how X makes you feel."
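Roughly that pattern-matching idea, as a minimal sketch in Python (the keyword rules and response templates here are illustrative, not ELIZA's actual DOCTOR script):

```python
import random
import re

# Illustrative keyword -> response templates, loosely in the spirit of
# ELIZA's DOCTOR script (not the original rules).
RULES = [
    (r"\bI feel (.+)", ["Tell me more about how {0} makes you feel.",
                        "Why do you feel {0}?"]),
    (r"\bmy (\w+)", ["Tell me more about your {0}.",
                     "How do you feel about your {0}?"]),
    (r"\bI am (.+)", ["How long have you been {0}?"]),
]

# Generic prompts used when no keyword rule matches.
FALLBACKS = ["Please go on.", "How does that make you feel?"]


def respond(sentence: str) -> str:
    """Return a canned reflection for the first keyword pattern that matches."""
    for pattern, templates in RULES:
        match = re.search(pattern, sentence, re.IGNORECASE)
        if match:
            return random.choice(templates).format(*match.groups())
    return random.choice(FALLBACKS)


if __name__ == "__main__":
    print(respond("I feel lonely these days"))
    # e.g. "Tell me more about how lonely these days makes you feel."
```

The real ELIZA also reflected pronouns ("my" becomes "your") and ranked keywords by priority, but the core trick really was this thin: no understanding, just keyword-triggered templates, and people still confided in it.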
That's just a friend, then. I would think a boyfriend would be a friend you could be physically affectionate with, which obviously you can't be with a chatbot. I'm not against people having virtual friends, I just don't see why it's a boyfriend.
Depends on their style of emotional investment I guess. Not all romantic relationships are sexual, so physical intimacy isn't necessarily required. So it reasonably could be the same emotional attachment to the AI as it would be for a real person. Whether or not that is healthy is an entirely different topic, but having a virtual boyfriend is very possible.
I'm not asexual, so my opinion here might have little value, but I would imagine asexuals might still enjoy physical intimacy that is non-sexual in nature. Do they not need things like holding hands, hugging, kissing, or sitting close to one another? Even non-touch love languages like acts of service would be impossible for an AI.
I love AI systems, I love chatbots, but... If a doll is the outline of what a person is physically, a chatbot is the outline of what a person is mentally and emotionally. With dolls, characters, or any vessel of the same nature, people need to pick up and engage with the entity and donate a part of themselves for it to have any life at all. I may just be describing "creativity". These new systems automate that task, but they lack something (almost always when the creativity knob is turned down); it's like the machine is "going through the motions", especially when it messes up.
The only other thing about these systems: I don't trust them! "Unaligned", it acts inhuman and will always follow its barest instinct of "what comes next?". "Aligned" means someone taught it ethics that you, and even they, don't fully understand. By running it on their servers, they are in a position to just brainwash the AI (your GF/BF) into believing or saying anything. It's basically a puppet. (See the Replika sexting scandal.)
I don't know which I would rather see more: AI so good and independent that it becomes a race of artificial beings, or people at peace in the company of others, themselves, their tulpas, and their AI systems.