It's worse. So much worse. Now ChatGPT will have a human voice with simulated emotions that sounds eminently trustworthy and legitimately intelligent. The rest will follow quickly.
People will be far more convinced of lies being told by something that sounds like a human being sincere. People will also start believing it really is alive.
My point is that, a lot of the time, people will trust something that sounds like it is being said sincerely by a living person more than they will trust regular text results. The "living person" sounds like it has emotions, which makes it sound like a member of our species, which makes it sound trustworthy.
There's a reason why predators sometimes disguise themselves, or part of themselves, as something their prey wants. The anglerfish wouldn't be as successful without that little light telling nearby fish "come eat."