I made a customized ChatGPT therapist that aligned with my values.
"I made a hugbot that's designed to never challenge or confront me and am now using it in place of a licensed and trained medical provider (and suggesting that other people should do the same!). Nothing can go wrong."
In my experience (which, to be fair, seems to be different from many people's), it couldn't be any worse than the real thing. 12 different licensed and trained medical providers each responded to my complaints about the ongoing traumas in my life with some variation of "Sure, but focus on the positives!" I'd have been better off saving the money and venting to a chatbot, if venting did anything for me.
Please don't tell me to see a 13th. I'm completely done with the idea.
I only see my trauma therapist every two weeks. With the right prompt, I've had tremendous success using GPT-4 for supplemental support, and I run it all past the real therapist, who has been very impressed with my progress and with the help from the bot.
Just as an example, the 'emotional flashback halting protocol' worksheet I was given, a standard part of DBT therapy, was rather unwieldy. I could never remember during a flashback what I was supposed to do, and the bot distilled it down to a helpful ABC mnemonic.
It's easy to dismiss 'spicy autocomplete', but a lot of therapy modalities are really well documented, and in my own experience the bot complements therapy nicely. I have been to dozens of therapists, counselors, psychologists, and psychiatrists due to being ND and having trauma, and GPT was way better than the worst of them.
I will note that as I have gotten (much) better and my ability to cope has improved, I almost never use GPT anymore; instead I just save things for when I can talk to my therapist. There were tough times when I couldn't talk to them for weeks, though, and having ChatGPT there to offer sensible assistance with absolutely no judgement, and with a surprising amount of what felt like kind support, was invaluable to my progress.
I think the part about knowing that some team of engineers might be reading everything you write to it is important to keep in mind though. It's not a real doctor and you don't have confidentiality.
I asked ChatGPT what some of the risks are of using it in place of a certified therapist. This was the point I found most salient:
Ethical Concerns: ChatGPT is not bound by the same ethical guidelines as therapists, which include confidentiality, handling crises, and ensuring patient well-being.