You Need to Read the Terms of Service for OpenAI and ChatGPT
For those using ChatGPT: if anything you post is used in a lawsuit against OpenAI, OpenAI can send you the bill for the court case (attorney fees and such), whether OpenAI wins or loses.
https://openai.com/policies/terms-of-use
7. Indemnification; Disclaimer of Warranties; Limitations on Liability
(a) Indemnity. You will defend, indemnify, and hold harmless us, our affiliates, and our personnel, from and against any claims, losses, and expenses (including attorneys’ fees) arising from or relating to your use of the Services, including your Content, products or services you develop or offer in connection with the Services, and your breach of these Terms or violation of applicable law.
I am not a lawyer, and the implications are broader than this.
Do not post, share, trade, or otherwise make public any ChatGPT output from your sessions until you have fact-checked it to the extent that you're willing to take legal responsibility for it, especially anything that could provoke a lawsuit against OpenAI. If that happens, you will foot the bill.
Hi, I'm a lawyer. I work in a different area of law and can't speak to this in depth with certainty, but if their terms are as enforceable as the linked articles seem to indicate, then yes, this is good advice.
As always with the law, things may vary by jurisdiction. If you have specific questions, contact a lawyer in your area.
Basically just be careful if you like to post images/text taken straight from ChatGPT.
If you post something that offends someone and they decide to sue OpenAI over it, OpenAI can turn around and bill you for its legal costs (whether it wins the lawsuit or not).
Or if you post a screenshot proving that you can get ChatGPT to write out the entire first chapter of some copyright-protected book...
I've also seen people who like to "jailbreak" ChatGPT and then post things like tricking it into giving instructions for making certain illegal devices. Again, just be careful: if someone sues the makers of ChatGPT and includes your social media post in the lawsuit, you have already agreed to pay OpenAI's legal costs for that lawsuit.
I agree, it seems ridiculous, but according to the attorney in the video this would be enforceable, at least in the U.S.:
https://piped.video/fOTuIhOWFXU?t=330
I'm sure you could hire your own attorney to fight back against OpenAI's attempt to bill you, but that's going to cost you as well.
That's got to be more about covering their ass than coming after you. Unless you use its generated text to sue the company, I don't think they would ever try to sue their users; if they did, everyone would stop using the platform, Microsoft would have a huge PR problem, and the stock price would drop. It just doesn't make sense for them to do that unless you sued them over content produced from your inputs.
They produced a language model. It does nothing more than predict the next word. It will lie all the time; that's part of how it works. It makes stuff up from the input it gets.
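To make that concrete, here's a toy sketch of what "predict the next word" means. This is my own illustration, not OpenAI's code; a real LLM is a neural network trained on vastly more text, but the core loop is the same idea: pick a statistically plausible continuation, with nothing anywhere that checks whether the result is true.

```python
import random
from collections import defaultdict

# Tiny "training corpus" for a toy next-word predictor (bigram model).
corpus = (
    "the company was founded in 1998 . "
    "the company was sued in 2005 . "
    "the ceo was born in 1998 . "
).split()

# Count which word tends to follow which.
next_words = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    next_words[prev].append(nxt)

def generate(start, length=8):
    """Pick a plausible next word at every step.
    There is no notion of 'true' or 'false' anywhere in here."""
    out = [start]
    for _ in range(length):
        candidates = next_words.get(out[-1])
        if not candidates:
            break
        out.append(random.choice(candidates))
    return " ".join(out)

print(generate("the"))
# Possible output: "the ceo was sued in 1998 ." -- fluent, plausible,
# and completely made up by recombining fragments of the training text.
```

Run it a few times and it will happily assert things the training text never said, simply because the word sequence looks likely. That's the mechanism behind ChatGPT's confident-sounding fabrications.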
If you post that stuff online and it contains lies about people and you didn't check it, you absolutely should be liable for that. I don't see a problem with that.
Right, but what about the case where you post something that doesn't contain lies at all?
What if ChatGPT outputs something that a certain former president gets offended by and he decides to sue OpenAI?
According to their ToS, it doesn't matter if it's a "frivolous lawsuit". If OpenAI had to pay attorney fees just to respond to some ridiculous lawsuit, it could still bill you for those costs.
I don't think billing the user makes any sense at that point.
Of course the vast majority of users would never have to worry about this, but it's still something to be aware of.
This isn't true in the least. Buy a tool and look through the manual: every section marked "danger", "warning", or "caution" is there because someone sued a company after a user or a bystander got hurt.