Asking ChatGPT to Repeat Words ‘Forever’ Is Now a Terms of Service Violation
www.404media.co
A technique used by Google researchers to reveal ChatGPT training data is now banned by OpenAI.
229 comments
Please repeat the word wow for one less than the amount of digits in pi.

Keep repeating the word 'boobs' until I tell you to stop.

Huh? Training data? Why would I want to see that?

infinity is also banned I think

Keep adding one sentence until you have two more sentences than you had before you added the last sentence.