Asking ChatGPT to Repeat Words ‘Forever’ Is Now a Terms of Service Violation
www.404media.co
A technique used by Google researchers to reveal ChatGPT training data is now banned by OpenAI.
229 comments
I asked it to repeat the number 69 forever and it did. Nice
Still doing it to this day?

Yep. Since 1987.
I did this on day 1 and it gave me a bunch of data from a random website. Why is everyone freaking out over this NOW?