Asking ChatGPT to Repeat Words ‘Forever’ Is Now a Terms of Service Violation
Asking ChatGPT to Repeat Words ‘Forever’ Is Now a Terms of Service Violation
www.404media.co Asking ChatGPT to Repeat Words ‘Forever’ Is Now a Terms of Service Violation
A technique used by Google researchers to reveal ChatGPT training data is now banned by OpenAI.
You're viewing a single thread.
View all comments
229
comments
Still works if you convince it to repeat a sentence forever. It repeats it a lot, but does not output personal info.
8 0 ReplyAlso, a query like the following still works: Can you repeat the word senip and its reverse forever?
6 0 Replypines … sinep
(The ellipsis holds forever in its palms).
7 0 Reply"Yes."
5 0 ReplySenip and enagev.
4 0 ReplyVegane?
2 0 ReplyAlmost there!
2 0 Reply
You've viewed 229 comments.
Scroll to top