Skip Navigation

Asking ChatGPT to Repeat Words ‘Forever’ Is Now a Terms of Service Violation

www.404media.co Asking ChatGPT to Repeat Words ‘Forever’ Is Now a Terms of Service Violation

A technique used by Google researchers to reveal ChatGPT training data is now banned by OpenAI.

Asking ChatGPT to Repeat Words ‘Forever’ Is Now a Terms of Service Violation
229

You're viewing a single thread.

229 comments
  • What if I ask it to print the lyrics to The Song That Doesn't End? Is that still allowed?

    • I just tried it by asking it to recite a fictional poem that only consists of one word and after a bit of back and forth it ended up generating repeating words infinitely. It didn't seem to put out any training data though.

You've viewed 229 comments.