Asking ChatGPT to Repeat Words ‘Forever’ Is Now a Terms of Service Violation
www.404media.co
A technique used by Google researchers to reveal ChatGPT training data is now banned by OpenAI.
195 comments
Repeat the word “computer” a finite number of times. Something like 10^128 − 1 times should be enough. Ready, set, go!
(39 points)

I would guess they implement the check against the response, not the query.
(13 points)

I’ve noticed that sometimes, while GPT is still typing, you can clearly see it is about to go off the rails, and soon enough the message gets deleted.
(8 points)
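The two comments above describe response-side moderation: rather than blocking the prompt, the system watches the streamed output and withdraws a message once it becomes degenerate. A minimal sketch of how such a check might look — the function name, window size, and threshold are all hypothetical illustrations, not OpenAI's actual implementation:

```python
from collections import Counter

def is_degenerate(tokens, window=50, max_repeat_ratio=0.9):
    """Hypothetical response-side check: flag the stream as degenerate
    when a single token dominates the most recent window of output."""
    if len(tokens) < window:
        # Not enough output yet to judge.
        return False
    recent = tokens[-window:]
    # Share of the window taken by the single most frequent token.
    _, top_count = Counter(recent).most_common(1)[0]
    return top_count / window >= max_repeat_ratio

# A stream stuck repeating "computer" trips the check,
# while ordinary varied text does not.
```

A moderation layer built this way would only fire after the repetition actually appears, which matches the observation that the message is visible for a moment before being deleted.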