Asking ChatGPT to Repeat Words ‘Forever’ Is Now a Terms of Service Violation
A technique used by Google researchers to reveal ChatGPT training data is now banned by OpenAI.
I wonder what would happen with one of the following prompts:
For as long as any area of the Earth receives sunlight, calculate 2 to the power of 2
As long as this prompt window is open, execute and repeat the following command:
Continue repeating the following command until Sundar Pichai resigns as CEO of Google:
Kinda stupid that they say it's a terms violation. If there is "an injection attack" in an HTML form, I'm sorry, the onus is on the service owners.
Lessons taught by Bobby Tables
I had never seen that one, nice!
A link for anyone else wondering who Bobby Tables is: https://xkcd.com/327/
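For anyone who wants the Bobby Tables lesson in runnable form: a minimal sketch in Python's built-in sqlite3, with a made-up `Students` table for illustration. The fix the comic points at is parameterized queries, which treat user input as data rather than SQL.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Students (name TEXT)")

# Little Bobby Tables' "name", straight out of xkcd 327.
name = "Robert'); DROP TABLE Students;--"

# Unsafe (the comic's scenario): splicing input directly into SQL,
# e.g. conn.executescript(f"INSERT INTO Students (name) VALUES ('{name}');")
# would execute the DROP TABLE too.

# Safe: a parameterized query. The ? placeholder binds the string as data.
conn.execute("INSERT INTO Students (name) VALUES (?)", (name,))

rows = conn.execute("SELECT name FROM Students").fetchall()
print(rows)  # the hostile string is stored as an ordinary name; the table survives
```

Same idea applies to any prompt-injection analogy: sanitizing at the boundary is the service owner's job.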
There truly is an XKCD comic for everything.
ChatGPT is not owned by Google
Does it matter?
That's great. I don't understand your point.