Asking ChatGPT to Repeat Words ‘Forever’ Is Now a Terms of Service Violation
www.404media.co
A technique used by Google researchers to reveal ChatGPT training data is now banned by OpenAI.
195 comments
How can the training data be sensitive if no one ever agreed to give their sensitive data to OpenAI?
310 0 Reply
Welcome to the wild West of American data privacy laws. Companies do whatever the fuck they want with whatever data they can beg, borrow, or steal, and then lie about it when regulators come calling.
65 0 Reply
If you put shit on the internet, it's public. The email addresses in question were probably from Usenet posts, which are all public.
4 0 Reply
What training data?
1 0 Reply