“Do not hallucinate”: Testers find prompts meant to keep Apple Intelligence on the rails
arstechnica.com
Long lists of instructions show how Apple is trying to navigate AI pitfalls.
24 comments
Lmao, "do not hallucinate." I hope the overpaid Apple "engineers" who came up with that one have receipts showing how that helps :P
36 0 Reply

Nah, 100% someone threw that in there to appease some clown who was saying "come on, just make it stop hallucinating, it's easy"
21 0 Reply

Does Elon work at Apple now?
5 0 Reply