“Do not hallucinate”: Testers find prompts meant to keep Apple Intelligence on the rails
arstechnica.com
Long lists of instructions show how Apple is trying to navigate AI pitfalls.
Hmm, looks like any AI we develop is destined to go back to GOFAI with a bunch of IFs and THENs.
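For context on the joke: GOFAI ("good old-fashioned AI") encoded behavior as explicit symbolic if/then rules, whereas the leaked Apple prompts encode similar constraints as natural-language instructions prepended to the model's input. A toy Python sketch of that parallel — the function names and the word-count rule are hypothetical, and only the "Do not hallucinate" rule text comes from the article's headline:

```python
# Toy illustration, not Apple's actual code: prompt-based guardrails
# are natural-language rules, which the comment likens to GOFAI-style
# if/then guards. All names and thresholds here are made up.

GUARDRAIL_RULES = [
    "Do not hallucinate.",          # from the article's headline
    "Keep the summary concise.",    # hypothetical example rule
]

def build_system_prompt(task: str) -> str:
    """Prepend the rule list to a task, the way a system prompt does."""
    rules = "\n".join(f"- {rule}" for rule in GUARDRAIL_RULES)
    return f"You are a helpful assistant.\nRules:\n{rules}\n\nTask: {task}"

def gofai_style_check(output: str, max_words: int = 50) -> bool:
    """The GOFAI analogue: the same constraint as a literal conditional."""
    if len(output.split()) > max_words:
        return False
    return True

if __name__ == "__main__":
    print(build_system_prompt("Summarize this email."))
    print(gofai_style_check("A short summary."))  # True
```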