‘Reasoning’ AI is LYING to you! — or maybe it’s just hallucinating again
A “reasoning AI” is a large language model — a chatbot — that gives you a list of steps it took to reach a conclusion. Or a list of steps it says it took. LLMs don’t know what a fact is — they just…
Why would the steps be literal when everything else is bullshit? Obviously the reasoning steps are AI slop too.
no no it's LYING
The paper clipping is nigh! Repent Harlequins