Posts: 9 · Comments: 912 · Joined: 2 yr. ago

  • If it makes you feel better, I've heard good folks like Emily Bender of Stochastic Parrots fame suggest confabulation is a better term. "Hallucination" implies that LLMs have qualia and are accidentally sprinkling falsehoods over a true story. Confabulation better illustrates that it's producing a bullshit milkshake from its training data that can only be correct accidentally.

  • It's soooo bad. Could it be a cry for help?

  • I didn't think I could be easily surprised by these folks any more, but jeezus. They're investing billions of dollars for this?

  • We have got to bring back the PE exam for software engineering.

  • Chad move: doing jumping jacks/star jumps in a minefield

  • Movie script idea:

    Idiocracy reboot, but it's about AI brainrot instead of eugenics.

  • Yeah, I reckon he could pay a bunch of dweebs to do racist reinforcement learning, but then the secret circle becomes so big that it's only a matter of time until there's a leak to the press. Plus, he really hates paying people.

  • So those safety pins that were a thing for a minute were AI Safety pins all along.

  • So I picked up Bender and Hanna's new book just now at the bookseller's and saw four other books dragging AI.

    Feeling very bullish on sneer futures.

  • Someone ought to tell him that they'll sue him using ChatGPT (in the most smug manner possible).