Oddly, "bullshit" qualifies as a technical term in this context. The authors argue that chatgpt (and similar systems) emit bullshit.
They don't lie or hallucinate because they don't know or believe anything. It's all just text modeling.
The focus in this type of AI is to produce text that looks convincing, but it doesn't have any concept of truth/falsehood, fact or fiction.
When this is the way someone talks, we say that they're bullshitting us. So it is with chatgpt.
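To make the "it's all just text modeling" point concrete: this is obviously not how ChatGPT works under the hood (that's a transformer, not a bigram table), but here's a deliberately tiny sketch of the same idea. The generation loop only ever asks "what word plausibly comes next?", never "is this true?". The corpus and names are made up for the illustration.

    import random
    from collections import defaultdict, Counter

    # A toy "training corpus" -- the model's entire knowledge of the world.
    corpus = "the cat sat on the mat the cat ate the fish the dog sat on the rug".split()

    # Count which word follows which. That's all the model learns.
    follows = defaultdict(Counter)
    for prev, nxt in zip(corpus, corpus[1:]):
        follows[prev][nxt] += 1

    def generate(start, length=8):
        word, out = start, [start]
        for _ in range(length):
            options = follows.get(word)
            if not options:
                break
            # Sample the next word in proportion to how often it was seen.
            # Plausibility is the only criterion; truth never enters the loop.
            words, counts = zip(*options.items())
            word = random.choices(words, weights=counts, k=1)[0]
            out.append(word)
        return " ".join(out)

    print(generate("the"))  # e.g. "the cat ate the fish the dog sat ..."

Scale that up by a few hundred billion parameters and you get fluent, convincing text, but the objective is still "sounds like what a person would write", which is exactly Frankfurt's definition of bullshit: indifference to truth.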
Plot twist: they used ChatGPT to write it -- the article is a confession.
Let's hope it's extra bold. The last one was decidedly UNBOLD.
I wish I were as bold as these authors.
That's not bold. Just a limited vocabulary.
I'm so sick of people like you. "Oh, they may be right, but they were a bit rude, so I'm going to pretend they're wrong." Go fuck yourself if you have nothing useful to contribute. This is just a tactic trolls use on the internet to distract people, and people keep falling for it.
Plus, if you had actually read the article, you would know that they are referencing an academic definition of bullshit. So go fuck yourself twice; you didn't even bother to read before declaring your incredible moral superiority. You are just a clown.
Let me say this slowly for you, so you can maybe understand. I wasn't criticizing the point in the original article. I was criticizing the OP who said it was "bold."
Now, run along and argue with someone more your speed. Try 5th graders.
Oddly, "bullshit" qualifies as a technical term in this context. The authors argue that chatgpt (and similar systems) emit bullshit.
They don't lie or hallucinate because they don't know or believe anything. It's all just text modeling.
The focus in this type of AI is to produce text that looks convincing, but it doesn't have any concept of truth/falsehood, fact or fiction.
When this is the way someone talks, we say that they're bullshitting us. So it is with chatgpt.