We have to stop ignoring AI’s hallucination problem
AI might be cool, but it’s also a big fat liar.
It's only going to get worse, especially as datasets deteriorate.
With things like Reddit being overrun by AI, while also selling AI training data, I can only imagine what a mess that's going to cause.
Hallucinations, like depression, are a multifaceted issue. Training data is only one piece of it. Quantized or overfitted models rely on memorization at the cost of even obviously correct training data, and poorly structured inference can confuse a model.
Rest assured, this isn't just training data.
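To make the quantization point above concrete, here is a minimal sketch (pure Python, with made-up weight values) of symmetric int8 quantization: every weight is mapped to an 8-bit integer via a shared scale and then reconstructed, so small weights are distorted the most. Real schemes (per-channel scales, group-wise quantization) are far more sophisticated; this only illustrates where the precision loss comes from.

```python
def quantize_int8(weights):
    """Quantize floats to int8 with a shared symmetric scale, then dequantize.

    This is a toy illustration of precision loss, not a production scheme.
    """
    scale = max(abs(w) for w in weights) / 127   # symmetric range [-127, 127]
    q = [round(w / scale) for w in weights]      # the stored 8-bit values
    return [v * scale for v in q]                # reconstructed weights

# Hypothetical weights: one dominant value sets the scale for all the rest.
weights = [0.3127, -1.0, 0.0042, 0.75]
restored = quantize_int8(weights)
errors = [abs(a - b) for a, b in zip(weights, restored)]
# The tiny weight 0.0042 cannot be represented exactly at this scale,
# so its reconstruction error is a large fraction of its own magnitude.
```

The rounding error per weight is bounded by half the scale (here about 0.004), which is negligible for the large weights but comparable in size to the smallest one.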
Yeah, there's also this stuff as well, though I consider that to be a more technical challenge rather than a hard limit.
I think you are spot on. I tend to think the problems may begin to outnumber the potential benefits.
And we haven't even gotten into the problem of what happens when you have no more data to feed it. Do you make more? That's an impossible task.
There are already attempts to create synthetic data to train on.