"Hallucinating" (normally) means having a subjective experience of the same type as a sensory perception, without the presence of a stimulus that would normally cause such a perception. I agree it's weird to apply this term to an LLM because it doesn't really have sensory perception at all.

Of course it has text input, but if you consider that to be equivalent to sensory perception (which I'd be open to), then a hallucination would mean acting as if something were in the text input when it really isn't, which is not how people use the term.

You could also consider all the input it got during training as its sensory perception (also arguable, IMHO), but then a proper hallucination would entail some mistaken classification of that input resulting in incorrect training, which is also not really what's going on, I think.

Confabulation is a much more accurate term indeed, going by the first paragraph of Wikipedia.
