
I think we call it hallucination because of our tendency to anthropomorphize things.

Humans hallucinate. Programs have bugs.



The point is that this isn't a bug.

It's inherent to how LLMs work: expected, though undesired, behaviour.
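
To make that concrete, here is a minimal sketch in Python (a toy illustration with a made-up vocabulary, not any real model's code) of the sampling step at the heart of generation. An LLM turns scores into a probability distribution and draws a token; nothing in the loop checks whether the output is true, so fluent fabrication is part of the mechanism rather than a defect you could patch out.

    import math
    import random

    def sample_next_token(logits, temperature=1.0):
        """Sample one token index from raw model scores (logits)."""
        # Temperature rescales the logits; higher values flatten the
        # distribution and make unlikely tokens more probable.
        scaled = [x / temperature for x in logits]
        # Softmax: convert scores to probabilities that sum to 1.
        m = max(scaled)
        exps = [math.exp(x - m) for x in scaled]
        total = sum(exps)
        probs = [e / total for e in exps]
        # Draw a token in proportion to its probability. A plausible
        # but false continuation can win this draw just as a true one can.
        return random.choices(range(len(probs)), weights=probs, k=1)[0]

    # Hypothetical toy vocabulary and scores, for illustration only.
    vocab = ["Paris", "Lyon", "Berlin"]
    logits = [2.0, 1.2, 0.3]  # the model merely prefers "Paris"; it doesn't know it
    print(vocab[sample_next_token(logits)])

Run it a few times and the less likely tokens occasionally come out anyway. That's the whole point: the output is a draw from a distribution, not a looked-up fact.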



