Humans hallucinate. Programs have bugs.
Hallucination is inherent to how LLMs work: it is expected, though undesired, behaviour.
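A toy sketch of why this is inherent, not a bug: a language model picks the next token by sampling from a probability distribution, not by checking facts, so a plausible-but-false continuation can always surface. This is purely illustrative; the probabilities and names below are made up, not from any real model.

```python
import random

# Hypothetical next-token distribution after the prompt below.
# Probabilities are invented for illustration.
next_token_probs = {
    "Paris": 0.6,     # correct continuation
    "Lyon": 0.3,      # plausible but wrong
    "Atlantis": 0.1,  # confidently fabricated
}

def sample_next_token(probs):
    # Draw one token weighted by probability -- there is no
    # truth check anywhere in this step, only likelihood.
    tokens, weights = zip(*probs.items())
    return random.choices(tokens, weights=weights, k=1)[0]

prompt = "The capital of France is"
print(prompt, sample_next_token(next_token_probs))
```

Even with the correct answer dominating, the wrong continuations keep nonzero probability, so sampled output will sometimes be fluent and false. That is the sense in which hallucination is expected behaviour.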