If “hallucination” just means “departure from the truth”, then there’s already an explanation: an LLM is just a next-token predictor.