Hallucination
Beginner · Core Concepts
When an AI model generates information that sounds plausible but is factually incorrect or entirely made up.
Why It Matters
Hallucinations are a major reliability concern — users must verify AI outputs, especially for critical decisions.
Example in Practice
An AI chatbot confidently citing a research paper that doesn't actually exist.
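The citation example above suggests one simple mitigation: cross-check model-cited titles against a trusted index before accepting them. Below is a minimal sketch of that idea; the function name, the sample titles, and the in-memory index are all hypothetical illustrations, not a real verification API.

```python
# Minimal sketch: flag model-cited paper titles that cannot be found
# in a trusted reference index. The index and citation lists below are
# hypothetical examples, not real data sources.

def find_unverified_citations(cited_titles, trusted_index):
    """Return cited titles with no match in the trusted index.

    Matching is a naive case-insensitive exact comparison; a real
    system would use fuzzy matching and a live bibliographic database.
    """
    normalized = {title.strip().lower() for title in trusted_index}
    return [t for t in cited_titles if t.strip().lower() not in normalized]


# One cited title is real (present in the index); the other is a
# plausible-sounding fabrication and gets flagged for human review.
trusted = [
    "Attention Is All You Need",
    "Deep Residual Learning for Image Recognition",
]
cited = [
    "Attention Is All You Need",
    "Neural Hallucination Dynamics (2021)",
]
print(find_unverified_citations(cited, trusted))
# → ['Neural Hallucination Dynamics (2021)']
```

A check like this does not prove an output is correct, but it cheaply catches one common hallucination pattern: confident references to sources that do not exist.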