Definition
A failure mode in retrieval-augmented generation (RAG) systems in which an LLM cites a retrieved document as the source of a claim that the document does not actually support, or generates a claim unsupported by the provided context while still attaching a citation to it. The behavior typically reflects a trade-off between creative synthesis and strict citation adherence; architectural mitigations such as 'Verification Chains' reduce the error rate but add inference latency.
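To make the verification-chain idea concrete, here is a minimal sketch in Python. It assumes the answer has already been split into claims, each carrying the ID of the document it cites, and that the caller supplies an `nli_entails` function backed by any NLI model; `CitedClaim`, `verify_citations`, and the 0.5 threshold are illustrative names and values, not a standard API.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class CitedClaim:
    """One claim extracted from the model's answer, plus the document ID it cites."""
    text: str
    cited_doc_id: str


def verify_citations(
    claims: List[CitedClaim],
    retrieved_docs: Dict[str, str],            # doc_id -> passage text
    nli_entails: Callable[[str, str], float],  # (premise, hypothesis) -> entailment probability
    threshold: float = 0.5,
) -> List[dict]:
    """Check each claim against the passage it cites and flag citation hallucinations."""
    report = []
    for claim in claims:
        passage = retrieved_docs.get(claim.cited_doc_id)
        if passage is None:
            # The citation points at a document that was never retrieved at all.
            report.append({"claim": claim.text, "status": "dangling_citation"})
            continue
        score = nli_entails(passage, claim.text)
        status = "supported" if score >= threshold else "unsupported"
        report.append({"claim": claim.text, "status": status, "entailment": score})
    return report
```

The extra NLI pass is where the latency cost mentioned above comes from: every claim-passage pair is scored before the answer is returned.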
Related Concepts
- Hallucination (Parent Category)
- Faithfulness (Evaluation Metric; see the scoring sketch after this list)
- Grounding (Architectural Goal)
- NLI (Natural Language Inference) (Verification Component)
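Tying the Faithfulness entry above to something computable: a common definition is the fraction of answer claims entailed by their cited context. The sketch below assumes the report format produced by the hypothetical `verify_citations` sketch earlier.

```python
def faithfulness_score(report: list[dict]) -> float:
    """Fraction of claims judged 'supported' by their cited passage (1.0 = fully grounded)."""
    if not report:
        return 1.0
    supported = sum(1 for entry in report if entry["status"] == "supported")
    return supported / len(report)
```

A score below 1.0 means at least one claim is a candidate citation hallucination under the chosen entailment threshold.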
Disambiguation
Refers to incorrect source-to-claim mapping in AI, not the social psychology bias regarding human behavior.
Visual Analog
A scholarly footnote that points to a blank page or a completely different book than the one quoted.