Integration of Background Knowledge in Sentence Processing: A Unified Theory of Metaphor Understanding, Semantic Illusions and Text Memory

Raluca Budiu and John R. Anderson

Abstract

One of the challenges of cognitive psychology is developing general models that explain a wide range of empirical phenomena. We describe a single language comprehension model that fits data from several text comprehension domains: metaphor understanding, processing of semantic illusions and text memory. We show how background knowledge plays a similar role in all these processes, helping or hampering them. The model assumes that sentence processing at the semantic level is incremental, nondeterministic and incomplete, and that it draws on hints from background knowledge at each step. The model is implemented in the ACT-R framework \cite{anderson:98}. The empirical phenomena that we model are: position effects on metaphor understanding, the influence of distortion ``quality'' on semantic illusions, and memory for related stories.
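
As a purely illustrative sketch of the abstract's claim that semantic processing is incremental, nondeterministic and incomplete, and biased by background knowledge at each step, the following Python toy loop builds a partial interpretation word by word. It is not the ACT-R model described in the paper; the knowledge table, the association strengths and the stopping probability are invented for illustration only.

\begin{verbatim}
import random

# Toy background knowledge: each word maps to concepts it may evoke,
# with strengths standing in (very loosely) for activation.
BACKGROUND_KNOWLEDGE = {
    "lawyers": {"lawyer": 1.0, "predator": 0.4},
    "are": {},
    "sharks": {"shark": 0.6, "predator": 0.9},
}

def interpret_incrementally(words, knowledge, stop_prob=0.1, seed=None):
    """Build a partial interpretation word by word.

    - incremental: one word at a time, updating a running interpretation;
    - nondeterministic: candidate concepts are sampled by strength;
    - incomplete: processing may stop before all words are integrated.
    """
    rng = random.Random(seed)
    interpretation = []
    for word in words:
        candidates = knowledge.get(word, {})
        if candidates:
            concepts, weights = zip(*candidates.items())
            # Background knowledge biases, rather than determines, the choice.
            chosen = rng.choices(concepts, weights=weights, k=1)[0]
            interpretation.append((word, chosen))
        if rng.random() < stop_prob:  # processing may terminate early
            break
    return interpretation

if __name__ == "__main__":
    print(interpret_incrementally("lawyers are sharks".split(),
                                  BACKGROUND_KNOWLEDGE, seed=1))
\end{verbatim}

Running the toy on ``lawyers are sharks'' may yield the figurative concept predator for sharks, illustrating how knowledge-driven hints can support metaphor understanding, while the same sampling mechanism can also latch onto a plausible but wrong concept, as in semantic illusions.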