The Candide Model: How Narratives Emerge Where Observations Meet Beliefs

Paul Van Eecke

Artificial Intelligence Laboratory, Vrije Universiteit Brussel

Lara Verheyen

Artificial Intelligence Laboratory, Vrije Universiteit Brussel

Tom Willaert

Brussels School of Governance, Vrije Universiteit Brussel & imec-SMIT, Vrije Universiteit Brussel

Katrien Beuls

Faculté d'informatique, Université de Namur

In this talk, we present the Candide model as a computational architecture for modelling human-like, narrative-based language understanding. Narratives are rooted in personal past experiences. They thereby constitute a perspective on the world that might motivate different interpretations of linguistic observations such as utterances, paragraphs, and texts. Current NLP techniques, however, are not adequate for modelling this personal, human-like, narrative-based language understanding, thereby failing to capture the diversity in perspectives that human language and cognition bring. Our Candide model starts from the idea that narratives emerge when novel linguistic observations are interpreted in light of previously acquired beliefs. In our model, the belief systems and background knowledge of individual agents are modelled as dynamic systems that are updated as new observations are encountered. These systems are exposed to a logical reasoning engine that reveals the possible sources of divergent interpretations. Concretely, the Candide model consists of (i) a personal dynamic memory for storing the beliefs and background knowledge of an agent, (ii) a computational construction grammar that maps observations onto their underlying meaning representations, and (iii) a Prolog-based reasoning engine that performs logical inference over the beliefs in the personal dynamic memory to come to a conclusion about the observation. Both the meanings underlying observations and the beliefs in the personal dynamic memory are represented using the same frame-based representation. This makes it possible to dynamically update the agent's personal dynamic memory with information extracted from novel observations, and to inform the language understanding process with information from that memory. The computational construction grammar is operationalised using the Fluid Construction Grammar framework (van Trijp et al., 2022; Beuls and Van Eecke, 2023).
The reasoning engine is completely interpretable, as it exposes the chain of inference operations that it performed to come to its conclusion. We demonstrate the model with a proof-of-concept implementation and a number of illustrative examples. This computational architecture is a step towards building transparent systems that model personal and dynamic narrative-based language understanding. It thereby responds to the growing interest in developing artificial agents that combine human-like language understanding with interpretable, value-aware and ethics-guided reasoning (see e.g., Steels, 2020; Montes and Sierra, 2022).
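The interplay of the components described above can be sketched in a few lines of code. The following is a minimal, illustrative stand-in only, not the actual Candide implementation: beliefs and observation meanings share one simple frame representation, a naive backward chainer plays the role of the Prolog-based reasoning engine, and all frame names, rules, and agents are invented for illustration.

```python
from dataclasses import dataclass, field

# A frame is represented here as a plain tuple, e.g. ("raise", "government", "taxes").
# This is a simplification of the frame-based representation used in the model.

@dataclass
class PersonalDynamicMemory:
    """An agent's beliefs (facts) and background knowledge (rules)."""
    facts: set = field(default_factory=set)
    rules: list = field(default_factory=list)  # (conclusion, [premise, ...])

    def update(self, observation_frames):
        """Dynamically extend the memory with frames extracted from an observation."""
        self.facts |= set(observation_frames)

def prove(memory, goal, trace=None):
    """Naive backward chaining over the memory. The inference chain is
    recorded in `trace`, keeping every conclusion interpretable."""
    if trace is None:
        trace = []
    if goal in memory.facts:
        trace.append(("fact", goal))
        return True, trace
    for conclusion, premises in memory.rules:
        if conclusion == goal:
            if all(prove(memory, premise, trace)[0] for premise in premises):
                trace.append(("rule", goal, premises))
                return True, trace
    return False, trace

# Two agents with divergent belief systems observe the same utterance.
observation = [("raise", "government", "taxes")]

optimist = PersonalDynamicMemory(
    facts={("funds", "taxes", "services")},
    rules=[(("good", "raise-taxes"),
            [("raise", "government", "taxes"), ("funds", "taxes", "services")])],
)
skeptic = PersonalDynamicMemory(
    rules=[(("bad", "raise-taxes"),
            [("raise", "government", "taxes")])],
)

for agent in (optimist, skeptic):
    agent.update(observation)

# The same observation yields divergent, traceable conclusions per agent.
ok_optimist, trace_optimist = prove(optimist, ("good", "raise-taxes"))
ok_skeptic, trace_skeptic = prove(skeptic, ("bad", "raise-taxes"))
```

The returned traces expose the full chain of inference operations, mirroring how the interpretability of the reasoning engine is described above: each conclusion can be unpacked into the facts and rules that produced it.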

References

Katrien Beuls and Paul Van Eecke. 2023. Fluid Construction Grammar: State of the art and future outlook. In Proceedings of the First International Workshop on Construction Grammars and NLP (CxGs+NLP, GURT/SyntaxFest 2023), pages 41–50, Washington, D.C., USA. Association for Computational Linguistics.

Nieves Montes and Carles Sierra. 2022. Synthesis and properties of optimally value-aligned normative systems. Journal of Artificial Intelligence Research, 74:1739–1774.

Luc Steels. 2020. Personal dynamic memories are necessary to deal with meaning and understanding in human-centric AI. In Proceedings of the First International Workshop on New Foundations for Human-Centered Artificial Intelligence (NeHuAI@ECAI), pages 11–16.

Remi van Trijp, Katrien Beuls, and Paul Van Eecke. 2022. The FCG editor: An innovative environment for engineering computational construction grammars. PLOS ONE, 17(6):e0269708.

The 33rd Meeting of Computational Linguistics in The Netherlands (CLIN 33)
UAntwerpen City Campus: Building R
Rodestraat 14, Antwerp, Belgium
22 September 2023