• Media type: E-article
  • Title: Monotonicity Reasoning in the Age of Neural Foundation Models
  • Contributors: Chen, Zeming; Gao, Qiyue
  • Published: Springer Science and Business Media LLC, 2024
  • Published in: Journal of Logic, Language and Information, 33 (2024) 1, pages 49-68
  • Language: English
  • DOI: 10.1007/s10849-023-09411-3
  • ISSN: 0925-8531; 1572-9583
  • Keywords: Linguistics and Language ; Philosophy ; Computer Science (miscellaneous)
  • Description: Abstract: The recent advance of large language models (LLMs) demonstrates that these large-scale foundation models achieve remarkable capabilities across a wide range of language tasks and domains. The success of the statistical learning approach challenges our understanding of traditional symbolic and logical reasoning. The first part of this paper surveys several works on the progress of monotonicity reasoning with neural networks and deep learning. We present different methods for solving the monotonicity reasoning task using neural and symbolic approaches and discuss their advantages and limitations. The second part of this paper analyzes the capability of large-scale general-purpose language models to reason with monotonicity.