Structural generalization in COGS: Supertagging is (almost) all you need - Traitement du Langage Parlé
Conference paper, Year: 2023


Alban Petit
Caio Corro
François Yvon

Abstract

In many Natural Language Processing applications, neural networks have been found to fail to generalize on out-of-distribution examples. In particular, several recent semantic parsing datasets have highlighted important limitations of neural networks in cases where compositional generalization is required. In this work, we extend a neural graph-based semantic parsing framework in several ways to alleviate this issue. Notably, we propose: (1) the introduction of a supertagging step with valency constraints, expressed as an integer linear program; (2) a reduction of the graph prediction problem to the maximum matching problem; (3) the design of an incremental early-stopping training strategy to prevent overfitting. Experimentally, our approach significantly improves results on examples that require structural generalization in the COGS dataset, a known challenging benchmark for compositional generalization. Overall, our results confirm that structural constraints are important for generalization in semantic parsing.
Main file

2023.emnlp-main.69.pdf (344.86 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-04382463, version 1 (09-01-2024)

Identifiers

Cite

Alban Petit, Caio Corro, François Yvon. Structural generalization in COGS: Supertagging is (almost) all you need. Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, Dec 2023, Singapore. pp.1089-1101, ⟨10.18653/v1/2023.emnlp-main.69⟩. ⟨hal-04382463⟩
228 views
64 downloads
