• Media type: E-article
  • Title: Decoding Methods in Neural Language Generation: A Survey
  • Contributors: Zarrieß, Sina; Voigt, Henrik; Schüz, Simeon
  • Published: MDPI AG, 2021
  • Published in: Information, 12 (2021) 9, Page 355
  • Language: English
  • DOI: 10.3390/info12090355
  • ISSN: 2078-2489
  • Keywords: Information Systems
  • Description: Neural encoder-decoder models for language generation can be trained to predict words directly from linguistic or non-linguistic inputs. When generating with these so-called end-to-end models, however, the NLG system needs an additional decoding procedure that determines the output sequence, given the infinite search space over potential sequences that could be generated with the given vocabulary. This survey paper provides an overview of the different ways of implementing decoding on top of neural network-based generation models. Research into decoding has become a real trend in the area of neural language generation, and numerous recent papers have shown that the choice of decoding method has a considerable impact on the quality and various linguistic properties of the generation output of a neural NLG system. This survey aims to contribute to a more systematic understanding of decoding methods across different areas of neural NLG. We group the reviewed methods with respect to the broad type of objective that they optimize in the generation of the sequence—likelihood, diversity, and task-specific linguistic constraints or goals—and discuss their respective strengths and weaknesses.
  • Access status: Open access
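
The abstract contrasts likelihood-oriented and diversity-oriented decoding objectives. As a rough illustration only (not taken from the surveyed paper), the sketch below compares greedy decoding with temperature sampling over a hypothetical toy model; the vocabulary, the `toy_next_token_logits` function, and all probabilities are invented placeholders for a real neural language model.

```python
# Illustrative sketch: greedy decoding (likelihood-oriented) vs. temperature
# sampling (diversity-oriented). The "model" below is a hypothetical stand-in.
import numpy as np

VOCAB = ["the", "a", "cat", "dog", "sat", "ran", "<eos>"]

def toy_next_token_logits(prefix: list[str]) -> np.ndarray:
    """Hypothetical stand-in for a neural LM's next-token logits."""
    rng = np.random.default_rng(abs(hash(tuple(prefix))) % (2**32))
    return rng.normal(size=len(VOCAB))

def greedy_decode(max_len: int = 10) -> list[str]:
    """Always pick the single most probable next token."""
    out: list[str] = []
    for _ in range(max_len):
        logits = toy_next_token_logits(out)
        token = VOCAB[int(np.argmax(logits))]
        if token == "<eos>":
            break
        out.append(token)
    return out

def sample_decode(temperature: float = 1.0, max_len: int = 10) -> list[str]:
    """Sample from the softmax distribution; higher temperature flattens it,
    trading likelihood for more varied output."""
    out: list[str] = []
    for _ in range(max_len):
        logits = toy_next_token_logits(out) / temperature
        probs = np.exp(logits - logits.max())
        probs /= probs.sum()
        token = VOCAB[int(np.random.choice(len(VOCAB), p=probs))]
        if token == "<eos>":
            break
        out.append(token)
    return out

if __name__ == "__main__":
    print("greedy :", greedy_decode())
    print("sampled:", sample_decode(temperature=1.2))
```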