Integrating Translation Memories into Non-Autoregressive Machine Translation - Laboratoire Interdisciplinaire des Sciences du Numérique
Conference paper - Year: 2023

Integrating Translation Memories into Non-Autoregressive Machine Translation

Jitao Xu
  • Role: Author
  • PersonId : 184998
  • IdHAL : xujitao
Josep Crego
  • Role: Author
  • PersonId : 1178167
François Yvon

Abstract

Non-autoregressive machine translation (NAT) has recently made great progress. However, most works to date have focused on standard translation tasks, even though some edit-based NAT models, such as the Levenshtein Transformer (LevT), seem well suited to translating with a Translation Memory (TM). This is the scenario considered here. We first analyze the vanilla LevT model and explain why it does not perform well in this setting. We then propose a new variant, TM-LevT, and show how to train it effectively. By modifying the data presentation and introducing an extra deletion operation, we obtain performance on par with an autoregressive approach, while reducing the decoding load. We also show that incorporating TMs during training dispenses with the need for knowledge distillation, a well-known trick used to mitigate the multimodality issue.
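
The abstract describes refining a retrieved TM match with an edit-based decoder rather than generating a translation from scratch. The minimal sketch below, in plain Python, only illustrates that idea; the model heads (predict_deletions, predict_insertions, fill_placeholders) are hypothetical names standing in for the deletion, placeholder-insertion and token-prediction components of a LevT-style model, and do not come from the paper or its released code.

# Illustrative sketch only: edit-based decoding initialised with a TM match.
def edit_decode_from_tm(source_tokens, tm_target_tokens, model, max_iters=10):
    """Iteratively refine a TM target with delete / insert / fill steps."""
    hypothesis = list(tm_target_tokens)  # start from the TM match, not an empty sequence
    for _ in range(max_iters):
        # 1) Deletion head: drop TM tokens that do not fit the source sentence.
        keep = model.predict_deletions(source_tokens, hypothesis)
        hypothesis = [tok for tok, k in zip(hypothesis, keep) if k]

        # 2) Insertion head: predict how many placeholders follow each kept token.
        slots = model.predict_insertions(source_tokens, hypothesis)
        padded = []
        for tok, n in zip(hypothesis, slots):
            padded.append(tok)
            padded.extend(["<plh>"] * n)

        # 3) Token head: fill all placeholders in parallel.
        filled = model.fill_placeholders(source_tokens, padded)
        if filled == hypothesis:  # no edits were made: the hypothesis has converged
            break
        hypothesis = filled
    return hypothesis

The key point carried over from the abstract is the initialisation: starting the edit loop from a TM match means the model must learn to delete unrelated TM tokens, which motivates the extra deletion operation introduced in TM-LevT.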
Main file
eacl2023_arxiv.pdf (302.9 KB)
Origin: Files produced by the author(s)
License: CC BY - Attribution

Dates and versions

hal-03995339, version 1 (18-02-2023)

License

CC BY - Attribution

Identifiers

  • HAL Id: hal-03995339, version 1

Cite

Jitao Xu, Josep Crego, François Yvon. Integrating Translation Memories into Non-Autoregressive Machine Translation. EACL 2023, May 2023, Dubrovnik, Croatia. ⟨hal-03995339⟩
