Latent Group Dropout for Multilingual and Multidomain Machine Translation

Minh Quang Pham (1,2), Josep Crego (2), François Yvon (1)
(1) TLP - Traitement du Langage Parlé, LIMSI - Laboratoire d'Informatique pour la Mécanique et les Sciences de l'Ingénieur
Abstract : Multidomain and multilingual machine translation often relies on parameter sharing strategies, in which large portions of the network capture the commonalities of the tasks at hand, while smaller parts are reserved to model the peculiarities of a language or a domain. In adapter-based approaches, these strategies are hardcoded in the network architecture, independently of the similarities between tasks. In this work, we propose a new method to better exploit these similarities, using a latent-variable model. We also develop new techniques to train this model end-to-end, and report experimental results showing that the learned patterns are both meaningful and yield improved translation performance without any increase in model size.
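To make the idea of group dropout over adapters concrete, here is a minimal NumPy sketch, not the authors' implementation: each task (a language or domain) holds a learned categorical distribution over K adapter groups, a subset of groups is sampled and kept at each step, and only the surviving groups' adapters are applied. All names, shapes, and the fixed `keep` budget are illustrative assumptions; the paper's end-to-end training of the latent assignment is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

K = 4            # number of adapter groups (assumed)
d = 8            # toy hidden size
n_tasks = 3      # e.g. language/domain pairs

# Per-task logits over groups; in the paper these sharing patterns
# are learned end-to-end, here they are just random placeholders.
task_logits = rng.normal(size=(n_tasks, K))

def group_probs(logits):
    """Softmax over adapter groups for one task."""
    e = np.exp(logits - logits.max())
    return e / e.sum()

def sample_group_mask(probs, keep=2, rng=rng):
    """Sample `keep` groups without replacement; drop the rest."""
    kept = rng.choice(len(probs), size=keep, replace=False, p=probs)
    mask = np.zeros(len(probs))
    mask[kept] = 1.0
    return mask

# Toy adapters: one residual linear map per group.
adapters = rng.normal(scale=0.1, size=(K, d, d))

def apply_adapters(h, mask):
    """Add the contribution of each adapter group that survived dropout."""
    out = h.copy()
    for k in range(K):
        if mask[k]:
            out += h @ adapters[k]
    return out

h = rng.normal(size=d)             # a hidden state for task 0
probs = group_probs(task_logits[0])
mask = sample_group_mask(probs, keep=2)
out = apply_adapters(h, mask)
```

Because the group assignment is a distribution rather than a hardcoded mapping, similar tasks can converge toward overlapping group choices, which is the sharing behavior the abstract describes; the adapters themselves add no parameters beyond the fixed K groups.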
Document type :
Conference papers
Contributor : François Yvon
Submitted on : Monday, July 11, 2022 - 11:34:52 PM
Last modification on : Tuesday, August 9, 2022 - 4:14:51 PM


Publisher files allowed on an open archive


  • HAL Id : hal-03720395, version 1


Minh Quang Pham, Josep Crego, François Yvon. Latent Group Dropout for Multilingual and Multidomain Machine Translation. Findings of the ACL: NAACL 2022, Association for Computational Linguistics, Jul 2022, Seattle, United States. ⟨hal-03720395⟩
