Scalable cross-lingual transfer of neural sentence embeddings

Hanan Aldarmaki, The George Washington University & Mohamed bin Zayed University of Artificial Intelligence
Mona Diab, The George Washington University

Abstract

We develop and investigate several cross-lingual alignment approaches for neural sentence embedding models, such as the supervised inference classifier InferSent and sequential encoder-decoder models. We evaluate three alignment frameworks applied to these models: joint modeling, representation transfer learning, and sentence mapping, using parallel text to guide the alignment. Our results support representation transfer as a scalable approach for modular cross-lingual alignment of neural sentence embeddings: it outperforms joint models in both intrinsic and extrinsic evaluations, particularly with smaller sets of parallel data.

© 2019 Association for Computational Linguistics
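The sentence mapping framework mentioned in the abstract can be illustrated with a minimal least-squares sketch: given embeddings of parallel sentence pairs, learn a linear map from the source-language embedding space into the target-language space. This is an illustrative sketch only; the function names, dimensions, and the plain least-squares objective are assumptions, not the paper's implementation.

```python
import numpy as np

def learn_sentence_mapping(src_emb, tgt_emb):
    """Learn a linear map W minimizing ||src_emb @ W - tgt_emb||_F
    over embeddings of parallel sentence pairs (ordinary least squares).
    src_emb: (n, d_src) source-language sentence embeddings.
    tgt_emb: (n, d_tgt) target-language embeddings of the translations."""
    W, *_ = np.linalg.lstsq(src_emb, tgt_emb, rcond=None)
    return W

# Toy parallel corpus: 100 sentence pairs with 16-dim embeddings,
# where the target space is an exact linear transform of the source.
rng = np.random.default_rng(0)
src = rng.standard_normal((100, 16))
true_map = rng.standard_normal((16, 16))
tgt = src @ true_map

W = learn_sentence_mapping(src, tgt)
print(np.allclose(src @ W, tgt, atol=1e-6))  # the map is recovered
```

After learning W, source-language sentence embeddings can be projected into the target space and compared directly with target-language embeddings, which is the modularity that makes mapping-style alignment attractive.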