Exploring Unsupervised Pretraining Objectives for Machine Translation (2021)
Attributed to: Peta-5: A National Facility for Petascale Data Intensive Computation and Analytics (funded by EPSRC)
Abstract
No abstract provided
Bibliographic Information
Type: Conference paper
Parent Publication: Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021