Losing heads in the lottery: Pruning transformer attention in neural machine translation (2020)
Attributed to: Peta-5: A National Facility for Petascale Data Intensive Computation and Analytics (funded by EPSRC)
Abstract
No abstract provided
Bibliographic Information
Type: Other
Parent Publication: EMNLP 2020 - Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing