Focus areas: Natural Language Processing and Machine Learning, Digital Linguistics
Publications
- Understanding Pure Character-Based Neural Machine Translation: The Case of Translating Finnish into English. In Proceedings of the 28th International Conference on Computational Linguistics (COLING), 4251–4262. https://www.aclweb.org/anthology/2020.coling-main.375
- Zero-Shot Crosslingual Sentence Simplification. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), 5109–5126. https://www.aclweb.org/anthology/2020.emnlp-main.415
- Subword segmentation and a single bridge language affect zero-shot neural machine translation. In Proceedings of the Fifth Conference on Machine Translation (WMT), 526–535. http://www.statmt.org/wmt20/pdf/2020.wmt-1.64.pdf
- On Romanization for Model Transfer Between Scripts in Neural Machine Translation. In Findings of the Association for Computational Linguistics: EMNLP 2020, 2461–2469. https://www.aclweb.org/anthology/2020.findings-emnlp.223
- Detecting Word Sense Disambiguation Biases in Machine Translation for Model-Agnostic Adversarial Attacks. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), 7635–7653. https://www.aclweb.org/anthology/2020.emnlp-main.616
- Adaptive Feature Selection for End-to-End Speech Translation. In Findings of the Association for Computational Linguistics: EMNLP 2020, 2533–2544. https://www.aclweb.org/anthology/2020.findings-emnlp.230
- Domain robustness in neural machine translation. In Proceedings of the 14th Conference of the Association for Machine Translation in the Americas, 151–164. https://www.aclweb.org/anthology/2020.amta-research.14
- In Neural Machine Translation, What Does Transfer Learning Transfer? In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics (ACL), 7701–7710. https://www.aclweb.org/anthology/2020.acl-main.688
- On Exposure Bias, Hallucination and Domain Shift in Neural Machine Translation. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics (ACL), 3544–3552. https://www.aclweb.org/anthology/2020.acl-main.326
- Improving Massively Multilingual Neural Machine Translation and Zero-Shot Translation. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics (ACL), 1628–1639. https://www.aclweb.org/anthology/2020.acl-main.148
- X-Stance: A Multilingual Multi-Target Dataset for Stance Detection. In Proceedings of the 5th Swiss Text Analytics Conference (SwissText) & 16th Conference on Natural Language Processing (KONVENS), Zurich. http://ceur-ws.org/Vol-2624/paper9.pdf
- A Set of Recommendations for Assessing Human–Machine Parity in Language Translation. Journal of Artificial Intelligence Research, 67, 653–672. https://doi.org/10.1613/jair.1.11371
- Root Mean Square Layer Normalization. In Advances in Neural Information Processing Systems 32, Vancouver. http://papers.nips.cc/paper/9403-root-mean-square-layer-normalization.pdf
- Improving Deep Transformer with Depth-Scaled Initialization and Merged Attention. In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), 897–908. https://www.aclweb.org/anthology/D19-1083.pdf
- The Bottom-up Evolution of Representations in the Transformer: A Study with Machine Translation and Language Modeling Objectives. In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), 4387–4397. https://www.aclweb.org/anthology/D19-1448.pdf
- Context-Aware Monolingual Repair for Neural Machine Translation. In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), 876–885. https://www.aclweb.org/anthology/D19-1081.pdf
- Encoders Help You Disambiguate Word Senses in Neural Machine Translation. In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), 1429–1435. https://doi.org/10.18653/v1/D19-1149
- Samsung and University of Edinburgh’s System for the IWSLT 2019. In Proceedings of the 16th International Workshop on Spoken Language Translation, Hong Kong. https://doi.org/10.5281/zenodo.3525537
- Understanding Neural Machine Translation by Simplification: The Case of Encoder-free Models. In G. Angelova, R. Mitkov, I. Nikolova, & I. Temnikova (Eds.), Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2019), 1186–1193. INCOMA. https://doi.org/10.26615/978-954-452-056-4_136