When and Why Are Pre-Trained Word Embeddings Useful for Neural Machine Translation?
Ye Qi | Devendra Sachan | Matthieu Felix | Sarguna Padmanabhan | Graham Neubig
Paper Details:
Month: June
Year: 2018
Location: New Orleans, Louisiana
Venue: NAACL

Citations:
Rapid Adaptation of Neural Machine Translation to New Languages
Graham Neubig | Junjie Hu

URLs:
https://github.com/neulab/word-embeddings-for-nmt
http://www.statmt.org/wmt17/
https://www.ted.com/participate/translate
https://github.com/moses-smt/mosesdecoder/blob/
https://github.com/neulab/xnmt/
https://github.com/facebookresearch/fastText/
https://dumps.wikimedia.org/
https://github.com/neubig/util-scripts/
https://arxiv.org/abs/1409.0473
https://books.google.com/
Field of Study:
Linguistic Trends: Embeddings
Task: Tagging | Machine Translation
Language: Multilingual | English

Similar Papers:
Creating a Large Multi-Layered Representational Repository of Linguistic Code Switched Arabic Data
Mona Diab | Mahmoud Ghoneim | Abdelati Hawwari | Fahad AlGhamdi | Nada AlMarwani | Mohamed Al-Badrashiny
Enriching Word Vectors with Subword Information
Piotr Bojanowski | Edouard Grave | Armand Joulin | Tomas Mikolov
A Comparative Study of Minimally Supervised Morphological Segmentation
Teemu Ruokolainen | Oskar Kohonen | Kairit Sirts | Stig-Arne Grönroos | Mikko Kurimo | Sami Virpioja
Knowledge-Rich Morphological Priors for Bayesian Language Models
Victor Chahuneau | Noah A. Smith | Chris Dyer
A Survey of Arabic Named Entity Recognition and Classification
Khaled Shaalan