How Much Attention Do You Need? A Granular Analysis of Neural Machine Translation Architectures
Tobias Domhan

Paper Details:
Month: July
Year: 2018
Location: Melbourne, Australia
Venue: ACL
Why Self-Attention? A Targeted Evaluation of Neural Machine Translation Architectures
Gongbo Tang | Mathias Müller | Annette Rios | Rico Sennrich
An Analysis of Attention Mechanisms: The Case of Word Sense Disambiguation in Neural Machine Translation
Gongbo Tang | Rico Sennrich | Joakim Nivre
https://github.com/tensorflow/tensor2tensor
https://github.com/awslabs/sockeye/tree/acl18
https://github.com/moses-smt/mosesdecoder/
https://github.com/tensorflow/tensor2tensor/blob/
Field Of Study:
Linguistic Trends: Embeddings
Task: Machine Translation
Language: English
Dataset: News
Similar Papers
Nonparametric Word Segmentation for Machine Translation
ThuyLinh Nguyen | Stephan Vogel | Noah A. Smith
A treebank-based study on the influence of Italian word order on parsing performance
Anita Alicante | Cristina Bosco | Anna Corazza | Alberto Lavelli
Integrating Graph-Based and Transition-Based Dependency Parsers
Joakim Nivre | Ryan McDonald
Extending Statistical Machine Translation with Discriminative and Trigger-Based Lexicon Models
Arne Mauser | Saša Hasan | Hermann Ney
Improving Arabic-Chinese Statistical Machine Translation using English as Pivot Language
Nizar Habash | Jun Hu