Autoregressive sequence modeling, where the two sequences (source and target) belong to different spaces.
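A brief formal statement of this setup, in standard NMT notation (not taken verbatim from either paper): the target sequence is generated autoregressively, conditioned on a source sequence drawn from a different vocabulary.

$$
p(\mathbf{y} \mid \mathbf{x}) \;=\; \prod_{t=1}^{T_y} p\big(y_t \mid y_1, \dots, y_{t-1}, \mathbf{x}\big),
\qquad x_i \in \mathcal{V}_{\text{src}},\;\; y_t \in \mathcal{V}_{\text{tgt}}
$$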
Two papers:
- Seq2Seq & Attention - Neural Machine Translation by Jointly Learning to Align and Translate
    - Translation & Alignment, learned jointly
- Seq2Seq & Local Attention - Effective Approaches to Attention-based Neural Machine Translation
    - content-based score functions: computed from both the encoder and decoder hidden states (content)
    - location-based score function: computed from the decoder hidden state alone (position only)
    - local attention: Monotonic (local-m) vs. Predictive (local-p) alignment, the latter weighting the scores with a Gaussian distribution centered on the predicted source position (see the sketch after this list)
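A minimal NumPy sketch of the ideas above (names such as `W_a`, `W_p`, `v_p` and all shapes are illustrative, not the papers' code): two of the content-based scores, the location-based score, and local-p attention with its Gaussian window.

```python
import numpy as np

def softmax(x):
    x = x - x.max()
    e = np.exp(x)
    return e / e.sum()

# Hypothetical shapes: d_h = hidden size, S = source length.
# h_t : current decoder hidden state, shape (d_h,)
# H_s : encoder hidden states, shape (S, d_h)

def score_dot(h_t, H_s):
    # content-based "dot" score: uses encoder and decoder states
    return H_s @ h_t                                  # (S,)

def score_general(h_t, H_s, W_a):
    # content-based "general" score with a learned matrix W_a of shape (d_h, d_h)
    return H_s @ (W_a @ h_t)                          # (S,)

def score_location(h_t, W_a, S):
    # location-based score: computed from the decoder state alone;
    # W_a maps the decoder state to one logit per source position (S_max >= S)
    return (W_a @ h_t)[:S]                            # (S,)

def local_p_weights(h_t, H_s, W_p, v_p, D=2):
    # local-p (predictive) attention: predict a center position p_t in [0, S],
    # then multiply the content-based weights by a Gaussian centered at p_t
    S = H_s.shape[0]
    p_t = S / (1.0 + np.exp(-(v_p @ np.tanh(W_p @ h_t))))   # sigmoid(...) * S
    a = softmax(score_dot(h_t, H_s))
    positions = np.arange(S)
    sigma = D / 2.0
    gauss = np.exp(-((positions - p_t) ** 2) / (2 * sigma ** 2))
    return a * gauss

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d_h, S = 4, 6
    h_t = rng.normal(size=d_h)
    H_s = rng.normal(size=(S, d_h))
    W_p, v_p = rng.normal(size=(d_h, d_h)), rng.normal(size=d_h)
    print(softmax(score_dot(h_t, H_s)))          # global, content-based weights
    print(local_p_weights(h_t, H_s, W_p, v_p))   # local-p weights
```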
Neural Machine Translation by Jointly Learning to Align and Translate
Machine Translation (MT)
NMT: Neural Machine Translation
Encoder-Decoder architecture; decoding continues until a stop token is generated.
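As a reminder of how generation terminates, a minimal sketch of greedy encoder-decoder decoding until the stop token (`encode`, `decode_step`, and the token id here are hypothetical placeholders, not the paper's code):

```python
EOS = 2  # hypothetical stop-token id

def greedy_decode(encode, decode_step, src_ids, max_len=50):
    enc_states = encode(src_ids)              # run the encoder once over the source
    state, out, prev = None, [], EOS          # decoder state, outputs, previous token
    for _ in range(max_len):
        logits, state = decode_step(prev, state, enc_states)
        prev = int(logits.argmax())           # greedy choice of the next target token
        if prev == EOS:                       # generation ends at the stop token
            break
        out.append(prev)
    return out
```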
SOTA on English-to-French translation; replaces the fixed-length encoder vector with soft alignment.
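The soft alignment replaces the single fixed-length summary vector with a per-step context vector; in the paper's notation, with encoder annotations $h_j$ and decoder state $s_{i-1}$:

$$
e_{ij} = a(s_{i-1}, h_j), \qquad
\alpha_{ij} = \frac{\exp(e_{ij})}{\sum_{k=1}^{T_x} \exp(e_{ik})}, \qquad
c_i = \sum_{j=1}^{T_x} \alpha_{ij}\, h_j
$$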
Align and translate: the two components are learned jointly.
Translate:
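The translation side is the decoder: each target word is predicted from the previous word, the decoder state $s_i$, and the context vector $c_i$ (again in the paper's notation):

$$
p(y_i \mid y_1, \dots, y_{i-1}, \mathbf{x}) = g(y_{i-1}, s_i, c_i), \qquad
s_i = f(s_{i-1}, y_{i-1}, c_i)
$$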