Transformer

Transformer: a Seq2Seq model built on multi-head self-attention.
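A minimal NumPy sketch of the multi-head self-attention at the core of the Transformer: queries, keys, and values are projected per head, scaled dot-product attention is computed in each head, and the heads are concatenated and projected back. All weight matrices and dimensions below are illustrative, untrained random values.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_self_attention(X, Wq, Wk, Wv, Wo, num_heads):
    """Multi-head scaled dot-product self-attention over a sequence X.

    X: (seq_len, d_model); Wq, Wk, Wv, Wo: (d_model, d_model).
    """
    seq_len, d_model = X.shape
    d_head = d_model // num_heads
    # Project, then split the model dimension into per-head slices.
    Q = (X @ Wq).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    K = (X @ Wk).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    V = (X @ Wv).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    # Per head: softmax(Q K^T / sqrt(d_head)) V
    scores = Q @ K.transpose(0, 2, 1) / np.sqrt(d_head)   # (heads, seq, seq)
    heads = softmax(scores, axis=-1) @ V                  # (heads, seq, d_head)
    # Concatenate heads and apply the output projection.
    concat = heads.transpose(1, 0, 2).reshape(seq_len, d_model)
    return concat @ Wo

rng = np.random.default_rng(0)
seq_len, d_model, num_heads = 5, 8, 2
X = rng.normal(size=(seq_len, d_model))
Ws = [rng.normal(size=(d_model, d_model)) * 0.1 for _ in range(4)]
out = multi_head_self_attention(X, *Ws, num_heads=num_heads)
```

Each head attends over the full sequence independently, so different heads can specialize in different pairwise relations between positions.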

Self-Attention Mechanism
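The single-head case can be sketched directly from the defining formula, softmax(Q Kᵀ / √d_k) V: every position produces a query, key, and value, and its output is a similarity-weighted mixture of all values. The matrices below are random stand-ins, not trained parameters.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention: softmax(Q K^T / sqrt(d_k)) V."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])        # (seq, seq) similarities
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w = w / w.sum(axis=-1, keepdims=True)          # row-wise softmax
    return w @ V, w

rng = np.random.default_rng(1)
X = rng.normal(size=(4, 6))                        # 4 tokens, d_model = 6
Wq, Wk, Wv = (rng.normal(size=(6, 6)) for _ in range(3))
ctx, attn = self_attention(X, Wq, Wk, Wv)
```

Each row of `attn` is a probability distribution over the sequence: how much that token attends to every other token.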

Memory-Augmented Neural Network
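A core operation in memory-augmented networks (e.g. Neural Turing Machine-style models) is a content-based read: score each memory row by cosine similarity to a query key, sharpen with a temperature β, normalize with softmax, and return the weighted sum of rows. The memory contents here are toy values for illustration.

```python
import numpy as np

def content_read(memory, key, beta=1.0):
    """Content-based addressing: cosine similarity, sharpened by beta,
    normalized with softmax, then a weighted sum over memory rows."""
    sim = memory @ key / (np.linalg.norm(memory, axis=1)
                          * np.linalg.norm(key) + 1e-8)
    w = np.exp(beta * sim)
    w = w / w.sum()            # soft read weights over memory slots
    return w @ memory, w

M = np.array([[1.0, 0.0],      # slot 0: matches the query exactly
              [0.0, 1.0],      # slot 1: orthogonal to the query
              [0.7, 0.7]])     # slot 2: partial match
r, w = content_read(M, np.array([1.0, 0.0]), beta=5.0)
```

With a large β the read concentrates on the best-matching slot; with β near zero it blends all slots almost uniformly.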

Attention Mechanism in Seq2Seq Models
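In a seq2seq model, attention lets each decoder step look back at all encoder hidden states instead of a single fixed context vector. A Bahdanau-style additive score eᵢ = vᵀ tanh(Wₐ s + Uₐ hᵢ) is one standard choice; the sketch below uses random weights purely to show the shapes and data flow.

```python
import numpy as np

def additive_attention(s, H, Wa, Ua, v):
    """Bahdanau-style additive attention:
    e_i = v^T tanh(Wa s + Ua h_i); w = softmax(e); context = sum_i w_i h_i."""
    e = np.tanh(s @ Wa + H @ Ua) @ v               # one score per encoder step
    w = np.exp(e - e.max())
    w = w / w.sum()                                # softmax over encoder steps
    return w @ H, w

rng = np.random.default_rng(2)
d = 5
s = rng.normal(size=d)                             # current decoder state
H = rng.normal(size=(7, d))                        # 7 encoder hidden states
Wa, Ua = rng.normal(size=(d, d)), rng.normal(size=(d, d))
v = rng.normal(size=d)
context, w = additive_attention(s, H, Wa, Ua, v)
```

The context vector is recomputed at every decoder step, so the model can align each output token with the most relevant input positions.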

Sequence-to-Sequence (Seq2Seq) Model
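The basic encoder-decoder loop can be sketched with a vanilla RNN: the encoder compresses the source sequence into its final hidden state, and the decoder generates tokens greedily, feeding each output back in until an end-of-sequence token. All weights, the vocabulary, and the BOS/EOS ids below are made-up illustrations, not a trained model.

```python
import numpy as np

def rnn_step(x, h, Wxh, Whh):
    """One vanilla-RNN update of the hidden state."""
    return np.tanh(x @ Wxh + h @ Whh)

def seq2seq_greedy(src, embed, Wxh, Whh, Wout, bos, eos, max_len=8):
    # Encoder: fold the source sequence into the final hidden state.
    h = np.zeros(Whh.shape[0])
    for tok in src:
        h = rnn_step(embed[tok], h, Wxh, Whh)
    # Decoder: emit one token at a time, feeding each output back in.
    out, tok = [], bos
    for _ in range(max_len):
        h = rnn_step(embed[tok], h, Wxh, Whh)
        tok = int(np.argmax(h @ Wout))    # greedy choice over the vocabulary
        if tok == eos:
            break
        out.append(tok)
    return out

rng = np.random.default_rng(3)
vocab, d = 10, 6
embed = rng.normal(size=(vocab, d))
Wxh, Whh = rng.normal(size=(d, d)) * 0.5, rng.normal(size=(d, d)) * 0.5
Wout = rng.normal(size=(d, vocab))
ys = seq2seq_greedy([4, 2, 7], embed, Wxh, Whh, Wout, bos=0, eos=1)
```

The single fixed-size hidden state is the bottleneck that the attention mechanism in the previous section was introduced to relieve.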