Transformer
Transformer: a Seq2Seq model based on multi-head self-attention.
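To make the "multi-head self-attention" idea concrete, here is a minimal NumPy sketch of scaled dot-product attention split across heads; all weight names and shapes are illustrative assumptions, not the reference implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_self_attention(x, w_q, w_k, w_v, w_o, num_heads):
    """x: (seq_len, d_model); all weight matrices: (d_model, d_model).
    Hypothetical signature for illustration only."""
    seq_len, d_model = x.shape
    d_head = d_model // num_heads

    # Project into queries/keys/values, then split into heads:
    # (seq_len, d_model) -> (num_heads, seq_len, d_head)
    def split(m):
        return m.reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)

    q, k, v = split(x @ w_q), split(x @ w_k), split(x @ w_v)

    # Scaled dot-product attention, computed independently per head
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)
    out = softmax(scores) @ v                       # (num_heads, seq_len, d_head)

    # Concatenate heads and apply the output projection
    out = out.transpose(1, 0, 2).reshape(seq_len, d_model)
    return out @ w_o

rng = np.random.default_rng(0)
d_model, seq_len, heads = 8, 5, 2
x = rng.standard_normal((seq_len, d_model))
w_q, w_k, w_v, w_o = (rng.standard_normal((d_model, d_model)) * 0.1
                      for _ in range(4))
y = multi_head_self_attention(x, w_q, w_k, w_v, w_o, num_heads=heads)
print(y.shape)  # (5, 8): output keeps the input sequence shape
```

The output has the same shape as the input, which is what lets a Transformer stack such layers; a full model would add residual connections, layer normalization, and position-wise feed-forward sublayers on top of this core.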
Self-Attention Mechanism.
Memory-Augmented Neural Networks.
Attention Mechanism in Seq2Seq Models.
Sequence-to-Sequence Models.
Capsule Networks.