Attention Mechanism — backup source (No. 5)

-[[Neural Machine Translation by Jointly Learning to Align and Translate&BR;Dzmitry Bahdanau, Kyunghyun Cho, Yoshua Bengio&BR;Submitted on 1 Sep 2014 (v1), last revised 19 May 2016 (this version, v7):https://arxiv.org/abs/1409.0473]]
-[[Attention Is All You Need&BR;Ashish Vaswani et al.&BR;2017:https://arxiv.org/abs/1706.03762]]
-[[Paper commentary: Attention Is All You Need (Transformer)&BR;Deep Learning Blog | 2017-12-21:http://deeplearning.hatenablog.com/entry/transformer]]
-[[The Unbearable Shortness of Attention in Natural Language Processing&BR;Qiita:https://qiita.com/icoxfog417/items/f170666d81f773e4b1a7]]
-[[A Commentary on the Transformer Paper, a Cornerstone of Deep Learning!&BR;Qiita:https://qiita.com/omiita/items/07e69aef6c156d23c538]]
-[[A Rough Guide to Distributed Representations, Attention, Self-Attention, and the Transformer&BR;Qiita:https://qiita.com/norihitoishida/items/2fead107792b504eaccf]]
-[[Understanding the Transformer / Attention by Building It&BR;Qiita | @halhorn | updated 2018-12-05:https://qiita.com/halhorn/items/c91497522be27bde17ce]]
-[[Recurrent Neural Networks Augmented with Attention&BR;DeepAge:https://deepage.net/deep_learning/2017/03/03/attention-augmented-recurrent-neural-networks.html]]
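The common thread of the references above is scaled dot-product attention, Attention(Q, K, V) = softmax(QKᵀ/√d_k)V, introduced in "Attention Is All You Need". As a minimal illustration (a NumPy sketch, not code from any of the linked articles; the toy shapes are arbitrary assumptions):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the row max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # similarity of each query to each key
    weights = softmax(scores, axis=-1)   # each row is a distribution over keys
    return weights @ V, weights          # weighted sum of values, plus weights

# Toy example: 3 query positions, 4 key/value positions, d_k = d_v = 2.
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 2))
K = rng.standard_normal((4, 2))
V = rng.standard_normal((4, 2))
out, w = scaled_dot_product_attention(Q, K, V)
print(out.shape)       # (3, 2): one output vector per query
print(w.sum(axis=-1))  # each row of attention weights sums to 1
```

Self-attention is the special case where Q, K, and V are all projections of the same sequence; the 1/√d_k scaling keeps the dot products from saturating the softmax as d_k grows.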