
Commit 67d1997

committed
d2l attention part finished
1 parent a8b29cd commit 67d1997

File tree

5 files changed: +860 additions, -6 deletions


1.txt

Lines changed: 0 additions & 1 deletion
This file was deleted.

2019_07_01/Attention/Translation With A Seq-To-Seq Network and Attention.ipynb

Lines changed: 10 additions & 5 deletions
Large diffs are not rendered by default.
Lines changed: 47 additions & 0 deletions
@@ -0,0 +1,47 @@
<!--
 * @Author: WANG Maonan
 * @Date: 2022-09-26 11:40:40
 * @Description: Introduction to the attention mechanism
 * @LastEditTime: 2023-01-25 02:14:25
-->
# File overview

## Introduction to the attention mechanism

- Nonparametric attention pooling (see the sketch below)
  - nonparametric_attention_pooling.ipynb
  - nonparametric_attention_pooling.py
- Parametric attention pooling
  - parametric_attention_pooling.ipynb
  - parametric_attention_pooling.py
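
A minimal sketch of nonparametric (Nadaraya-Watson) attention pooling on toy 1-D data; the Gaussian-kernel weights, variable names, and data are illustrative assumptions, not code taken from the files listed above.

```python
import torch

def nadaraya_watson(x_query, x_train, y_train):
    # Attention weights: softmax over negative squared distances between each
    # query and every training input (a Gaussian kernel), so nearby training
    # points receive larger weights.
    diff = x_query.unsqueeze(1) - x_train.unsqueeze(0)       # (n_query, n_train)
    attention_weights = torch.softmax(-0.5 * diff ** 2, dim=1)
    # Prediction: attention-weighted average of the training targets.
    return attention_weights @ y_train                        # (n_query,)

x_train = torch.sort(torch.rand(50) * 5).values
y_train = 2 * torch.sin(x_train) + torch.randn(50) * 0.5
y_hat = nadaraya_watson(torch.arange(0, 5, 0.1), x_train, y_train)
```
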
## Attention scores

- Additive attention
  - additive_attention.ipynb
  - additive_attention.py
- Scaled dot-product attention (see the sketch below)
  - scaled_dot_attention.ipynb
  - scaled_dot_attention.py
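
A minimal sketch of scaled dot-product attention without masking or dropout; the shapes and names are illustrative assumptions, not code taken from scaled_dot_attention.py.

```python
import math
import torch

def scaled_dot_product_attention(queries, keys, values):
    # queries: (batch, n_q, d); keys: (batch, n_k, d); values: (batch, n_k, d_v)
    d = queries.shape[-1]
    # Dot-product scores are scaled by sqrt(d) to keep their variance stable.
    scores = queries @ keys.transpose(1, 2) / math.sqrt(d)    # (batch, n_q, n_k)
    attention_weights = torch.softmax(scores, dim=-1)
    return attention_weights @ values                          # (batch, n_q, d_v)

q, k, v = torch.randn(2, 4, 8), torch.randn(2, 6, 8), torch.randn(2, 6, 16)
print(scaled_dot_product_attention(q, k, v).shape)  # torch.Size([2, 4, 16])
```
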
## Seq2seq with attention

- seq2seq_attention_example.py: tested with random data;
- seq2seq_attention_data.py: tested on a dataset; use this file to see how attention is computed inside the decoder (a toy sketch of that step follows below).
- seq2seq_attention.ipynb
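
A minimal sketch of the per-step attention inside a seq2seq decoder, where the decoder's last hidden state queries the encoder outputs; the dot-product scoring and all names and shapes here are illustrative assumptions, not code taken from the files above.

```python
import torch

batch, src_len, hidden = 2, 7, 16
enc_outputs = torch.randn(batch, src_len, hidden)   # encoder outputs: keys and values
dec_hidden = torch.randn(batch, hidden)             # decoder state: query for this step

# Score every encoder time step against the current decoder state.
scores = (enc_outputs @ dec_hidden.unsqueeze(-1)).squeeze(-1)        # (batch, src_len)
attention_weights = torch.softmax(scores, dim=-1)
# Context vector: weighted sum of encoder outputs, combined with the current
# target-token embedding before the next decoder RNN step.
context = (attention_weights.unsqueeze(1) @ enc_outputs).squeeze(1)  # (batch, hidden)
```
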
## Multi-head attention

- multihead_attention.py (see the sketch below)
- multihead_attention.ipynb
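
A minimal sketch of multi-head attention; the hyperparameters and the reshape-based head splitting are illustrative assumptions, not code taken from multihead_attention.py.

```python
import torch
from torch import nn

class MultiHeadAttention(nn.Module):
    def __init__(self, num_hiddens, num_heads):
        super().__init__()
        assert num_hiddens % num_heads == 0
        self.num_heads = num_heads
        self.W_q = nn.Linear(num_hiddens, num_hiddens)
        self.W_k = nn.Linear(num_hiddens, num_hiddens)
        self.W_v = nn.Linear(num_hiddens, num_hiddens)
        self.W_o = nn.Linear(num_hiddens, num_hiddens)

    def _split(self, x):
        # (batch, seq, hiddens) -> (batch * heads, seq, hiddens / heads)
        b, s, h = x.shape
        x = x.reshape(b, s, self.num_heads, h // self.num_heads)
        return x.permute(0, 2, 1, 3).reshape(b * self.num_heads, s, -1)

    def forward(self, queries, keys, values):
        q = self._split(self.W_q(queries))
        k = self._split(self.W_k(keys))
        v = self._split(self.W_v(values))
        # Scaled dot-product attention run independently for every head.
        weights = torch.softmax(q @ k.transpose(1, 2) / q.shape[-1] ** 0.5, dim=-1)
        out = weights @ v                      # (batch * heads, n_q, hiddens / heads)
        # Undo the head split and mix the heads with the output projection.
        b, n_q = queries.shape[0], queries.shape[1]
        out = out.reshape(b, self.num_heads, n_q, -1).permute(0, 2, 1, 3).reshape(b, n_q, -1)
        return self.W_o(out)

attn = MultiHeadAttention(num_hiddens=32, num_heads=4)
x = torch.randn(2, 10, 32)
print(attn(x, x, x).shape)  # torch.Size([2, 10, 32])
```
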
## Self-attention

- selfAttention_and_positionEncoding.ipynb (a positional-encoding sketch follows below)
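
A minimal sketch of the sinusoidal positional encoding that is added to token embeddings before self-attention; max_len and num_hiddens are illustrative values, not taken from the notebook.

```python
import torch

def positional_encoding(max_len, num_hiddens):
    # P[pos, 2i]   = sin(pos / 10000^(2i / num_hiddens))
    # P[pos, 2i+1] = cos(pos / 10000^(2i / num_hiddens))
    pos = torch.arange(max_len, dtype=torch.float32).unsqueeze(1)   # (max_len, 1)
    i = torch.arange(0, num_hiddens, 2, dtype=torch.float32)        # even dimensions
    angles = pos / torch.pow(torch.tensor(10000.0), i / num_hiddens)
    P = torch.zeros(max_len, num_hiddens)
    P[:, 0::2] = torch.sin(angles)
    P[:, 1::2] = torch.cos(angles)
    return P

# Added to the token embeddings before the first self-attention layer.
P = positional_encoding(max_len=60, num_hiddens=32)
```
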
## Transformer architecture

- transformer.py
- transformer.ipynb

0 commit comments
