1 file changed: +23 −0 lines changed

@@ -29,6 +29,29 @@ CuPy and pynvrtc needed to compile the CUDA code into a callable function at run
<br>

## Examples
+The usage of SRU is similar to `nn.LSTM`.
+```python
+import torch
+from cuda_functional import SRU, SRUCell
+
+# input has length 20, batch size 32 and dimension 128
+x = torch.FloatTensor(20, 32, 128).cuda()
+
+# input_size must match the input dimension; hidden_size = 128 is one possible choice
+input_size, hidden_size = 128, 128
+
+rnn = SRU(input_size, hidden_size,
+    num_layers = 2,            # number of stacking RNN layers
+    dropout = 0.0,             # dropout applied between RNN layers
+    rnn_dropout = 0.0,         # variational dropout applied on linear transformation (Wx)
+    use_tanh = 1,              # use tanh or identity activation
+    bidirectional = False      # bidirectional RNN?
+)
+rnn.cuda()                     # move parameters to the GPU, like the input
+
+output, hidden = rnn(x)        # forward pass
+
+# output is (length, batch size, hidden size * number of directions)
+# hidden is (layers, batch size, hidden size * number of directions)
+```
+
- [classification](/classification/)
- [question answering (SQuAD)](/DrQA/)
- [language modelling on PTB](/language_model/)
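
The added example states that SRU usage parallels `nn.LSTM`. Below is a minimal side-by-side sketch (not part of the diff), assuming a CUDA device is available and that `SRU` follows the constructor and forward signature shown above; the visible interface difference is that `nn.LSTM` returns an `(h, c)` pair as its hidden state, while SRU returns a single hidden tensor.

```python
import torch
import torch.nn as nn
from cuda_functional import SRU   # same import as in the added example

length, batch_size, input_size, hidden_size = 20, 32, 128, 128
x = torch.randn(length, batch_size, input_size).cuda()

# SRU, configured as in the README example above
sru = SRU(input_size, hidden_size, num_layers=2, bidirectional=False)
sru.cuda()
sru_out, sru_hidden = sru(x)     # (20, 32, 128) and (2, 32, 128), per the documented shapes

# The analogous nn.LSTM call; it returns (h, c) rather than a single hidden tensor
lstm = nn.LSTM(input_size, hidden_size, num_layers=2, bidirectional=False).cuda()
lstm_out, (h, c) = lstm(x)       # lstm_out: (20, 32, 128); h and c: (2, 32, 128)
```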