This repository provides a comparative analysis of TensorFlow and PyTorch, for those who want to learn TensorFlow while already being familiar with PyTorch, or vice versa.

The whole content was written in IPython Notebook and then converted into Markdown. The IPython Notebooks in the main directory contain the same content.

## TABLE OF CONTENTS

[**01. Tensor**](https://github.com/tango4j/tensorflow-vs-pytorch#01-tensor)
- There are a few distinct differences between TensorFlow and PyTorch when it comes to data computation.

|           | TensorFlow                  | PyTorch                    |
|-----------|-----------------------------|----------------------------|
| Framework | Define-and-run              | Define-by-run              |
| Graph     | Static                      | Dynamic                    |
| Debug     | Non-native debugger (tfdbg) | pdb (ipdb) Python debugger |
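As a concrete illustration of the Debug row: because a PyTorch graph is built by ordinary Python code, a standard Python debugger such as `pdb` can be dropped straight into the forward pass. The toy tensors and breakpoint location below are illustrative only, a minimal sketch rather than code from this repository.

```python
# Pausing inside a PyTorch forward pass with the standard Python debugger.
# Tensors and breakpoint location are illustrative only.
import pdb
import torch

x = torch.rand(2, 3)
w = torch.rand(3, 1)

h = x @ w
# pdb.set_trace()   # uncomment to stop here and inspect x, w, and h interactively
y = torch.relu(h)
print(y)
```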
**How is a "graph" defined in each framework?**

**TensorFlow:**

- Static graph.

- Define a computational graph once, then execute the same graph repeatedly (see the sketch after this list).

- Pros:

    (1) The graph can be optimized upfront, which enables better distributed computation.

    (2) Repeated computation does not incur additional graph-construction cost.

- Cons:

    (1) It is difficult to perform a different computation for each data point.

    (2) The structure is more complicated and harder to debug than a dynamic graph.
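Below is a minimal define-and-run sketch, assuming the TensorFlow 1.x-style API (exposed as `tf.compat.v1` in TensorFlow 2); the shapes and values are illustrative only. The graph is declared first, and the same graph is then run repeatedly inside a session.

```python
# Define-and-run: build the graph once, execute it repeatedly in a session.
# Uses the TensorFlow 1.x-style API; shapes and values are illustrative.
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

# Graph definition: nothing is computed yet.
x = tf.placeholder(tf.float32, shape=(None, 3))
w = tf.Variable(tf.ones((3, 1)))
y = tf.matmul(x, w)

# Graph execution: the same graph is run for every batch.
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(2):
        print(sess.run(y, feed_dict={x: [[1.0, 2.0, 3.0]]}))
```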
**PyTorch:**

- Dynamic graph.

- Does not define a graph in advance; every forward pass builds a new computational graph (see the sketch after this list).

- Pros:

    (1) Debugging is easier than with a static graph.

    (2) The whole structure stays concise and intuitive.

    (3) A different computation can be performed for each data point and at each time step.

- Cons:

    (1) Rebuilding the graph for repetitive computation can lead to slower execution.

    (2) It is difficult to distribute the workload at the beginning of training.
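Below is a minimal define-by-run sketch (the tensors and control flow are illustrative only): each forward pass builds a fresh graph, so ordinary Python control flow can change the computation from one step to the next.

```python
# Define-by-run: the graph is created on the fly by running Python code,
# so it can differ on every forward pass. Values are illustrative only.
import torch

w = torch.ones(3, 1, requires_grad=True)

for step in range(2):
    x = torch.rand(1, 3)
    y = x @ w                  # the graph for this pass is built right here
    if step == 0:              # data-dependent control flow is plain Python
        y = y * 2
    y.sum().backward()         # backpropagate through the graph just built
    print(w.grad)
    w.grad.zero_()
```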