Commit fa1cfb6: "added link to pset" (1 parent: d8c5812)

1 file changed: README.md (12 additions, 0 deletions)
@@ -26,6 +26,18 @@ The problem set uses some advanced techniques. The intention of this tutorial i
The aim is to start with the basics and move up to linguistic structure prediction, which I feel is almost completely absent in other Pytorch tutorials.
The general deep learning basics have short expositions. More NLP-specific topics receive more in-depth discussion, although I have referred to other sources when I felt a full description would be reinventing the wheel and take up too much space.

### Dependency Parsing Problem Set
As mentioned above, [here](https://github.com/jacobeisenstein/gt-nlp-class/tree/master/psets/ps4) is the problem set that goes through implementing a high-performing dependency parser in Pytorch. I wanted to add a link here since it might be useful, provided you ignore the things that are specific to the class.
A few notes:

* There is a lot of code, so the beginning of the problem set is mainly there to get you familiar with the way my code represents the relevant data and the interfaces you need to use. The rest of the problem set is actually implementing components for the parser. Since we hadn't done deep learning in the class before, I tried to provide an enormous amount of comments and hints when writing it.
* There is a unit test for every deliverable, which you can run with `nosetests`.
* Since we use this problem set in the class, please don't publicly post solutions.
* The same repo has some notes that include a section on shift-reduce dependency parsing, if you are looking for a written source to complement the problem set.
* The link above may stop working if the repo is taken down at the start of a new semester.
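The shift-reduce idea those notes cover can be sketched in a few lines. This is a minimal illustration of arc-standard transitions, not the problem set's code; the sentence, gold tree, and hand-written oracle below are invented for the example.

```python
# Minimal arc-standard shift-reduce dependency parsing sketch.
# NOTE: illustration only -- not the problem set's code; the sentence,
# gold tree, and toy oracle here are invented for the example.

def parse(n_words, oracle):
    """Apply transitions chosen by `oracle` until the parse is complete.

    Words are 1-based indices; index 0 is the ROOT token.
    Returns a list of (head, dependent) arcs.
    """
    stack, buffer, arcs = [0], list(range(1, n_words + 1)), []
    while buffer or len(stack) > 1:
        action = oracle(stack, buffer, arcs)
        if action == "SHIFT":
            stack.append(buffer.pop(0))
        elif action == "LEFT_ARC":   # second-from-top is a dependent of top
            dep = stack.pop(-2)
            arcs.append((stack[-1], dep))
        else:                        # RIGHT_ARC: top depends on second-from-top
            dep = stack.pop()
            arcs.append((stack[-1], dep))
    return arcs

# Gold heads for "economic news dominated trade" (dependent -> head):
GOLD = {1: 2, 2: 3, 4: 3, 3: 0}

def toy_oracle(stack, buffer, arcs):
    """Static oracle: reduce when gold-consistent, otherwise shift."""
    if len(stack) < 2:
        return "SHIFT"
    top, second = stack[-1], stack[-2]
    attached = {d for h, d in arcs}
    if GOLD.get(second) == top:
        return "LEFT_ARC"
    # RIGHT_ARC only once `top` has collected all of its own dependents
    if GOLD.get(top) == second and all(d in attached for d, h in GOLD.items() if h == top):
        return "RIGHT_ARC"
    return "SHIFT"

print(parse(4, toy_oracle))   # [(2, 1), (3, 2), (3, 4), (0, 3)]
```

Each word ends up with exactly one head; the problem set's parser presumably replaces this hand-written oracle with a learned model that scores transitions from the parser state.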
# References:
* I learned a lot about deep structure prediction at EMNLP 2016 from [this](https://github.com/clab/dynet_tutorial_examples) tutorial on [Dynet](http://dynet.readthedocs.io/en/latest/), given by Chris Dyer and Graham Neubig of CMU and Yoav Goldberg of Bar Ilan University. Dynet is a great package, especially if you want to use C++ and avoid dynamic typing. The final BiLSTM CRF exercise and the character-level features exercise are things I learned from this tutorial.
* A great book on structure prediction is [Linguistic Structure Prediction](https://www.amazon.com/Linguistic-Structure-Prediction-Synthesis-Technologies/dp/1608454053/ref=sr_1_1?ie=UTF8&qid=1489510387&sr=8-1&keywords=Linguistic+Structure+Prediction) by Noah Smith. It doesn't use deep learning, but that is ok.
