This repository contains lecture notes, slides, assignments, and code for a university-level Natural Language Processing course. It spans core NLP topics such as language modeling, sequence tagging, parsing, semantics, and discourse, alongside modern machine learning methods used to solve them. Students work through programming exercises and problem sets that build intuition for both classical algorithms (like HMMs and CRFs) and neural approaches (like word embeddings and sequence models). The materials emphasize theory grounded in practical experimentation, often via Python notebooks or scripts that visualize results and encourage ablation studies. Clear organization and self-contained examples make it possible to follow along outside the classroom, using the repo as a self-study resource. For learners and instructors alike, the course provides a coherent path from foundational linguistics to current techniques, with reproducible code that makes concepts concrete.
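To give a concrete sense of the kind of classical algorithm the exercises build intuition for, below is a minimal sketch of Viterbi decoding for a toy HMM part-of-speech tagger. The tag set, transition table, and emission table are hypothetical illustrations chosen for this sketch, not taken from the course materials.

```python
# A minimal, self-contained sketch of Viterbi decoding for a toy HMM tagger.
# The tags and probability tables below are hypothetical illustrations,
# not values from the course materials.
import math

TAGS = ["DET", "NOUN", "VERB"]

# P(tag_t | tag_{t-1}); "<s>" is the start-of-sentence state.
TRANS = {
    "<s>":  {"DET": 0.6, "NOUN": 0.3, "VERB": 0.1},
    "DET":  {"DET": 0.05, "NOUN": 0.85, "VERB": 0.10},
    "NOUN": {"DET": 0.10, "NOUN": 0.30, "VERB": 0.60},
    "VERB": {"DET": 0.40, "NOUN": 0.40, "VERB": 0.20},
}

# P(word | tag), with a small floor probability for unseen words.
EMIT = {
    "DET":  {"the": 0.7, "a": 0.3},
    "NOUN": {"dog": 0.4, "cat": 0.4, "walk": 0.2},
    "VERB": {"walks": 0.5, "walk": 0.3, "barks": 0.2},
}
UNK = 1e-6


def viterbi(words):
    """Return the most probable tag sequence for `words` under the toy HMM."""
    # best[t][tag] = (log-prob of the best path ending in `tag` at position t, backpointer)
    best = [{}]
    for tag in TAGS:
        score = math.log(TRANS["<s>"][tag]) + math.log(EMIT[tag].get(words[0], UNK))
        best[0][tag] = (score, None)

    for t in range(1, len(words)):
        best.append({})
        for tag in TAGS:
            emit = math.log(EMIT[tag].get(words[t], UNK))
            # Pick the previous tag that maximizes the path score into `tag`.
            prev, score = max(
                ((p, best[t - 1][p][0] + math.log(TRANS[p][tag]) + emit) for p in TAGS),
                key=lambda x: x[1],
            )
            best[t][tag] = (score, prev)

    # Backtrace from the best final state.
    last = max(TAGS, key=lambda tag: best[-1][tag][0])
    path = [last]
    for t in range(len(words) - 1, 0, -1):
        path.append(best[t][path[-1]][1])
    return list(reversed(path))


if __name__ == "__main__":
    print(viterbi(["the", "dog", "walks"]))  # prints ['DET', 'NOUN', 'VERB']
```

Log probabilities are used rather than raw products to avoid numerical underflow on longer sentences, a standard design choice for this kind of dynamic-programming decoder.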
Features
- Lecture notes covering both theoretical foundations and empirical techniques in NLP
- Projects and assignments for hands-on experience with NLP models and algorithms
- Readings from the research literature to complement the technical content
- Grading and course policy materials that structure the course
- Work on topics such as syntactic, semantic, and distributional models of language
- Examples of current and recent research problems in NLP to engage students