Open-source implementations of popular deep learning techniques with applications to NLP. While this is research code (and the usual caveats apply), it has been used by industrial teams outside of our group, and licenses are included in the repos.


HarvardNLP + Systran.
github · data
Hendrik Strobelt and Sebastian Gehrmann.
github · models
Sequence-to-Sequence with Attention
Yoon Kim.
github · data · models
CNN for Text Classification
Jeffrey Ling (based on code by Yoon Kim).
Yuntian Deng.
ABS: Abstractive Summarization
Alexander Rush.
github · data
LSTM Character-Aware Language Model
Yoon Kim.
github · data
Neural Coreference Resolution
Sam Wiseman.
PAD: Phrase-structure After Dependencies
Alexander Rush and Lingpeng Kong.
Android Translation
Alexander Rush and Yoon Kim.
github · models


If you are new to Torch, we have a set of tutorials prepared as part of CS287, a graduate class on machine learning in NLP. These notebooks, prepared by Sam Wiseman and Saketh Rama, assume basic familiarity with the core aspects of Torch, and move quickly to advanced topics such as memory usage, the details of the nn module, and recurrent neural networks.

General Torch Notes
Torch nn module Notes
Torch rnn Notes
Torch and Dynamic Programming
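As a taste of what the nn-module notes above cover, here is a minimal sketch of building and training a small classifier with Torch's nn containers. The layer sizes and learning rate are illustrative choices, not taken from the tutorials.

```lua
require 'nn'

-- A small two-layer classifier built from nn containers.
local mlp = nn.Sequential()
mlp:add(nn.Linear(10, 25))
mlp:add(nn.Tanh())
mlp:add(nn.Linear(25, 2))
mlp:add(nn.LogSoftMax())

-- Negative log-likelihood loss over class indices.
local criterion = nn.ClassNLLCriterion()

-- One manual gradient step on a random example.
local x, y = torch.randn(10), 1
local out = mlp:forward(x)
local loss = criterion:forward(out, y)
mlp:zeroGradParameters()
mlp:backward(x, criterion:backward(out, y))
mlp:updateParameters(0.1)  -- plain SGD, learning rate 0.1
```

The tutorials go further into how these modules manage their parameter and gradient buffers, which is where the memory-usage discussion comes in.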