The application of neural network methods, under the name "deep learning", has led to breakthroughs in a wide range of fields, including language technologies (e.g. search, translation, text input prediction). This course will provide a hands-on introduction to the use of deep learning methods for processing natural language. Methods to be covered include static word embeddings, feed-forward networks for text, recurrent neural networks, transformers, and pre-training / transfer learning, with applications including sentiment analysis, translation, and generation.
|Monday and Wednesday||1:00 - 2:20 PM||https://washington.zoom.us/j/98069618336|
Note: while lectures will be delivered live at the above time and location, they will also be recorded and posted to the course Canvas page.
|Instructor||Shane Steinert-Threlkeld||https://washington.zoom.us/my/shanest||Wednesday 3-5PM|
|Teaching Assistant||C.M. Downey||https://washington.zoom.us/j/93180360145 [password on Canvas]||Wednesday 11:15AM - 12:15PM|
While relevant readings are posted in the schedule below, the following are very good general resources. The names used to refer to these works in the schedule are included in parentheses.
Jurafsky and Martin, Speech and Language Processing (3rd edition draft) (JM)
Goldberg, Neural Network Methods for Natural Language Processing (YG)
Goodfellow, Bengio, and Courville, Deep Learning (GBC)
N.B.: All homework grading will take place on the patas cluster using Condor, so your code must run there. I strongly encourage you to ensure you have an account set up by the time of the first course meeting.
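If you have not used Condor before: jobs are submitted by writing a small submit-description file and passing it to the condor_submit command. Below is a minimal sketch; the script and file names are hypothetical, and patas may require additional site-specific settings, so consult the cluster documentation before relying on it.

    # hw1.cmd: minimal HTCondor submit-description file (hypothetical names).
    # run_hw1.sh is a shell script that runs your code; getenv carries your
    # environment variables (e.g. PATH) into the job.
    executable = run_hw1.sh
    getenv     = true
    output     = hw1.out
    error      = hw1.err
    log        = hw1.log
    request_memory = 2GB
    queue

Submit with condor_submit hw1.cmd, and check on a running job with condor_q.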
This quarter is unprecedented, as we navigate an ever-changing world due to the COVID-19 pandemic. Stress and anxiety are at all-time highs because of this, as well as other world events. If you find yourself struggling with a difficult concept, stressed over politics or health, slowed by a monopolistic internet provider, or annoyed at a classmate, please remember that your classmates and instructors feel similarly; maybe not at that very moment, but certainly recently or soon. Some of you may find remote learning particularly conducive to your style of learning and personality. Others will find it difficult to concentrate and maintain enthusiasm. These are all normal reactions.
If you find yourself having trouble learning in class, please do not hesitate to let one of us know. Our goal is to make this class a bright spot in these unprecedented times, and to do whatever we can to promote a healthy learning environment for all.
All deadlines and meeting times for this class are in Pacific Time. Now that we are in Daylight Saving Time, this is UTC-7.
As per the policy above, all communication outside of the classroom should take place on Canvas. You can expect responses from teaching staff within 48 hours, during normal business hours and excluding weekends.
N.B.: while CLMS students have a private Slack channel, I strongly encourage you to post questions concerning course content and assignments to the Canvas discussion board, for two reasons: (i) teaching staff will not look at Slack, so misinformation can spread there unchecked; (ii) not every student in the course is in the CLMS program, and all students deserve to be included in course discussions and likely have many of the same questions.
Washington state law requires that UW develop a policy for accommodation of student absences or significant hardship due to reasons of faith or conscience, or for organized religious activities. The UW’s policy, including more information about how to request an accommodation, is available at Religious Accommodations Policy (https://registrar.washington.edu/staffandfaculty/religious-accommodations-policy/). Accommodations must be requested within the first two weeks of this course using the Religious Accommodations Request form (https://registrar.washington.edu/students/religious-accommodations-request/).
Your experience in this class is important to me. If you have already established accommodations with Disability Resources for Students (DRS), please communicate your approved accommodations to me at your earliest convenience so we can discuss your needs in this course.
If you have not yet established services through DRS, but have a temporary health condition or permanent disability that requires accommodations (conditions include, but are not limited to: mental health, attention-related, learning, vision, hearing, physical, or health impacts), you are welcome to contact DRS at 206-543-8924, firstname.lastname@example.org, or disability.uw.edu. DRS offers resources and coordinates reasonable accommodations for students with disabilities and/or temporary health conditions. Reasonable accommodations are established through an interactive process between you, your instructor(s), and DRS. It is the policy and practice of the University of Washington to create inclusive and accessible learning environments consistent with federal and state law.
Call SafeCampus at 206-685-7233 anytime – no matter where you work or study – to anonymously discuss safety and well-being concerns for yourself or others. SafeCampus’s team of caring professionals will provide individualized support, while discussing short- and long-term solutions and connecting you with additional resources when requested.
|Date||Topics + Slides||Readings||Events|
|Mar 29||Introduction / Overview; History|
|Mar 31||Gradient descent; Word vectors||JM ch 5.4, ch 6; YG ch 2||[due Apr 8]|
|Apr 5||Word vectors; Classification and language modeling||JM 6.8 - 6.12|
|Apr 7||Neural Networks 1||JM 7.1 - 7.3; YG ch 4||[due Apr 15]|
|Apr 12||Computation graphs; backpropagation||JM 7.4.3 - 7.4.5; YG 5.1.1 - 5.1.2; GBC ch 6.5; Calculus on computational graphs; CS 231n notes 1; CS 231n notes 2 (vector/tensor derivatives); Yes, you should understand backprop|
|Apr 14||Feed-forward networks for LM and classification||YG ch 9; A Neural Probabilistic Language Model (Bengio et al 2003); Deep Unordered Composition Rivals Syntactic Methods for Text Classification (Iyyer et al 2015)||[due Apr 22]|
|Apr 19||Recurrent neural networks||The Unreasonable Effectiveness of Recurrent Neural Networks|
|Apr 21||Vanishing gradients; RNN variants||YG ch 15; Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation; On the difficulty of training recurrent neural networks||[due Apr 29]|
|Apr 26||Sequence-to-sequence; Attention||JM ch 11; Sequence to Sequence Learning with Neural Networks (original seq2seq paper); Neural Machine Translation by Jointly Learning to Align and Translate (original seq2seq + attention paper)|
|Apr 28||Transformers 1||Attention is All You Need (original Transformer paper); The Annotated Transformer; The Illustrated Transformer||[due May 6]|
|May 3||Transformers 2||(same readings as Transformers 1)|
|May 5||Pre-training / fine-tuning paradigm||Contextual Word Representations: Putting Words into Computers; The Illustrated BERT, ELMo, and co. (How NLP Cracked Transfer Learning)||[due May 13]|
|May 10||Interpretability and Analysis||Analysis Methods in Neural Language Processing: A Survey; A Primer in BERTology|
|May 12||Other architectures (CNN, recursive NNs, ...)||YG ch 13, 18; Convolutional Neural Networks for Sentence Classification; Understanding Convolutional Neural Networks for Text Classification; Language Modeling with Gated Convolutional Networks; Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank||[due May 20]|
|May 17||Low-resource / Multilingual NLP (guest lecture: C.M. Downey)|
|May 19||Ethics, fairness, limitations (guest lecture: Angelina McMillan-Major)||On the Dangers of Stochastic Parrots||[due May 27]|
|May 24||Multimodal NLP (guest lecture: Yonatan Bisk)|
|May 26||Overflow day||||[due June 3]|
|May 31||Memorial Day: No Class|
|Jun 2||Summary / Review|