For more information on my research, please see the website for my lab, CLMBR (Computation, Language and Meaning Band of Researchers), my C.V., and Google Scholar.
- LING 571: Deep Processing Techniques for Natural Language Processing. [Aut '19, Aut '20, Aut '21, Aut '22, Aut '23, Aut '24]
- LING 572: Advanced Statistical Methods for Natural Language Processing. [Win '20]
- LING 574: Deep Learning for NLP. [Spr '21, Spr '22, Spr '24]
- LING 575: Topics in Computational Linguistics
- Montreean: visualize constituency parse trees vaguely in the style of Piet Mondrian.
- edugrad: a minimal re-implementation of the PyTorch API (dynamic computation graphs + backpropagation), designed for pedagogical purposes.
- Tutorial introduction to neural networks: slides and a Jupyter notebook introducing and explaining neural networks. Includes a worked example with quantifiers (in PyTorch) and additional practical advice. (Largely supersedes the tutorial linked below, though that one can still be useful for understanding TensorFlow's Estimator interface.)
- Decision and Game Theory for AI: lecture notes (in the form of Jupyter notebooks) for a mini-course on decision and game theory for undergraduates in AI. Includes simple implementations of CDT/EDT, the replicator dynamic, and reinforcement learning in signaling games.
- Generate Dot Arrays for Psycholinguistic Experiments: a Python script for generating colored dot arrays, including the four stimulus types from Pietroski et al. (2009), "Psychosemantics of 'most'".
- Joyce's Argument for Probabilism: a Mathematica notebook that carries out the construction in the proof of Joyce's most general argument for probabilism, from his 2009 paper "Accuracy and Coherence: Prospects for an Alethic Epistemology of Partial Belief".
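To give a flavor of what edugrad covers, here is a minimal sketch of a dynamic computation graph with reverse-mode backpropagation. All names here (`Scalar`, `grad_fns`, etc.) are hypothetical illustrations, not edugrad's actual API:

```python
# Sketch of a dynamic computation graph + backprop, in the spirit of
# edugrad. Scalar-valued only, for clarity; names are illustrative.

class Scalar:
    def __init__(self, value, parents=(), grad_fns=()):
        self.value = value        # forward value
        self.parents = parents    # nodes this one was computed from
        self.grad_fns = grad_fns  # local gradients w.r.t. each parent
        self.grad = 0.0

    def __add__(self, other):
        return Scalar(self.value + other.value,
                      parents=(self, other),
                      grad_fns=(lambda g: g, lambda g: g))

    def __mul__(self, other):
        return Scalar(self.value * other.value,
                      parents=(self, other),
                      grad_fns=(lambda g: g * other.value,
                                lambda g: g * self.value))

    def backward(self):
        # Topologically sort the graph built during the forward pass,
        # then accumulate gradients in reverse order (backpropagation).
        order, seen = [], set()
        def visit(node):
            if node not in seen:
                seen.add(node)
                for p in node.parents:
                    visit(p)
                order.append(node)
        visit(self)
        self.grad = 1.0
        for node in reversed(order):
            for parent, grad_fn in zip(node.parents, node.grad_fns):
                parent.grad += grad_fn(node.grad)

x = Scalar(2.0)
y = Scalar(3.0)
z = x * y + x   # z = xy + x, so dz/dx = y + 1 = 4 and dz/dy = x = 2
z.backward()
print(x.grad, y.grad)  # 4.0 2.0
```

The graph is "dynamic" because it is recorded as ordinary Python operations execute, rather than compiled ahead of time; this is the design PyTorch (and hence edugrad) follows.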
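As an illustration of the replicator dynamic covered in the game-theory notes, here is a short sketch for a symmetric 2x2 coordination game (an assumed example, not code from the notes themselves):

```python
import numpy as np

# Hypothetical payoff matrix for a symmetric 2x2 coordination game:
# entry A[i, j] is the payoff to strategy i against strategy j.
A = np.array([[2.0, 0.0],
              [0.0, 1.0]])

def replicator_step(x, A):
    """One discrete-time replicator update of population shares x."""
    fitness = A @ x              # expected payoff of each strategy
    avg = x @ fitness            # population-average payoff
    return x * fitness / avg     # shares grow in proportion to payoff

x = np.array([0.6, 0.4])         # initial population shares
for _ in range(100):
    x = replicator_step(x, A)
print(x.round(3))
```

Starting from a majority on the higher-payoff strategy, the population converges to the equilibrium where everyone plays it; the same update rule drives the signaling-game simulations mentioned above.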
...
You can reach me via snail or electronic mail:
Guggenheim Hall, room 415K
Seattle, Washington 98195
shanest AT uw DOT edu