winter-2019
Course materials:
Reference study notes:
CS224n-2019 study notes
Stanford CS224N Deep Learning for Natural Language Processing (Winter 2019): study-notes table of contents
Reference books:
Neural network fundamentals:
Lecture 01: Introduction and Word Vectors
Human language and word meaning (15 mins)
Word2vec introduction (15 mins)
Word2vec objective function gradients (25 mins)
Optimization basics (5 mins)
Looking at word vectors (10 mins or less)
Slides
Suggested Readings
Further reading
Assignment 1: Exploring Word Vectors
[code] [preview]
Notes summary
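As a quick companion to the word2vec objective and gradient portion of this lecture, here is a minimal NumPy sketch of one skip-gram negative-sampling update. All variable names and dimensions are illustrative, not taken from the course code.

```python
# A minimal sketch of one skip-gram negative-sampling (SGNS) update in NumPy.
# All names and dimensions are illustrative, not taken from the course code.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sgns_step(center_vec, context_vec, neg_vecs, lr=0.05):
    """One SGD step on J = -log s(u_o . v_c) - sum_k log s(-u_k . v_c)."""
    p_pos = sigmoid(context_vec @ center_vec)
    grad_center = -(1.0 - p_pos) * context_vec      # dJ/dv_c (positive term)
    grad_context = -(1.0 - p_pos) * center_vec      # dJ/du_o
    neg_grads = []
    for u_k in neg_vecs:                            # negative samples
        p_neg = sigmoid(u_k @ center_vec)
        grad_center += p_neg * u_k                  # dJ/dv_c (negative terms)
        neg_grads.append(p_neg * center_vec)        # dJ/du_k
    center_vec -= lr * grad_center                  # gradient-descent updates
    context_vec -= lr * grad_context
    for u_k, g in zip(neg_vecs, neg_grads):
        u_k -= lr * g

# toy usage
rng = np.random.default_rng(0)
v_c, u_o = rng.normal(size=5), rng.normal(size=5)
negs = [rng.normal(size=5) for _ in range(3)]
sgns_step(v_c, u_o, negs)
```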
Lecture 02: Word Vectors 2 and Word Senses
Finish looking at word vectors and word2vec (12 mins)
Optimization basics (8 mins)
Can we capture this essence more effectively by counting? (15m)
The GloVe model of word vectors (10 min)
Evaluating word vectors (15 mins)
Slides
Suggested Readings
Additional Readings:
Further reading
Python review [slides]
review
The idea behind GloVe, a step-by-step breakdown of the algorithm, and code (see the sketch below)
Methods for evaluating word vectors
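To go with the GloVe notes above, here is a minimal NumPy sketch of the weighted least-squares objective and its gradients on a toy co-occurrence matrix; the function and variable names are illustrative only.

```python
# A minimal sketch of the GloVe weighted least-squares objective and its
# gradients on a toy co-occurrence matrix. Names and shapes are illustrative.
import numpy as np

def weight_fn(x, x_max=100.0, alpha=0.75):
    """f(x) = min((x / x_max)^alpha, 1): down-weights rare co-occurrences."""
    return np.minimum((x / x_max) ** alpha, 1.0)

def glove_loss_and_grads(X, W, W_tilde, b, b_tilde):
    """J = sum over X_ij > 0 of f(X_ij) * (w_i . w~_j + b_i + b~_j - log X_ij)^2."""
    loss = 0.0
    gW, gWt = np.zeros_like(W), np.zeros_like(W_tilde)
    gb, gbt = np.zeros_like(b), np.zeros_like(b_tilde)
    for i, j in zip(*np.nonzero(X)):
        diff = W[i] @ W_tilde[j] + b[i] + b_tilde[j] - np.log(X[i, j])
        f = weight_fn(X[i, j])
        loss += f * diff ** 2
        gW[i] += 2 * f * diff * W_tilde[j]
        gWt[j] += 2 * f * diff * W[i]
        gb[i] += 2 * f * diff
        gbt[j] += 2 * f * diff
    return loss, (gW, gWt, gb, gbt)

# toy usage: 6-word vocabulary, 4-dimensional vectors
rng = np.random.default_rng(0)
V, d = 6, 4
X = rng.integers(0, 5, size=(V, V)).astype(float)
W, Wt = rng.normal(size=(V, d)), rng.normal(size=(V, d))
loss, grads = glove_loss_and_grads(X, W, Wt, np.zeros(V), np.zeros(V))
```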
Lecture 03: Word Window Classification, Neural Networks, and Matrix Calculus
Course information update (5 mins)
Classification review/introduction (10 mins)
Neural networks introduction (15 mins)
Named Entity Recognition (5 mins)
Binary true vs. corrupted word window classification (15 mins)
Matrix calculus introduction (20 mins)
Slides
Suggested Readings:
Additional Readings:
Assignment 2
[code] [handout]
review
NER
Gradients (see the sketch below)
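As a companion to the window-classification and matrix-calculus review, here is a minimal NumPy sketch of the window scoring model s = u^T tanh(Wx + b), with its analytic gradient with respect to the window vector checked against a finite difference. Shapes are illustrative.

```python
# A minimal sketch of the window scoring model s = u^T tanh(W x + b) from the
# matrix-calculus discussion, with the analytic gradient wrt the window vector
# checked against a finite difference. Shapes are illustrative.
import numpy as np

rng = np.random.default_rng(1)
n, m = 20, 8                              # window-vector size, hidden size
x = rng.normal(size=n)                    # concatenated word vectors of a window
W, b, u = rng.normal(size=(m, n)), rng.normal(size=m), rng.normal(size=m)

def score(x):
    return u @ np.tanh(W @ x + b)

h = np.tanh(W @ x + b)
grad_x = W.T @ (u * (1.0 - h ** 2))       # ds/dx = W^T (u * (1 - tanh^2(Wx+b)))

eps = 1e-6
e0 = np.zeros(n); e0[0] = eps
print(grad_x[0], (score(x + e0) - score(x - e0)) / (2 * eps))  # should match
```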
Lecture 04: Backpropagation and Computation Graphs
Matrix gradients for our simple neural net and some tips [15 mins]
Computation graphs and backpropagation [40 mins]
Stuff you should know [15 mins]
a. Regularization to prevent overfitting
b. Vectorization
c. Nonlinearities
d. Initialization
e. Optimizers
f. Learning rates
Slides
Suggested Readings:
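A tiny, self-contained sketch of reverse-mode backpropagation over an explicit computation graph (affine, then ReLU, then dot product), illustrating the "upstream gradient times local gradient" rule from this lecture. It is illustrative NumPy, not the course's implementation.

```python
# A tiny reverse-mode backprop sketch over an explicit computation graph
# (affine -> ReLU -> dot product). Each backward line multiplies the upstream
# gradient by the node's local gradient. Illustrative NumPy, not course code.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=4)
W, b, u = rng.normal(size=(3, 4)), rng.normal(size=3), rng.normal(size=3)

# forward pass
z = W @ x + b            # affine node
h = np.maximum(z, 0.0)   # ReLU node
s = u @ h                # dot-product node, scalar output

# backward pass (ds/ds = 1 at the output)
ds_dh = u                          # local gradient of u . h wrt h
ds_dz = ds_dh * (z > 0)            # ReLU local gradient is an indicator
ds_db = ds_dz                      # bias receives the upstream gradient directly
ds_dW = np.outer(ds_dz, x)         # outer(upstream, input) for the W x term
ds_dx = W.T @ ds_dz                # gradient flowing back to the input
```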
Lecture 05: Linguistic Structure: Dependency Parsing
Syntactic Structure: Constituency and Dependency (25 mins)
Dependency Grammar and Treebanks (15 mins)
Transition-based dependency parsing (15 mins)
Neural dependency parsing (15 mins)
cs224n-2019-lecture05-dep-parsing [scrawled-on slides]
cs224n-2019-notes04-dependencyparsing
Suggested Readings:
Assignment 3
[code] [handout]
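To make the transition-based parsing part concrete, here is a minimal sketch of arc-standard SHIFT / LEFT-ARC / RIGHT-ARC state updates. In Assignment 3 a neural classifier chooses each transition; here the sequence is hard-coded and the class name is illustrative.

```python
# A minimal sketch of arc-standard transition-based parsing state updates
# (SHIFT / LEFT-ARC / RIGHT-ARC). In Assignment 3 a neural classifier picks
# each transition; here the sequence is hard-coded and names are illustrative.
class PartialParse:
    def __init__(self, sentence):
        self.stack = ["ROOT"]           # partially built parse stack
        self.buffer = list(sentence)    # words not yet processed
        self.arcs = []                  # (head, dependent) pairs

    def step(self, transition):
        if transition == "S":                       # SHIFT: buffer front -> stack
            self.stack.append(self.buffer.pop(0))
        elif transition == "LA":                    # LEFT-ARC: second item depends on top
            dep = self.stack.pop(-2)
            self.arcs.append((self.stack[-1], dep))
        elif transition == "RA":                    # RIGHT-ARC: top depends on second item
            dep = self.stack.pop()
            self.arcs.append((self.stack[-1], dep))

pp = PartialParse(["I", "ate", "fish"])
for t in ["S", "S", "LA", "S", "RA", "RA"]:
    pp.step(t)
print(pp.arcs)   # [('ate', 'I'), ('ate', 'fish'), ('ROOT', 'ate')]
```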
Lecture 06: The probability of a sentence? Recurrent Neural Networks and Language Models
Recurrent Neural Networks (RNNs) and why they’re great for Language Modeling (LM).
cs224n-2019-lecture06-rnnlm
cs224n-2019-notes05-LM_RNN
Suggested Readings:
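A minimal NumPy sketch of one step of a vanilla RNN language model, h_t = tanh(W_h h_{t-1} + W_e e_t + b1) followed by a softmax over the vocabulary; all dimensions and names are toy values, not the course's code.

```python
# A minimal NumPy sketch of one step of a vanilla RNN language model:
# h_t = tanh(W_h h_{t-1} + W_e e_t + b1), y_t = softmax(U h_t + b2).
# Vocabulary size, dimensions, and names are toy values.
import numpy as np

V, d, H = 1000, 50, 64                  # vocab size, embedding dim, hidden dim
rng = np.random.default_rng(0)
E = rng.normal(size=(V, d)) * 0.1       # word embedding matrix
W_h = rng.normal(size=(H, H)) * 0.1
W_e = rng.normal(size=(H, d)) * 0.1
U = rng.normal(size=(V, H)) * 0.1
b1, b2 = np.zeros(H), np.zeros(V)

def rnn_lm_step(word_id, h_prev):
    e_t = E[word_id]                          # embed the current word
    h_t = np.tanh(W_h @ h_prev + W_e @ e_t + b1)
    logits = U @ h_t + b2
    probs = np.exp(logits - logits.max())     # numerically stable softmax
    return h_t, probs / probs.sum()           # next hidden state, P(next word)

h_t, p_next = rnn_lm_step(42, np.zeros(H))
```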
Lecture 07: Vanishing Gradients and Fancy RNNs
Problems with RNNs and how to fix them
More complex RNN variants
cs224n-2019-lecture07-fancy-rnn
cs224n-2019-notes05-LM_RNN
Suggested Readings:
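One of the standard remedies for exploding gradients discussed in this lecture is clipping by global norm; below is a minimal NumPy sketch of that idea (names are illustrative).

```python
# A minimal sketch of gradient clipping by global norm, a standard remedy for
# exploding gradients discussed in this lecture. Names are illustrative.
import numpy as np

def clip_gradients(grads, max_norm=5.0):
    """Rescale a list of gradient arrays so their global L2 norm is <= max_norm."""
    total_norm = np.sqrt(sum(float(np.sum(g ** 2)) for g in grads))
    if total_norm > max_norm:
        scale = max_norm / total_norm
        grads = [g * scale for g in grads]
    return grads
```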
Assignment 4
[code] [handout] [Azure Guide] [Practical Guide to VMs]
Lecture 08: Machine Translation, Seq2Seq and Attention
How we can do Neural Machine Translation (NMT) using an RNN based architecture called sequence to sequence with attention
cs224n-2019-lecture08-nmt
Machine translation:
1. 1950s: early machine translation, largely rule-based, relying on bilingual dictionary lookups (e.g. Russian to English).
2. 1990s-2010s: statistical machine translation (SMT): learn a probabilistic model from data, apply Bayes' rule, and balance translation fidelity against target-language fluency. Alignment can be one-to-many, many-to-one, or many-to-many.
3. 2014-present: neural machine translation (NMT) with seq2seq, i.e. two RNNs. Other seq2seq tasks include summarization (long text to short text), dialogue, parsing, and code generation (natural language to code). Decoding: greedy decoding and beam-search decoding (see the sketch below).
Evaluation: BLEU (Bilingual Evaluation Understudy).
Open problems: out-of-vocabulary words, domain mismatch, maintaining context over longer text, low-resource language pairs with little parallel data, no incorporation of common sense, biases picked up from training data, uninterpretable translations.
cs224n-2019-notes06-NMT_seq2seq_attention
Suggested Readings:
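Since the notes above contrast greedy and beam-search decoding, here is a minimal beam-search sketch. The `next_token_log_probs` callback is a hypothetical stand-in for the NMT decoder's next-token distribution; nothing here is the course's actual code.

```python
# A minimal beam-search decoding sketch. `next_token_log_probs(prefix)` is a
# hypothetical callback standing in for the NMT decoder's next-token
# distribution (it should return (token, log-prob) pairs); nothing here is
# the course's actual code.
import heapq

def beam_search(next_token_log_probs, bos="<s>", eos="</s>", k=4, max_len=30):
    beams = [(0.0, [bos])]                        # (log-prob, partial hypothesis)
    finished = []
    for _ in range(max_len):
        candidates = []
        for score, seq in beams:
            if seq[-1] == eos:                    # hypothesis already complete
                finished.append((score, seq))
                continue
            for tok, logp in next_token_log_probs(seq):
                candidates.append((score + logp, seq + [tok]))
        if not candidates:
            break
        beams = heapq.nlargest(k, candidates, key=lambda c: c[0])  # keep best k
    finished.extend(b for b in beams if b[1][-1] == eos)
    # length-normalize so longer hypotheses are not unfairly penalized
    return max(finished or beams, key=lambda c: c[0] / len(c[1]))
```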
Lecture 09: Practical Tips for Final Projects
Final project types and details; assessment revisited
Finding research topics; a couple of examples
Review of gated neural sequence models
Presenting your results and evaluation
cs224n-2019-lecture09-final-projects
Data:
Look at Kaggle, research papers, lists of datasets
final-project-practical-tips
Suggested Readings:
Lecture 10: Question Answering and the Default Final Project
Final final project notes, etc.
The Stanford Attentive Reader model
Recent, more advanced architectures
cs224n-2019-lecture10-QA
Two stages: find the documents that might contain the answer (information retrieval), then find the answer within a document or passage (reading comprehension).
History of reading comprehension: 2013, MCTest (passage + question -> answer); 2015/16, the CNN/DM and SQuAD datasets.
History of open-domain question answering: 1964, dependency parsing and matching; 1993, online encyclopedias; 1999, the TREC QA track is established; 2011, IBM's DeepQA system; 2016, neural networks combined with information retrieval (IR).
Stanford's simple model: the Attentive Reader model, which predicts the start and end positions of the answer span (see the sketch below).
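To illustrate the start/end span prediction used by readers like the Stanford Attentive Reader, here is a minimal sketch that picks the best-scoring valid span from toy start/end score vectors; a real model would compute those scores with attention over the passage.

```python
# A minimal sketch of span selection from start/end position scores, as in
# span-prediction readers like the Stanford Attentive Reader. The score
# vectors here are toy inputs; a real model computes them with attention.
import numpy as np

def best_span(start_scores, end_scores, max_span_len=15):
    """Return (start, end) maximizing start_scores[i] + end_scores[j], i <= j."""
    best, best_score = (0, 0), -np.inf
    for i, s in enumerate(start_scores):
        for j in range(i, min(i + max_span_len, len(end_scores))):
            if s + end_scores[j] > best_score:
                best_score = s + end_scores[j]
                best = (i, j)
    return best

rng = np.random.default_rng(0)
print(best_span(rng.normal(size=50), rng.normal(size=50)))
```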
Project Proposal
[instructions]
Default Final Project
[handout] [code]
Lecture 11: ConvNets for NLP
Simple CNN for Sentence Classification: Yoon (2014) (20 mins)
Deep CNN for Sentence Classification: Conneau et al. (2017) (10 mins)
Quasi-recurrent Neural Networks (10 mins)
cs224n-2019-lecture11-convnets
cs224n-2019-notes08-CNN
Suggested Readings:
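A minimal NumPy sketch of the Kim (2014)-style idea from this lecture: convolve filters over windows of word vectors and max-pool over time to get a fixed-size sentence representation. All shapes are toy values.

```python
# A minimal NumPy sketch of the Kim (2014)-style CNN idea: convolve filters
# over windows of word vectors, then max-pool over time to get a fixed-size
# sentence representation. All shapes are toy values.
import numpy as np

rng = np.random.default_rng(0)
T, d, k, F = 10, 50, 3, 4              # sentence length, embed dim, filter width, #filters
X = rng.normal(size=(T, d))            # word vectors for one sentence
W = rng.normal(size=(F, k * d))        # each row is one flattened filter
b = np.zeros(F)

# score every window of k consecutive word vectors with every filter
windows = np.stack([X[t:t + k].ravel() for t in range(T - k + 1)])  # (T-k+1, k*d)
conv = np.maximum(windows @ W.T + b, 0.0)                           # ReLU feature maps
features = conv.max(axis=0)            # max-over-time pooling: one value per filter
# `features` would then feed a softmax classifier over sentence labels
```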
Lecture 12: Information from parts of words: Subword Models
A tiny bit of linguistics (10 mins)
Purely character-level models (10 mins)
Subword-models: Byte Pair Encoding and friends (20 mins)
Hybrid character and word level models (30 mins)
cs224n-2019-lecture12-subwords
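To make the Byte Pair Encoding part concrete, here is a minimal pure-Python sketch of the BPE merge loop in the spirit of Sennrich et al.; the toy word-frequency dictionary and function names are illustrative.

```python
# A minimal pure-Python sketch of the Byte Pair Encoding merge loop (in the
# spirit of Sennrich et al.): repeatedly merge the most frequent adjacent
# symbol pair. The toy word-frequency dict and names are illustrative.
from collections import Counter

def bpe_merges(word_freqs, num_merges=10):
    # start from characters, with an end-of-word marker
    vocab = {tuple(w) + ("</w>",): f for w, f in word_freqs.items()}
    merges = []
    for _ in range(num_merges):
        pairs = Counter()
        for word, freq in vocab.items():
            for a, b in zip(word, word[1:]):
                pairs[(a, b)] += freq
        if not pairs:
            break
        best = max(pairs, key=pairs.get)          # most frequent adjacent pair
        merges.append(best)
        merged = best[0] + best[1]
        new_vocab = {}
        for word, freq in vocab.items():          # apply the merge everywhere
            out, i = [], 0
            while i < len(word):
                if i + 1 < len(word) and (word[i], word[i + 1]) == best:
                    out.append(merged); i += 2
                else:
                    out.append(word[i]); i += 1
            new_vocab[tuple(out)] = freq
        vocab = new_vocab
    return merges, vocab

merges, vocab = bpe_merges({"low": 5, "lower": 2, "newest": 6, "widest": 3})
```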
Assignment 5
[original code (requires Stanford login) / public version] [handout]
Lecture 13: Modeling contexts of use: Contextual Representations and Pretraining
[slides] [video]
Suggested readings:
Lecture 14: Transformers and Self-Attention For Generative Models (guest lecture by Ashish Vaswani and Anna Huang)
[slides] [video]
Suggested readings:
Project Milestone
[instructions]
Lecture 15: Natural Language Generation
[slides] [video]
Lecture 16: Reference in Language and Coreference Resolution
[slides] [video]
Lecture 17: Multitask Learning: A general model for NLP? (guest lecture by Richard Socher)
[slides] [video]
Lecture 18: Constituency Parsing and Tree Recursive Neural Networks
[slides] [video] [notes]
Suggested Readings:
Lecture 19: Safety, Bias, and Fairness (guest lecture by Margaret Mitchell)
[slides] [video]
Lecture 20: Future of NLP + Deep Learning
[slides] [video]
Final project poster session [details]
Final Project Report due [instructions]
Project Poster/Video due [instructions]