Commit ca94165d by Administrator

Add readme.md

- ```Live-Lecture```: core knowledge points
- ```Live-Discussion```: hands-on coding, review sessions, paper walkthroughs, topic talks, and more

| Date | Topic | Knowledge Points | Slides | Related Reading | Other | Homework |
|---------|---------|---------|---------|---------|---------|---------|
| PART 0: Prerequisite Review (Machine Learning and Convex Optimization)|
| Feb 22 (Sat) 10:00AM | (Live-Lecture1) <br>Algorithmic Complexity, Dynamic Programming, DTW | Time/space complexity analysis, <br>the Master Theorem, <br>time and space complexity <br>of recursive programs, <br>introduction to dynamic programming, <br>computing edit distance, <br>DTW techniques and applications, <br>Hamming distance and <br>semantic hashing |[Lecture1](http://47.94.6.102/NLPCamp6/course-info/blob/master/%E8%AF%BE%E4%BB%B6/slide-clear%20%200222Lecture1.pptx) | [[Blog] Time Complexity in Ten Minutes](https://www.jianshu.com/p/f4cca5ce055a)<br/>[[Blog] Dynamic Programming – Edit Distance Problem](https://algorithms.tutorialhorizon.com/dynamic-programming-edit-distance-problem/)<br/>[[Notes] Master Theorem](http://people.csail.mit.edu/thies/6.046-web/master.pdf)<br/>[Introduction to Algorithms (MIT Press)](http://ressources.unisciel.fr/algoprog/s00aaroot/aa00module1/res/%5BCormen-AL2011%5DIntroduction_To_Algorithms-A3.pdf)<br/>|[How to write a summary](http://47.94.6.102/NLPCamp6/course-info/wikis/%E5%A6%82%E4%BD%95%E5%86%99summary)|[Paper reading 1](http://47.94.6.102/NLPCamp6/course-info/blob/master/%E8%AF%BE%E4%BB%B6/0308%E8%AE%BA%E6%96%871.pdf) due Mar 8 (Sun) 23:59 Beijing time; record it in the <br>[paper summary sheet](https://shimo.im/sheets/JVp8GYP9G8TjhjyG/)|
| Feb 29 (Sat) 10:00AM |(Live-Lecture2) <br>Logistic Regression and Regularization|The logistic regression model, <br>GD, SGD, distributed SGD, <br>overfitting and regularization, <br>L1, L2, <br>LASSO, <br>ridge regression, <br>hyperparameter tuning <br>(grid search / heuristic search), <br>ElasticNet|[Lecture2](http://47.94.6.102/NLPCamp6/course-info/blob/master/课件/nlp%206%20%200229Lecture2.pptx)|[Matrix Cookbook](https://www.math.uwaterloo.ca/~hwolkowi/matrixcookbook.pdf)<br/>[ElasticNet](https://web.stanford.edu/~hastie/Papers/elasticnet.pdf)<br/>[https://arxiv.org/pdf/1805.02408.pdf](https://arxiv.org/pdf/1805.02408.pdf)<br/>[Deep Computational Phenotyping](http://www-scf.usc.edu/~zche/papers/kdd2015.pdf)<br/>[Label Embedding using Hierarchical Structure of Labels for Twitter Classification](https://www.aclweb.org/anthology/D19-1660.pdf)<br/>|||
| Mar 1 (Sun) 11:00AM | (Live-Discussion) <br>Classic Data Structures and Algorithms |Divide-and-conquer techniques and applications|[Slides](http://47.94.6.102/NLPCamp6/course-info/blob/master/%E8%AF%BE%E4%BB%B6/review1-divide%20and%20conquer%20-%20v2.pptx)||||
| Mar 1 (Sun) 3:30PM | (Live-Discussion) <br>Classic Data Structures and Algorithms |Hash tables, search trees, heaps (priority queues)|[Slides](http://47.94.6.102/NLPCamp6/course-info/blob/master/%E8%AF%BE%E4%BB%B6/review_1_hashtable_tree_heap.pptx)||||
| Mar 7 (Sat) 10:00AM | (Live-Lecture3) <br>XGBoost | The core XGBoost algorithm|[Lecture3](http://47.94.6.102/NLPCamp6/course-info/tree/master/%E8%AF%BE%E4%BB%B6/0307)||||
| Mar 8 (Sun) 11:00AM | (Live-Paper) <br>Paper 1 walkthrough<br>Searching and Mining Trillions of Time Series Subsequences under Dynamic Time Warping ||[Slides](http://47.94.6.102/NLPCamp6/course-info/blob/master/%E8%AF%BE%E4%BB%B6/0308%E8%AE%BA%E6%96%871.pdf)|||[Paper reading 2](http://47.94.6.102/NLPCamp6/course-info/blob/master/%E8%AF%BE%E4%BB%B6/%E8%AE%BA%E6%96%872.From%20Word%20Embeddings%20To%20Document%20Distances.pdf) due Mar 15 (Sun) 23:59 Beijing time; record it in the <br>[paper summary sheet](https://shimo.im/sheets/JVp8GYP9G8TjhjyG/)|
| Mar 8 (Sun) 3:30PM | (Live-Discussion) <br>Ensemble Models in Practice|GBDT, XGBoost, LightGBM|[Slides](http://47.94.6.102/NLPCamp6/course-info/blob/master/%E8%AF%BE%E4%BB%B6/0308nlp6Ensemble%E5%AE%9E%E6%88%98-%E9%9B%86%E6%88%90%E5%AD%A6%E4%B9%A0.pptx)||||
| Mar 8 (Sun) 5:00PM | (Live-Discussion) <br>Machine Learning Review (1) | Decision trees, random forests |[Slides](http://47.94.6.102/NLPCamp6/course-info/blob/master/%E8%AF%BE%E4%BB%B6/0308%E6%9C%BA%E5%99%A8%E5%9B%9E%E9%A1%BE%EF%BC%881%EF%BC%89%E4%B9%8B%E5%86%B3%E7%AD%96%E6%A0%91%E5%92%8C%E9%9A%8F%E6%9C%BA%E6%A3%AE%E6%9E%97.pptx)||||
| Mar 14 (Sat) 10:00AM | (Live-Lecture4) <br>Convex Optimization (1)| Convex sets, <br>convex functions, <br>testing for convexity, <br>LP problems, <br>QP problems, <br>non-convex problems|[Lecture4](http://47.94.6.102/NLPCamp6/course-info/blob/master/课件/NLP60314.pptx)||[How to submit assignments](http://47.94.6.102/NLPCamp6/course-info/wikis/%E5%A6%82%E4%BD%95%E5%86%99%E5%B0%8F%E4%BD%9C%E4%B8%9A)|[Assignment 1](http://47.94.6.102/NLPCamp6/course-info/tree/master/homework1) out; due Mar 22 (Sun) 23:59 Beijing time, upload to GitLab|
| Mar 15 (Sun) 11:00AM | (Live-Paper) <br>Paper 2 walkthrough<br>From Word Embeddings To Document Distances||[Slides](http://47.94.6.102/NLPCamp6/course-info/blob/master/%E8%AF%BE%E4%BB%B6/0318/%20paper2.pptx)||||
| Mar 15 (Sun) 3:30PM | (Live-Discussion) <br>Optimization Problems in Everyday Life | Case studies |[Slides](http://47.94.6.102/NLPCamp6/course-info/blob/master/%E8%AF%BE%E4%BB%B6/0315%E4%BC%98%E5%8C%96%E9%97%AE%E9%A2%98.pptx)||||
| Mar 15 (Sun) 5:00PM | (Live-Discussion) <br>Machine Learning Review (2) | Linear SVM |[Slides](http://47.94.6.102/NLPCamp6/course-info/blob/master/%E8%AF%BE%E4%BB%B6/0315%E6%9C%BA%E5%99%A8%E5%AD%A6%E4%B9%A0%E5%9B%9E%E9%A1%BE%EF%BC%882%EF%BC%89Linear%20SVM.pptx)||||
| Mar 21 (Sat) 10:00AM | (Live-Lecture5) <br>Convex Optimization (2)| Lagrangian duality, <br>KKT conditions, <br>complementary slackness, <br>non-linear SVM|[Lecture5](http://47.94.6.102/NLPCamp6/course-info/blob/master/课件/0321郑老师slide_note%20-%20clear.pptx)||||
| Mar 22 (Sun) 3:30PM | (Live-Discussion) <br>Kernels | Kernel functions, <br>Mercer's Theorem |[Slides](http://47.94.6.102/NLPCamp6/course-info/blob/master/课件/nlp6%200322王老师核函数.pptx)||||
| Mar 22 (Sun) 5:00PM | (Live-Discussion) <br>Duality | LP and its dual, <br>QP and its dual |[Slides](http://47.94.6.102/NLPCamp6/course-info/blob/master/课件/0322%20%20LP,%20DP%20and%20duality.pptx)||||
| PART 1: Natural Language Processing Fundamentals|
| Apr 4 (Sat) 10:00AM | (Live-Lecture6) <br>Text Representation|Tokenization<br>spelling correction<br>stop-word filtering<br>word normalization<br>bag-of-words model<br>text similarity<br>word vectors<br>sentence vectors<br>language models|[Lecture6](http://47.94.6.102/NLPCamp6/course-info/blob/master/%E8%AF%BE%E4%BB%B6/0404%E6%96%87%E6%9C%AC%E8%A1%A8%E7%A4%BA%E9%83%91%E8%80%81%E5%B8%88.pptx)||||
| Apr 5 (Sun) 11:00AM | (Live-Paper) <br>Paper 3 walkthrough<br>XGBoost: A Scalable Tree Boosting System||[Slides](http://47.94.6.102/NLPCamp6/course-info/blob/master/%E8%AF%BE%E4%BB%B6/0405review3%20-%20paper3(1).pptx)||||
| Apr 5 (Sun) 3:30PM | (Live-Discussion) <br>Survey of Text Similarity Techniques| Short text<br>long text |[Slides](http://47.94.6.102/NLPCamp6/course-info/blob/master/%E8%AF%BE%E4%BB%B6/0405%E6%96%87%E6%9C%AC%E7%9B%B8%E4%BC%BC%E5%BA%A6%E9%9F%A9%E8%80%81%E5%B8%88.pptx)||||
| Apr 11 (Sat) 10:30AM | (Live-Lecture7) | <br>SkipGram (in depth), CBOW, GloVe, MF, Gaussian Embedding, language models and smoothing techniques|[Lecture7](http://47.94.6.102/NLPCamp6/course-info/blob/master/%E8%AF%BE%E4%BB%B6/0411SkipGram%E7%AD%89.pptx)|||[Project 1](http://47.94.6.102/NLPCamp6/course-info/blob/master/%E8%AF%BE%E4%BB%B6/Project1-master-5db594a1ca8abe8d7c541c2cce831979640929fc.zip) out; due Apr 19 (Sun) 23:59 Beijing time, upload to GitLab|
| Apr 11 (Sat) 7:00PM | (Live-Discussion) <br>SkipGram Source Code Walkthrough|Including optimizations such as the Huffman tree|[Slides](http://47.94.6.102/NLPCamp6/course-info/tree/master/%E8%AF%BE%E4%BB%B6/0411Skip-gram%E6%BA%90%E7%A0%81%E8%AE%B2%E8%A7%A3)||||
| Apr 12 (Sun) 5:00PM | (Live-Discussion) <br>Assignment 1 Walkthrough||[Slides](http://47.94.6.102/NLPCamp6/course-info/blob/master/%E8%AF%BE%E4%BB%B6/Homework1_reference.zip)||||
| Apr 16 (Thu) 8:00PM | (Live-Paper) <br>Paper 4 walkthrough<br>Mining and Summarizing Customer Reviews||[Slides](http://47.94.6.102/NLPCamp6/course-info/blob/master/%E8%AF%BE%E4%BB%B6/0416%20%E7%AC%AC%E5%9B%9B%E7%AF%87%E8%AE%BA%E6%96%87.pptx)||||
| Apr 18 (Sat) 10:30AM | (Live-Lecture8) <br>The EM Algorithm and HMMs |The EM algorithm<br>EM convergence<br>EM for Gaussian mixture models<br>introduction to HMMs<br>probability computation in HMMs<br>HMM parameter learning (the Baum-Welch algorithm)<br>HMM decoding (the Viterbi algorithm)|[Lecture8](http://47.94.6.102/NLPCamp6/course-info/blob/master/%E8%AF%BE%E4%BB%B6/0418EM,HMM%E9%83%91%E8%80%81%E5%B8%88.pptx)||||
| Apr 19 (Sun) 11:00AM | (Live-Paper) <br>Paper 5 walkthrough<br>Reading Wikipedia to Answer Open-Domain Questions||[Slides](http://47.94.6.102/NLPCamp6/course-info/blob/master/%E8%AF%BE%E4%BB%B6/0419%20%E7%AC%AC%E4%BA%94%E7%AF%87%E8%AE%BA%E6%96%87.pptx)||||
| Apr 19 (Sun) 3:30PM | (Live-Discussion) <br>HMM-based POS Tagging||[Slides](http://47.94.6.102/NLPCamp6/course-info/blob/master/%E8%AF%BE%E4%BB%B6/0419%20hmm%E8%AF%8D%E6%80%A7%E6%A0%87%E6%B3%A8.zip)||||
| Apr 19 (Sun) 5:00PM | (Live-Discussion) <br>Smoothing Techniques for Language Models||[Slides](http://47.94.6.102/NLPCamp6/course-info/blob/master/%E8%AF%BE%E4%BB%B6/0419%20%E4%B8%8D%E5%90%8C%E8%AF%AD%E8%A8%80%E6%A8%A1%E5%9E%8Bsmoothing%E6%8A%80%E6%9C%AF(1).pdf)||||
| Apr 25 (Sat) 10:30AM | (Live-Lecture9) <br>CRF Models ||[Lecture9](http://47.94.6.102/NLPCamp6/course-info/blob/master/%E8%AF%BE%E4%BB%B6/0425%20%20%20CRF%E6%A8%A1%E5%9E%8B.pptx)||||
| Apr 25 (Sat) 3:30PM | (Live-Discussion) <br>HMM-based Chinese Word Segmentation: How jieba Works||[Slides](http://47.94.6.102/NLPCamp6/course-info/blob/master/%E8%AF%BE%E4%BB%B6/HMM.pptx)||||
| Apr 25 (Sat) 5:00PM | (Live-Discussion) <br>Hands-on Named Entity Recognition with LSTM-CRF||[Slides](http://47.94.6.102/NLPCamp6/course-info/blob/master/%E8%AF%BE%E4%BB%B6/0425%E5%9F%BA%E4%BA%8ELSTM-CRF%E7%9A%84%E5%91%BD%E5%90%8D%E5%AE%9E%E4%BD%93%E8%AF%86%E5%88%AB%E5%AE%9E%E6%88%98.rar)||||
| May 10 (Sun) 5:00PM | (Live-Discussion) <br>Project 1 Walkthrough||[Slides<br>and code](http://47.94.6.102/NLPCamp6/course-info/blob/master/%E8%AF%BE%E4%BB%B6/0510project1%E4%BB%A3%E7%A0%81%E5%8F%8A%E5%88%86%E4%BA%AB.rar)||||
|PART 2: Deep Learning and Pre-trained Models|
| May 16 (Sat) 10:30AM | (Live-Lecture10) <br> CRF Models (2), Deep Learning Basics|||[Log-linear Models and Conditional Random Fields](http://cseweb.ucsd.edu/~elkan/250B/CRFs.pdf)<br/>[An Introduction to Conditional Random Fields](http://homepages.inf.ed.ac.uk/csutton/publications/crftut-fnt.pdf)<br/>[Log-Linear Models and Conditional Random Fields](http://cseweb.ucsd.edu/~elkan/250Bfall2007/loglinear.pdf)<br/>[Generative Learning Algorithms](http://cs229.stanford.edu/notes/cs229-notes2.pdf)<br/>[Video lecture](http://videolectures.net/cikm08_elkan_llmacrf/)|||
| May 16 (Sat) 7:00PM | (Live-Paper) <br>Paper 6 walkthrough <br>GloVe: Global Vectors for Word Representation|||[Original paper](http://47.94.6.102/NLPCamp6/course-info/blob/master/%E8%AF%BE%E4%BB%B6/%E9%98%85%E8%AF%BB%E8%B5%84%E6%96%99%E5%8F%8A%E8%AE%BA%E6%96%87/6.GloVe-%20Global%20Vectors%20for%20Word%20Representation.pdf)|||
| May 17 (Sun) 11:00AM | (Live-Paper) <br>Paper 7 walkthrough <br>Representation Learning: A Review and New Perspectives|||[Original paper](http://47.94.6.102/NLPCamp6/course-info/blob/master/%E8%AF%BE%E4%BB%B6/%E9%98%85%E8%AF%BB%E8%B5%84%E6%96%99%E5%8F%8A%E8%AE%BA%E6%96%87/7.pdf)|||
| May 17 (Sun) 3:30PM | (Live-Discussion) <br> GPU Setup and Environment + Building a Simple Neural Network in PyTorch||||||
| May 17 (Sun) 5:00PM | (Live-Discussion) <br> PyTorch Walkthrough||||||
| May 23 (Sat) 10:30AM | (Live-Lecture11) <br> RNNs, LSTMs, and Gradient Problems ||||||
| TBD | (Live-Discussion) ||||||
| TBD | (Live-Discussion) ||||||
| TBD | (Live-Paper) ||||||
| May 30 (Sat) 10:30AM | (Live-Lecture12) <br> Seq2Seq, Attention, Pointer Networks ||||||
| TBD | (Live-Discussion) ||||||
| TBD | (Live-Discussion) ||||||
| TBD | (Live-Paper) ||||||
| Jun 6 (Sat) 10:30AM | (Live-Lecture13) <br> Transformer, BERT ||||||
| TBD | (Live-Discussion) ||||||
| TBD | (Live-Discussion) ||||||
| TBD | (Live-Paper) ||||||
| Jun 13 (Sat) 10:30AM | (Live-Lecture14) <br> GPT, XLNet ||||||
| TBD | (Live-Discussion) ||||||
| TBD | (Live-Discussion) ||||||
| TBD | (Live-Paper) ||||||
| Jun 20 (Sat) 10:30AM | (Live-Lecture15) <br> Graph Neural Networks, Graph Attention ||||||
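The Part 0 material on dynamic programming centers on computing edit distance. A minimal sketch of the classic DP recurrence (the function name and example strings are illustrative, not from the course materials):

```python
def edit_distance(a: str, b: str) -> int:
    """Levenshtein distance via the classic DP table.

    dp[i][j] = minimum number of insertions, deletions, and
    substitutions needed to turn a[:i] into b[:j].
    """
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i          # delete all of a[:i]
    for j in range(n + 1):
        dp[0][j] = j          # insert all of b[:j]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                           dp[i][j - 1] + 1,         # insertion
                           dp[i - 1][j - 1] + cost)  # match / substitution
    return dp[m][n]

print(edit_distance("kitten", "sitting"))  # 3
```

The table is filled row by row, so the overall cost is O(mn) time and space; keeping only two rows reduces the space to O(n).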
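DTW (Lecture 1 and Paper 1) follows the same DP pattern as edit distance, except that it aligns numeric sequences and uses a local cost instead of a unit edit cost. A minimal sketch, assuming absolute difference as the local cost (other costs and windowing constraints such as the Sakoe-Chiba band are not shown):

```python
def dtw_distance(x, y):
    """Dynamic time warping distance between two numeric sequences.

    dp[i][j] = cost of the best warping path aligning x[:i] with y[:j].
    """
    n, m = len(x), len(y)
    INF = float("inf")
    dp = [[INF] * (m + 1) for _ in range(n + 1)]
    dp[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(x[i - 1] - y[j - 1])
            dp[i][j] = cost + min(dp[i - 1][j],      # x[i-1] stretched
                                  dp[i][j - 1],      # y[j-1] stretched
                                  dp[i - 1][j - 1])  # advance both
    return dp[n][m]

print(dtw_distance([1, 2, 3], [1, 2, 2, 3]))  # 0.0
```

Unlike edit distance, DTW lets one element match a run of elements in the other sequence, which is why the two sequences above align at zero cost despite having different lengths.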
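Lecture 8's HMM decoding step (the Viterbi algorithm) can likewise be sketched in a few lines. The two-state model and all probabilities below are toy values for illustration only:

```python
import math

def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most likely hidden-state sequence for an HMM, in log space."""
    # V[t][s]: log-prob of the best path ending in state s at time t
    V = [{s: math.log(start_p[s] * emit_p[s][obs[0]]) for s in states}]
    back = [{}]
    for t in range(1, len(obs)):
        V.append({})
        back.append({})
        for s in states:
            # best previous state to transition into s
            prev = max(states, key=lambda p: V[t - 1][p] + math.log(trans_p[p][s]))
            V[t][s] = V[t - 1][prev] + math.log(trans_p[prev][s] * emit_p[s][obs[t]])
            back[t][s] = prev
    # backtrack from the best final state
    last = max(states, key=lambda s: V[-1][s])
    path = [last]
    for t in range(len(obs) - 1, 0, -1):
        path.append(back[t][path[-1]])
    return path[::-1]

# Toy two-state HMM (all numbers made up for illustration)
states = ("Healthy", "Fever")
start_p = {"Healthy": 0.6, "Fever": 0.4}
trans_p = {"Healthy": {"Healthy": 0.7, "Fever": 0.3},
           "Fever": {"Healthy": 0.4, "Fever": 0.6}}
emit_p = {"Healthy": {"normal": 0.5, "cold": 0.4, "dizzy": 0.1},
          "Fever": {"normal": 0.1, "cold": 0.3, "dizzy": 0.6}}
print(viterbi(["normal", "cold", "dizzy"], states, start_p, trans_p, emit_p))
# -> ['Healthy', 'Healthy', 'Fever']
```

Working in log space avoids the underflow that products of many small probabilities would cause on longer observation sequences; this sketch assumes no zero-probability transitions or emissions.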