- ```Live-Lecture```: core knowledge points
- ```Live-Discussion```: hands-on coding, review sessions, paper walkthroughs, topic presentations, etc.

| Date | Topic | Knowledge Points | Slides | Related Reading | Other | Homework |
|---------|---------|---------|---------|---------|---------|---------|
| PART 0 Prerequisite Review (Machine Learning and Convex Optimization) |
| Feb 22 (Sat) 10:00AM | (Live-Lecture) <br>Algorithmic complexity, dynamic programming, DTW | Time/space complexity analysis, <br>the Master Theorem, <br>time and space complexity <br>of recursive programs, <br>introduction to dynamic programming, <br>computing edit distance, <br>DTW techniques and applications, <br>Hamming distance and <br>semantic hashing | [Lecture1](http://47.94.6.102/NLPCamp6/course-info/blob/master/%E8%AF%BE%E4%BB%B6/slide-clear%20%200222Lecture1.pptx) | [[Blog] Time Complexity in Ten Minutes](https://www.jianshu.com/p/f4cca5ce055a)<br/>[[Blog] Dynamic Programming – Edit Distance Problem](https://algorithms.tutorialhorizon.com/dynamic-programming-edit-distance-problem/)<br/>[[Notes] Master's Theorem](http://people.csail.mit.edu/thies/6.046-web/master.pdf)<br/>[Introduction to Algorithms (MIT Press)](http://ressources.unisciel.fr/algoprog/s00aaroot/aa00module1/res/%5BCormen-AL2011%5DIntroduction_To_Algorithms-A3.pdf) | | |
| Feb 29 (Sat) 10:00AM | (Live-Lecture) <br>Logistic regression and regularization | Logistic regression model, <br>GD, SGD, distributed SGD, <br>overfitting and regularization, <br>L1, L2, <br>LASSO, <br>Ridge Regression, <br>hyperparameter tuning <br>(grid search / heuristic search), <br>ElasticNet | [Lecture2](http://47.94.6.102/NLPCamp6/course-info/blob/master/%E8%AF%BE%E4%BB%B6/NLP60229.zip) | [Matrix Cookbook](https://www.math.uwaterloo.ca/~hwolkowi/matrixcookbook.pdf)<br/>[ElasticNet](https://web.stanford.edu/~hastie/Papers/elasticnet.pdf)<br/>[https://arxiv.org/pdf/1805.02408.pdf](https://arxiv.org/pdf/1805.02408.pdf)<br/>[Deep Computational Phenotyping](http://www-scf.usc.edu/~zche/papers/kdd2015.pdf)<br/>[Label Embedding using Hierarchical Structure of Labels for Twitter Classification](https://www.aclweb.org/anthology/D19-1660.pdf) | | |
| Mar 1 (Sun) 11:00AM | (Live-Discussion) <br>Classic data structures and algorithms | Divide-and-conquer techniques and applications | [Slides](http://47.94.6.102/NLPCamp6/course-info/blob/master/%E8%AF%BE%E4%BB%B6/review1-divide%20and%20conquer%20-%20v2.pptx) | | | |
| Mar 1 (Sun) 3:30PM | (Live-Discussion) <br>Classic data structures and algorithms | Hash tables, search trees, heaps (priority queues) | [Slides](http://47.94.6.102/NLPCamp6/course-info/blob/master/%E8%AF%BE%E4%BB%B6/review_1_hashtable_tree_heap.pptx) | | | |
| Mar 7 (Sat) 10:00AM | (Live-Lecture) <br>XGBoost | The core XGBoost algorithm explained | [Slides](http://47.94.6.102/NLPCamp6/course-info/tree/master/%E8%AF%BE%E4%BB%B6/0307) | | | |
| Mar 8 (Sun) 11:00AM | (Live-Paper) <br>First paper walkthrough: <br>Searching and Mining Trillions of Time Series Subsequences under Dynamic Time Warping | | | | | |
| Mar 8 (Sun) 3:30PM | (Live-Discussion) <br>Hands-on ensemble models | GBDT, XGBoost, LightGBM | [Slides](http://47.94.6.102/NLPCamp6/course-info/blob/master/%E8%AF%BE%E4%BB%B6/0308nlp6Ensemble%E5%AE%9E%E6%88%98-%E9%9B%86%E6%88%90%E5%AD%A6%E4%B9%A0.pptx) | | | |
| Mar 8 (Sun) 5:00PM | (Live-Discussion) <br>Machine learning review (1) | Decision trees, random forests | [Slides](http://47.94.6.102/NLPCamp6/course-info/blob/master/%E8%AF%BE%E4%BB%B6/0308%E6%9C%BA%E5%99%A8%E5%9B%9E%E9%A1%BE%EF%BC%881%EF%BC%89%E4%B9%8B%E5%86%B3%E7%AD%96%E6%A0%91%E5%92%8C%E9%9A%8F%E6%9C%BA%E6%A3%AE%E6%9E%97.pptx) | | | |
| TBD | (Live-Paper) <br>Second paper walkthrough | | | | | |
| TBD | (Live-Lecture) <br>Convex optimization (1) | Convex sets, <br>convex functions, <br>testing convexity, <br>LP problems, <br>QP problems, <br>non-convex problems | | | | |
| TBD | (Live-Discussion) <br>Machine learning review (2) | Linear SVM explained | | | | |
| TBD | (Live-Discussion) <br>Optimization problems in everyday life | Case studies | | | | |
| TBD | (Live-Lecture) <br>Convex optimization (2) | Lagrangian duality, <br>KKT conditions, <br>complementary slackness, <br>non-linear SVM | | | | |
| TBD | (Live-Discussion) <br>Duality | LP and its dual, <br>QP and its dual | | | | |
| TBD | (Live-Discussion) <br>Kernels | Kernel functions, <br>Mercer's Theorem | | | | |
| PART 1 NLP Fundamentals |
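As a quick preview of the edit-distance topic from the first lecture, here is a minimal sketch of the classic dynamic-programming solution (Levenshtein distance). This is an illustrative example only, not code from the course materials; the function name is our own.

```python
def edit_distance(a: str, b: str) -> int:
    """Minimum number of insertions, deletions, and substitutions
    needed to turn string a into string b (Levenshtein distance)."""
    m, n = len(a), len(b)
    # dp[i][j] = edits needed to turn a[:i] into b[:j]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i  # delete all i characters of a[:i]
    for j in range(n + 1):
        dp[0][j] = j  # insert all j characters of b[:j]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                           dp[i][j - 1] + 1,         # insertion
                           dp[i - 1][j - 1] + cost)  # substitution / match
    return dp[m][n]

print(edit_distance("kitten", "sitting"))  # 3
```

The DP table takes O(mn) time and space; the lecture also covers how the same table structure generalizes to alignment problems such as DTW.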
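The DTW (dynamic time warping) distance covered in the first lecture and in the first paper walkthrough can likewise be sketched as a naive O(nm) DP. Again this is an illustrative sketch under simple assumptions (1-D sequences, absolute-difference cost), not the paper's optimized trillion-scale implementation.

```python
import math

def dtw(x, y):
    """Dynamic time warping distance between two numeric sequences,
    using absolute difference as the pointwise cost (naive O(n*m) DP)."""
    n, m = len(x), len(y)
    # D[i][j] = cheapest warping cost aligning x[:i] with y[:j]
    D = [[math.inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(x[i - 1] - y[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # x advances, y repeats
                                 D[i][j - 1],      # y advances, x repeats
                                 D[i - 1][j - 1])  # both advance
    return D[n][m]

print(dtw([1, 2, 3], [1, 2, 2, 3]))  # 0.0 — warping absorbs the repeated 2
```

Unlike plain Euclidean distance, DTW can match sequences of different lengths by stretching one against the other, which is why it is the standard similarity measure for time-series subsequence search.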