Commit 5c470b8c, authored 5 years ago by TeacherZhu

Update README.md

Parent: b0040e3a

Showing 1 changed file with 3 additions and 3 deletions.

README.md (+3, −3)
```diff
@@ -8,12 +8,12 @@
 | Feb 29 (Sat) 10:00AM | (Live-Lecture)<br>Logistic regression and regularization | Logistic regression model,<br>GD, SGD, Distributed SGD,<br>overfitting and regularization,<br>L1, L2,<br>LASSO,<br>Ridge Regression,<br>Hyperparameter Tuning<br>(Grid Search/Heuristic Search),<br>ElasticNet | [Lecture2](http://47.94.6.102/NLPCamp6/course-info/blob/master/%E8%AF%BE%E4%BB%B6/NLP60229.zip) | [Matrix Cookbook](https://www.math.uwaterloo.ca/~hwolkowi/matrixcookbook.pdf)<br/>[ElasticNet](https://web.stanford.edu/~hastie/Papers/elasticnet.pdf)<br/>[https://arxiv.org/pdf/1805.02408.pdf](https://arxiv.org/pdf/1805.02408.pdf)<br/>[Deep Computational Phenotyping](http://www-scf.usc.edu/~zche/papers/kdd2015.pdf)<br/>[Label Embedding using Hierarchical Structure of Labels for Twitter Classification](https://www.aclweb.org/anthology/D19-1660.pdf)<br/>[ElasticNet](https://web.stanford.edu/~hastie/Papers/elasticnet.pdf)<br/> |||
 | Mar 1 (Sun) 11:00AM | (Live-Discussion)<br>Classic data structures and algorithms | Divide-and-conquer techniques and applications | [Slides](http://47.94.6.102/NLPCamp6/course-info/blob/master/%E8%AF%BE%E4%BB%B6/review1-divide%20and%20conquer%20-%20v2.pptx) ||||
 | Mar 1 (Sun) 3:30PM | (Live-Discussion)<br>Classic data structures and algorithms | Hash tables, search trees, heaps (priority queues) | [Slides](http://47.94.6.102/NLPCamp6/course-info/blob/master/%E8%AF%BE%E4%BB%B6/review_1_hashtable_tree_heap.pptx) ||||
-| Mar 7 (Sat) 10:00AM | (Live-Lecture)<br>XGBoost | The core XGBoost algorithm explained | [Slides](http://47.94.6.102/NLPCamp6/course-info/tree/master/%E8%AF%BE%E4%BB%B6/0307) |||
+| Mar 7 (Sat) 10:00AM | (Live-Lecture)<br>XGBoost | The core XGBoost algorithm explained | [Lecture3](http://47.94.6.102/NLPCamp6/course-info/tree/master/%E8%AF%BE%E4%BB%B6/0307) |||
 | Mar 8 (Sun) 11:00AM | (Live-Paper)<br>First paper walkthrough:<br>Searching and Mining Trillions of Time Series Subsequences under Dynamic Time Warping ||||||
 | Mar 8 (Sun) 3:30PM | (Live-Discussion)<br>Hands-on ensemble models | GBDT, XGBoost, LightGBM | [Slides](http://47.94.6.102/NLPCamp6/course-info/blob/master/%E8%AF%BE%E4%BB%B6/0308nlp6Ensemble%E5%AE%9E%E6%88%98-%E9%9B%86%E6%88%90%E5%AD%A6%E4%B9%A0.pptx) |||
 | Mar 8 (Sun) 5:00PM | (Live-Discussion)<br>Machine learning review (1) | Decision trees, random forests | [Slides](http://47.94.6.102/NLPCamp6/course-info/blob/master/%E8%AF%BE%E4%BB%B6/0308%E6%9C%BA%E5%99%A8%E5%9B%9E%E9%A1%BE%EF%BC%881%EF%BC%89%E4%B9%8B%E5%86%B3%E7%AD%96%E6%A0%91%E5%92%8C%E9%9A%8F%E6%9C%BA%E6%A3%AE%E6%9E%97.pptx) ||||
-| Mar 14 (Sat) 10:00AM | (Live-Lecture)<br>Convex optimization (1) | Convex sets,<br>convex functions,<br>testing convexity,<br>LP problems,<br>QP problems,<br>non-convex problems ||||
-| Mar 15 (Sun) 11:00AM | (Live-Paper)|<br>Second paper walkthrough:<br>From Word Embeddings To Document Distances |||||
+| Mar 14 (Sat) 10:00AM | (Live-Lecture)<br>Convex optimization (1) | Convex sets,<br>convex functions,<br>testing convexity,<br>LP problems,<br>QP problems,<br>non-convex problems ||||
+| Mar 15 (Sun) 11:00AM | (Live-Paper)<br>Second paper walkthrough:<br>From Word Embeddings To Document Distances |||||
 | Mar 15 (Sun) 3:30PM | (Live-Discussion)<br>Optimization problems in everyday life | Case studies |||||
 | Mar 15 (Sun) 5:00PM | (Live-Discussion)<br>Machine learning review (2) | Linear SVM explained |||||
 | TBD | (Live-Lecture)<br>Convex optimization (2) | Lagrangian duality,<br>KKT conditions,<br>complementary slackness,<br>non-linear SVM ||||
```
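The Feb 29 lecture row pairs L1/L2 regularization (LASSO, Ridge, ElasticNet) with hyperparameter tuning by grid search. A minimal sketch of how those pieces fit together, assuming scikit-learn; the synthetic data and alpha grid here are illustrative assumptions, not course material:

```python
# Minimal sketch: compare L1 (Lasso), L2 (Ridge), and ElasticNet
# regularizers, tuning the regularization strength by grid search.
# Dataset and parameter grid are illustrative, not from the course.
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge, Lasso, ElasticNet
from sklearn.model_selection import GridSearchCV

X, y = make_regression(n_samples=200, n_features=20, noise=10.0, random_state=0)

for model in (Ridge(), Lasso(), ElasticNet()):
    # Grid search over the regularization strength alpha (5-fold CV).
    search = GridSearchCV(model, {"alpha": [0.01, 0.1, 1.0, 10.0]}, cv=5)
    search.fit(X, y)
    print(type(model).__name__, search.best_params_, round(search.best_score_, 3))
```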
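The Mar 7 lecture and Mar 8 discussion cover gradient-boosted tree ensembles (GBDT, XGBoost, LightGBM). A minimal sketch of training one such model, assuming the xgboost package is installed; data and parameters are illustrative only, and LightGBM's LGBMClassifier is a near drop-in replacement:

```python
# Minimal gradient-boosting sketch (XGBoost's scikit-learn API).
# Synthetic data and hyperparameters are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# 100 shallow trees fit sequentially on the residual errors.
clf = XGBClassifier(n_estimators=100, max_depth=3, learning_rate=0.1)
clf.fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```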
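The first paper (Mar 8) is about searching trillions of subsequences under Dynamic Time Warping; its contribution is fast search via lower bounds and early abandoning. For orientation only, a minimal O(nm) DTW distance, which is the primitive the paper accelerates (this sketch is mine, not the paper's optimized algorithm):

```python
# Minimal dynamic-time-warping distance between two 1-D series,
# using the standard O(n*m) dynamic-programming recurrence with
# squared pointwise cost (as in the UCR suite the paper builds on).
import numpy as np

def dtw(a, b):
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = (a[i - 1] - b[j - 1]) ** 2
            # Extend the cheapest of the three allowed warping moves.
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return np.sqrt(D[n, m])

print(dtw(np.array([0.0, 1.0, 2.0]), np.array([0.0, 0.0, 1.0, 2.0])))
```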
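The final lecture row lists Lagrangian duality, KKT conditions, and complementary slackness. For reference, the standard KKT conditions for minimizing f(x) subject to g_i(x) ≤ 0 and h_j(x) = 0:

```latex
% Lagrangian: L(x, \lambda, \nu) = f(x) + \sum_i \lambda_i g_i(x) + \sum_j \nu_j h_j(x)
\begin{aligned}
\nabla f(x^\star) + \sum_i \lambda_i^\star \nabla g_i(x^\star)
  + \sum_j \nu_j^\star \nabla h_j(x^\star) &= 0 && \text{(stationarity)} \\
g_i(x^\star) \le 0, \qquad h_j(x^\star) &= 0 && \text{(primal feasibility)} \\
\lambda_i^\star &\ge 0 && \text{(dual feasibility)} \\
\lambda_i^\star \, g_i(x^\star) &= 0 && \text{(complementary slackness)}
\end{aligned}
```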