====== CSMATH (2018-2019) ======

===== About =====
Please see {{:2018:csmath-01-introduction.pdf|this file}}.

|              ^  Course                  ^  Instructor                                           ^  Courseware  ^
^ March, 2019  |  Multivariate Analysis   | [[http://www.cad.zju.edu.cn/home/zhx/|Hongxin Zhang]] |  [[2018#courselet_on_multivariate_analysis|see below ...]]  |
^ April, 2019  |  Nonlinear Optimization  | [[http://www.cad.zju.edu.cn/home/zhx/|Hongxin Zhang]] |  [[2018#courselet_on_optimization|see below ...]]  |

===== Students =====
The course is open to Ph.D. students of the College of Computer Science and graduate students of related majors at Zhejiang University.

===== Time and Place =====
Tuesday, 18:30-21:30. Cao Guang Biao Building West 2-202, Yuquan Campus, Zhejiang University.

===== Homeworks =====

**Deadline**: Please hand in your **1** [[cp:2018|course paper]] and **5** selected homework assignments (or exercises) by 2019-06-01. All homework and exercises must be implemented in Python, optionally with TensorFlow.

  * [[homework:2018-py01|Homework 01]] => Curve fitting
  * [[homework:2018-py02|Homework 02]] => PCA
  * [[homework:2018-py03|Homework 03]] => 2D MOG and k-means
  * [[homework:2018-py04|Homework 04]] => L-M algorithm
  * [[homework:2018-py05|Homework 05]] => 2D SVM

For the homework reports, please follow {{:2018:jupyternotebook.pdf|this instruction}}.

More optional exercises:
  * [[homework:2018-ex01|Exercise 01]] => Gaussian distribution and its conjugate prior
  * [[homework:2018-ex02|Exercise 02]] => kernel PCA
  * [[homework:2018-ex03|Exercise 03]] => RPCA
  * [[homework:2018-ex04|Exercise 04]] => ISOMAP and LLE
  * ...
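As a starting point for the homework format, here is a minimal curve-fitting sketch in plain NumPy. The data, model, and polynomial degree below are illustrative only, not the Homework 01 specification:

```python
# Minimal curve-fitting sketch (illustrative; the actual data and
# model are specified by the Homework 01 assignment page).
import numpy as np

rng = np.random.default_rng(0)

# Noisy samples of y = sin(2*pi*x) on [0, 1]
x = np.linspace(0.0, 1.0, 50)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(x.size)

# Least-squares polynomial fit of degree 5
coeffs = np.polyfit(x, y, deg=5)
y_fit = np.polyval(coeffs, x)

rmse = np.sqrt(np.mean((y - y_fit) ** 2))
print(f"RMSE of degree-5 fit: {rmse:.4f}")
```

Plotting the fitted curve against the samples with matplotlib is a natural next step, and matches the Jupyter-notebook report format used for submissions.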
====== Courselet on Multivariate Analysis ======

The study of learning from data is commercially and scientifically important. This one-month short course is designed to give first-year Ph.D. students a thorough grounding in the methodologies, technologies, mathematics and algorithms currently needed by people who do research in learning and data mining, or who may need to apply learning or data mining techniques to a target problem. The topics of the course draw from classical statistics, machine learning, data mining, Bayesian statistics and statistical algorithmics.

Students entering the class should have a pre-existing working knowledge of probability, statistics and algorithms, though the class has been designed to allow students with a strong numerate background to catch up and fully participate.

===== Schedule =====
^    Topic                 ^   Date     ^  Slides  ^   Homework  ^
| Introduction             | 2019.02.26 | {{:2018:csmath-01-introduction.pdf|Introduction}}  |  [[homework:2017-py01|HW01]]  |
| :::                      | :::        | {{:2018:csmath-01-data-driven.pdf|Why data driven}}  |  :::  |
| :::                      | :::        | {{:2018:csmath-01-point_estimation.pdf|Point estimation}} [[https://engineering.purdue.edu/kak/Trinity.pdf|Additional reading]]  |  :::  |
| Component Analysis       | 2019.03.05 | {{:2018:csmath-02-component_analysis.pdf|PCA and its related techniques}}  |  [[homework:2013-py02|HW02]]  |
| Distance and similarity  | 2019.03.12 | {{:2018:csmath-03-distance_and_similarity.pdf|}}  |  [[homework:2013-py03|HW03]]  |
| Graphical Models         | 2019.03.19 | {{:2018:csmath-04-graphical_models_a.pdf|Graphical Models}} {{:2015:csmath-04-graphical_models_b.pdf|Hidden Markov Models}}  |  |

===== Text books =====
  * [[http://research.microsoft.com/en-us/um/people/cmbishop/prml/|Pattern Recognition and Machine Learning]]
  * [[http://www.rii.ricoh.com/~stork/DHS.html|Pattern Classification (2nd ed)]]
  * [[http://www-stat.stanford.edu/~tibs/ElemStatLearn/|The Elements of Statistical Learning: Data Mining, Inference, and Prediction. Second Edition, 2009.]]

===== Reference website =====
  * [[http://www.stanford.edu/class/cs229/|Stanford Machine Learning course]]

====== Courselet on Optimization ======
Optimization methods, both linear and non-linear, are important mathematical techniques for computer science. This one-month short course is designed to give first-year Ph.D. students a thorough grounding in the methodologies, technologies, mathematics and algorithms currently needed by people who are doing research related to linear and non-linear optimization. The topics of the course draw mainly from linear programming, quadratic programming and nonlinear optimization.

Students entering the class should have a pre-existing working knowledge of fundamental mathematics and algorithms, though the class has been designed to allow students with a strong numerate background to catch up and fully participate.
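As a small taste of the linear-programming material, the following sketch solves a toy LP with SciPy, one of the course's recommended packages. The problem below is purely illustrative, not a course assignment:

```python
# A minimal linear-programming example with scipy.optimize.linprog
# (illustrative only; the course slides cover the simplex and dual
# methods in detail).
from scipy.optimize import linprog

# maximize  x + 2y   subject to   x + y <= 4,  x <= 2,  x >= 0, y >= 0.
# linprog minimizes, so negate the objective coefficients.
c = [-1.0, -2.0]
A_ub = [[1.0, 1.0],
        [1.0, 0.0]]
b_ub = [4.0, 2.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, -res.fun)  # optimum at (x, y) = (0, 4) with value 8
```

The negation trick (maximize f by minimizing -f) is standard, since most LP solvers expose only a minimization interface.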
===== Schedule =====
^    Topic                  ^   Date     ^   Slides   ^   Homework   ^
| Linear programming        | 2019.03.26 |  {{:2018:csmath-05-linear_programming.pdf|Linear programming and simplex methods}}  |  |
| Linear programming        | 2019.04.02 |  {{:2018:csmath-06-linear_programming_and_dual_methods.pdf|Dual methods}}  |  |
|                           |            |  {{:svm_cjlin_dm.pdf|SVM}}  |  |
| Non-linear optimization   | 2019.04.09 |  {{:2018:csmath-07-nonlinear.pdf|NP}}  |  [[homework:2018-py04|HW04]]  |
| Quadratic programming     | 2019.04.16 |  {{:2017:csmath-08-nonlinear_and_qp.pdf|QP}}  |  [[homework:2018-py05|HW05]]  |

===== Text books =====
  - Yuan Ya-xiang and Sun Wen-yu. //Optimization Theory and Methods// (最优化理论与方法). Science Press.
  - Zhang Jianzhong and Xu Shaoji. //Linear Programming// (线性规划). Science Press.
  - Huang Hongxuan and Han Jiye. //Mathematical Programming// (数学规划). Tsinghua University Press.
  - Stephen Boyd and Lieven Vandenberghe. [[https://www.stanford.edu/~boyd/cvxbook/|Convex Optimization]]. Cambridge University Press.

Further reading: https://github.com/ChristosChristofidis/awesome-deep-learning

====== Python ======

[[http://www.python.org/|Python]] is a powerful but easy-to-use scripting language for daily software development. In this course, we mainly use it as a standard training tool to temper mathematical skills. Several useful links are listed below for your reference.

Learning Python:
  * An active Chinese Python forum: [[http://wiki.woodpecker.org.cn/moin/|http://wiki.woodpecker.org.cn]]. You can find a [[http://wiki.woodpecker.org.cn/moin/March_Liu/PyTutorial|Chinese tutorial book]] on this website. Alternatively, you can read the following simplified course in Chinese: [[http://woodpecker.org.cn/abyteofpython_cn/chinese/]].
  * [[http://www.sthurlow.com/python/|A Beginner's Python Tutorial]] in English
  * [[http://docs.python.org/tutorial/|The Python (v2.7) Tutorial]]
  * [[http://www.mindview.net/Books/TIPython|Thinking in Python]]

Most useful Python packages in this course:
  * Plot the results with [[http://matplotlib.org/|matplotlib]] or [[http://gnuplot-py.sourceforge.net/|gnuplot]].
  * [[http://numpy.scipy.org/|NumPy]] is the fundamental package needed for scientific computing with Python.
  * [[http://conference.scipy.org/scipy2010/|SciPy]]

Other related modules:
  * [[http://folk.uio.no/henninri/pca_module/|PCA Module for Python]]
  * [[http://www-users.cs.york.ac.uk/jc/teaching/agm/|Algorithms for Graphical Models]]
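If you prefer not to depend on an external PCA module, principal component analysis can be done with plain NumPy via an SVD of the centered data, in the spirit of Homework 02. The random data below are illustrative only:

```python
# Minimal PCA sketch with plain NumPy (SVD of the centered data);
# the synthetic data here are illustrative, not an assignment spec.
import numpy as np

rng = np.random.default_rng(0)
# 200 samples, 3 features, with very different variances per feature
X = rng.standard_normal((200, 3)) @ np.diag([3.0, 1.0, 0.1])

Xc = X - X.mean(axis=0)                      # center each feature
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

components = Vt                              # principal axes, one per row
explained_var = S**2 / (X.shape[0] - 1)      # variance along each axis
Z = Xc @ components[:2].T                    # projection onto top-2 subspace

print(explained_var)                         # sorted in decreasing order
```

The singular values come out sorted, so the first rows of `Vt` are always the directions of largest variance; this is the same decomposition most PCA libraries use internally.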
2019.txt · Last modified: 2019/03/01 05:52 by hongxin
