
Statistical Learning Theory and Applications



Syllabus



Course Description


This course focuses on supervised and unsupervised learning from the perspective of modern statistical learning theory, starting with the theory of multivariate function approximation from sparse data. It develops basic tools such as regularization, including support vector machines for regression and classification, and derives generalization bounds using both stability and VC theory. The course also discusses current research topics such as boosting, feature selection, active learning, ranking, and online learning, and examines applications in several areas: computer vision, computer graphics, and bioinformatics. The final projects, hands-on applications, and exercises are designed to illustrate the rapidly increasing practical uses of the techniques described throughout the course.



Prerequisites


18.02, 9.641J, 6.893, or permission of the instructor. In practice, a substantial level of mathematical maturity is necessary. Familiarity with probability and functional analysis will be very helpful. We try to keep the mathematical prerequisites to a minimum, but we will introduce complicated material at a fast pace.



Grading


There will be two problem sets, a MATLAB® assignment, and a final project. To receive credit, you must attend regularly and put in effort on all problem sets and the project.


 







