Unsupervised Machine Learning
Year | Semester | Date | Period | Language | In charge |
---|---|---|---|---|---|
2011 | spring | 15.03-29.04. | 4-4 | English | Aapo Hyvärinen |
Lectures
Time | Room | Lecturer | Date |
---|---|---|---|
Tue 14-16 | C222 | Aapo Hyvärinen | 15.03.2011-29.04.2011 |
Thu 14-16 | C222 | Aapo Hyvärinen | 15.03.2011-29.04.2011 |
Fri 14-16 | C222 | Aapo Hyvärinen | 15.03.2011-29.04.2011 |
Registration for this course starts on Tuesday 22nd of February at 9.00.
Information for international students
The course will be completely in English.
General
Target audience
Master's students in computer science (specialization in algorithms & machine learning, or bioinformatics), applied mathematics (specialization in statistical machine learning or, e.g., stochastics), or statistics.
Description
Unsupervised learning is one of the main streams of machine learning, and closely related to exploratory data analysis and data mining. This course describes some of the main methods in unsupervised learning.
In recent years, machine learning has become heavily dependent on statistical theory, which is why this course lies on the borderline between statistics and computer science. Emphasis is put both on the statistical formulation of the methods and on their computational implementation. The goal is not only to introduce the methods on a theoretical level but also to show how they can be implemented in scientific computing environments such as Matlab or R.
Computer projects are an important part of the course, but they carry separate credits; see Projects for Unsupervised Machine Learning.
One of the weekly sessions (usually Friday) will be an exercise session; the detailed timetable is as follows [please note the changes made on 4 Apr]:
Tuesday | Thursday | Friday |
---|---|---|
15 Mar: Lecture | 17 Mar: Lecture | 18 Mar: Lecture |
22 Mar: Lecture | 24 Mar: Lecture | 25 Mar: Exercises |
29 Mar: Lecture | 31 Mar: Lecture | 1 Apr: Exercises |
5 Apr: Lecture | 7 Apr: Lecture | 8 Apr: Exercises |
12 Apr: Lecture | 14 Apr: Lecture | 15 Apr: Lecture |
19 Apr: Exercises | 21 Apr: Easter break | 22 Apr: Easter break |
26 Apr: Easter break | 28 Apr: Exercises | 29 Apr: Exercises |
Both the exercises and the computer projects will be taught by Michael Gutmann.
Prerequisites
- Statistics majors: Bachelor's degree recommended.
- Mathematics majors: Bachelor's degree recommended. It should include basic courses in analysis (including vector analysis), linear algebra I & II, introduction to probability, and introduction to statistical inference. (Preferably also some further statistics courses.)
- Computer science majors: Bachelor's degree recommended. It should include the mathematics courses listed above for mathematics majors. Preferably, you should also have completed both of the courses "Introduction to machine learning" and "Probabilistic models".
Contents
- Introduction
- supervised vs. unsupervised learning
- applications of unsupervised learning
- probabilistic formulation: generative models or latent variable models
- overview of the topics below
- Review of some basic mathematics (linear algebra, probability)
- Numerical optimization (a gradient-descent sketch follows after this list)
- gradient method, Newton's method, stochastic gradient, alternating variables
- Principal component analysis and factor analysis (a PCA sketch follows after this list)
- formulation as minimization of reconstruction error or maximization of component variance
- computation using the covariance matrix and its eigenvalue decomposition
- factor analysis and interpretation of PCA as estimation of a Gaussian generative model
- factor rotations
- Independent component analysis (a FastICA sketch follows after this list)
- the problem of blind source separation; why non-Gaussianity is needed for identifiability
- correlation vs. independence
- ICA as maximization of non-Gaussianity; measurement of non-Gaussianity by cumulants
- likelihood of the model and maximum likelihood estimation
- information-theoretic approach; connections between the different approaches
- implementation by gradient methods and FastICA
- Clustering (a k-means sketch follows after this list)
- the k-means algorithm
- formulation as a mixture of Gaussians
- maximization of the likelihood: alternating variables method, EM algorithm
- Nonlinear dimension reduction (an MDS sketch follows after this list)
- non-metric multidimensional scaling and related methods, e.g. kernel PCA, Isomap
- Kohonen's self-organizing map
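As a taste of the numerical optimization topic, here is a minimal gradient-descent sketch. The quadratic objective, the fixed step size, and the stopping rule are illustrative assumptions, not course material:

```python
import numpy as np

def gradient_descent(grad, x0, step=0.1, tol=1e-8, max_iter=1000):
    """Minimize a differentiable function given its gradient, with a fixed step size."""
    x = x0.astype(float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:   # stop when the gradient is nearly zero
            break
        x = x - step * g              # move against the gradient
    return x

# Example: minimize f(x) = 0.5 x'Ax - b'x, whose gradient is Ax - b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_opt = gradient_descent(lambda x: A @ x - b, x0=np.zeros(2))
print(x_opt, np.linalg.solve(A, b))   # the two should agree
```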
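Similarly, PCA via the covariance matrix and its eigenvalue decomposition fits in a few lines. This sketch assumes the data come as a samples-by-variables matrix and leaves the number of components as a free parameter:

```python
import numpy as np

def pca(X, n_components):
    """PCA by eigendecomposition of the sample covariance matrix.

    X: (n_samples, n_features) data matrix.
    Returns the component scores and the principal directions.
    """
    Xc = X - X.mean(axis=0)               # center each variable
    C = np.cov(Xc, rowvar=False)          # sample covariance matrix
    eigvals, eigvecs = np.linalg.eigh(C)  # eigh: for symmetric matrices
    order = np.argsort(eigvals)[::-1]     # sort by decreasing variance
    W = eigvecs[:, order[:n_components]]  # top principal directions
    return Xc @ W, W

# Toy data: 200 samples in 5 dimensions, reduced to 2 components.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
scores, directions = pca(X, n_components=2)
```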
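For ICA, the one-unit FastICA fixed-point iteration can be sketched as follows. It assumes the data have already been whitened (zero mean, identity covariance), and the tanh nonlinearity and convergence test are illustrative choices:

```python
import numpy as np

def fastica_one_unit(Z, max_iter=200, tol=1e-6):
    """Estimate one independent component from whitened data Z (n_features, n_samples).

    Fixed-point iteration maximizing non-Gaussianity with g(u) = tanh(u).
    """
    w = np.random.default_rng(0).normal(size=Z.shape[0])
    w /= np.linalg.norm(w)
    for _ in range(max_iter):
        u = w @ Z
        g, g_prime = np.tanh(u), 1.0 - np.tanh(u) ** 2
        w_new = (Z * g).mean(axis=1) - g_prime.mean() * w  # FastICA update
        w_new /= np.linalg.norm(w_new)                     # keep unit norm
        if abs(abs(w_new @ w) - 1.0) < tol:                # converged up to sign
            return w_new
        w = w_new
    return w
```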
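The clustering part can be previewed with a bare-bones k-means, alternating the assignment and mean-update steps; the random initialization and stopping criterion here are simple assumptions:

```python
import numpy as np

def kmeans(X, k, n_iter=100, seed=0):
    """Plain k-means on X of shape (n_samples, n_features).

    A sketch that assumes no cluster ever becomes empty.
    Returns cluster labels and cluster means.
    """
    rng = np.random.default_rng(seed)
    means = X[rng.choice(len(X), size=k, replace=False)]  # random initial means
    for _ in range(n_iter):
        d = np.linalg.norm(X[:, None, :] - means[None, :, :], axis=2)
        labels = d.argmin(axis=1)                          # assignment step
        new_means = np.array([X[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new_means, means):                  # converged
            break
        means = new_means
    return labels, means
```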
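Finally, while the course treats non-metric MDS, the closely related classical (metric) MDS is short enough to sketch: double-center the squared distance matrix and embed via its top eigenvectors:

```python
import numpy as np

def classical_mds(D, n_components=2):
    """Classical (metric) MDS from a matrix of pairwise distances D."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                # double-centered Gram matrix
    eigvals, eigvecs = np.linalg.eigh(B)
    order = np.argsort(eigvals)[::-1][:n_components]
    L = np.sqrt(np.maximum(eigvals[order], 0)) # guard against negative eigenvalues
    return eigvecs[:, order] * L               # low-dimensional coordinates
```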
Completing the course
There will be a single exam at the end of the course, on 6 May; check the exact time and place on the CS department exam page.
Active participation in the exercise sessions will give you points for the exam. See this document for detailed information.
Literature and material
The material consists of the complete lecture notes. Just to keep search engines away, you need the login uml and password uml. There is no book for the course.