Unsupervised Machine Learning

Course code: 582638
Credits: 5
Specialisation line: Algorithms and machine learning
Level: Advanced studies
Unsupervised learning is one of the main streams of machine learning, and closely related to multivariate statistics and data mining. This course describes some of the main methods in unsupervised learning, such as principal and independent component analysis, clustering, and nonlinear dimension reduction methods. In recent years, machine learning has become heavily dependent on statistical theory, which is why this course is somewhere on the borderline between statistics and computer science. Emphasis is put both on the statistical/probabilistic formulation of the methods and on their computational implementation. The course is intended for CS students in the algorithms and machine learning specialisation, for statistics students, and for mathematics students in the statistical machine learning specialisation.

Exam

08.05.2012 16.00 B123
Year Semester Dates Period Language Person in charge
2012 Spring 13.03-27.04. 4-4 English Aapo Hyvärinen

Lectures

Time Room Lecturer Dates
Tue 14-16 C222 Aapo Hyvärinen 13.03.2012-27.04.2012
Thu 14-16 C222 Aapo Hyvärinen 13.03.2012-27.04.2012
Fri 14-16 C222 Aapo Hyvärinen 13.03.2012-27.04.2012

Registration for this course starts on Tuesday 21st of February at 9.00.

Information for international students

The course will be completely in English.

General

 

Target audience

Master's students in computer science (specialization in algorithms & machine learning, or bioinformatics), applied mathematics (specialization in statistical machine learning or e.g. stochastics), or statistics.

Description

Unsupervised learning is one of the main streams of machine learning, and closely related to exploratory data analysis and data mining. This course describes some of the main methods in unsupervised learning.

In recent years, machine learning has become heavily dependent on statistical theory, which is why this course is somewhere on the borderline between statistics and computer science. Emphasis is put both on the statistical formulation of the methods and on their computational implementation. The goal is not only to introduce the methods on a theoretical level but also to show how they can be implemented in scientific computing environments such as Matlab or R.

Computer projects are an important part of the course, but they are given separate credits; see Projects for Unsupervised Machine Learning. The projects will be handed out in the exercise session marked below in the schedule.

One of the weekly sessions (usually Friday) will be an exercise session; the detailed timetable is as follows:

Tue 13 Mar Lecture * Thu 15 Mar Lecture * Fri 16 Mar Lecture
Tue 20 Mar Lecture * Thu 22 Mar Lecture * Fri 23 Mar Exercises (intro to computer assignments)
Tue 27 Mar Exercises * Thu 29 Mar Lecture * Fri 30 Mar Exercises
Tue  3 Apr Lecture * Thu 5 Apr Easter break * Fri 6 Apr Easter break
Tue 10 Apr Easter break * Thu 12 Apr Lecture * Fri 13 Apr Exercises
Tue 17 Apr Lecture * Thu 19 Apr Lecture * Fri 20 Apr Exercises
Tue 24 Apr Lecture * Thu 26 Apr Lecture * Fri 27 Apr Exercises

The exercises will be taught by Jouni Puuronen and the computer projects by Jukka-Pekka Kauppi.

 

Prerequisites

  • Statistics majors: Bachelor's degree recommended.
  • Mathematics majors: Bachelor's degree recommended. It should include basic courses in analysis (including vector analysis), linear algebra I&II, introduction to probability, introduction to statistical inference. (Preferably also some more statistics courses.)
  • Computer science majors: Bachelor's degree recommended. It should include the mathematics courses listed above for mathematics majors. Preferably you should also have taken both of the courses "Introduction to machine learning" and "Probabilistic models".

 

Contents

  • Introduction
    • supervised vs. unsupervised learning
    • applications of unsupervised learning
    • probabilistic formulation: generative models or latent variable models
    • overview of the topics below
    • Review of some basic mathematics (linear algebra, probability)
  • Numerical optimization
    • gradient method, Newton's method, stochastic gradient, alternating variables (see the gradient-descent sketch after this list)
  • Principal component analysis and factor analysis
    • formulation as minimization of reconstruction error or maximization of component variance
    • computation using the covariance matrix and its eigenvalue decomposition (see the PCA sketch after this list)
    • factor analysis and interpretation of PCA as estimation of a Gaussian generative model
    • factor rotations
  • Independent component analysis
    • problem of blind source separation, why non-Gaussianity is needed for identifiability
    • correlation vs. independence
    • ICA as maximization of non-Gaussianity, measurement of non-Gaussianity by cumulants
    • likelihood of the model and maximum likelihood estimation
    • implementation by gradient methods and FastICA (see the FastICA sketch after this list)
  • Clustering
    • k-means algorithm (see the k-means sketch after this list)
    • formulation as a mixture of Gaussians
    • maximization of likelihood, EM algorithm
  • Nonlinear dimension reduction
    • non-metric multi-dimensional scaling and related methods: kernel PCA, Laplacian eigenmaps, IsoMap (see the MDS sketch after this list)
    • Kohonen's self-organizing map
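
As a rough illustration of the gradient method listed above, here is a minimal sketch in R (one of the environments mentioned under Description). The quadratic objective, the fixed step size and the stopping rule are arbitrary choices for illustration only, not anything prescribed by the course.

    # Minimize f(w) = 0.5 * ||A w - b||^2 by plain gradient descent.
    A <- matrix(c(2, 0, 1, 3), 2, 2)   # small toy problem, chosen arbitrarily
    b <- c(1, -1)
    f     <- function(w) 0.5 * sum((A %*% w - b)^2)
    gradf <- function(w) as.vector(t(A) %*% (A %*% w - b))

    w    <- c(0, 0)   # starting point
    step <- 0.05      # fixed step size
    for (iter in 1:1000) {
      g <- gradf(w)
      if (sqrt(sum(g^2)) < 1e-8) break   # stop when the gradient is (nearly) zero
      w <- w - step * g                  # gradient step
    }
    w   # should be close to solve(A, b)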
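
For PCA, a minimal sketch of the eigendecomposition route in R; the simulated data and the choice of two retained components are arbitrary. Base R's prcomp does essentially the same computation.

    # PCA via eigendecomposition of the sample covariance matrix.
    set.seed(1)
    n <- 500
    z <- rnorm(n)
    X <- cbind(z, z + 0.3 * rnorm(n), rnorm(n))    # three correlated variables

    Xc  <- scale(X, center = TRUE, scale = FALSE)  # center the data
    eig <- eigen(cov(Xc))                          # eigenvalues in decreasing order

    W      <- eig$vectors[, 1:2]    # first two principal directions
    scores <- Xc %*% W              # principal component scores
    eig$values / sum(eig$values)    # proportion of variance explained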
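
For ICA, a bare-bones sketch of a single-unit FastICA iteration with the tanh nonlinearity on whitened data, following the fixed-point update w <- E{z g(w'z)} - E{g'(w'z)} w. The sources and mixing matrix below are simulated for illustration; a full FastICA would estimate several components and include a decorrelation step.

    # One-unit FastICA (tanh nonlinearity) on whitened data.
    set.seed(2)
    n <- 2000
    S <- cbind(runif(n, -1, 1), sign(rnorm(n)) * rexp(n))  # two non-Gaussian sources
    A <- matrix(c(1, 1, 0.5, 2), 2, 2)                     # arbitrary mixing matrix
    X <- S %*% t(A)                                        # observed mixtures

    # Whitening: zero mean, identity covariance.
    Xc  <- scale(X, center = TRUE, scale = FALSE)
    eig <- eigen(cov(Xc))
    Z   <- Xc %*% eig$vectors %*% diag(1 / sqrt(eig$values)) %*% t(eig$vectors)

    w <- c(1, 0)                                   # initial unit-norm weight vector
    for (iter in 1:100) {
      y    <- as.vector(Z %*% w)                   # current component estimate
      wnew <- colMeans(Z * tanh(y)) - mean(1 - tanh(y)^2) * w  # fixed-point update
      wnew <- wnew / sqrt(sum(wnew^2))             # normalize to unit length
      if (abs(sum(wnew * w)) > 1 - 1e-8) { w <- wnew; break }  # converged (up to sign)
      w <- wnew
    }
    w   # one row of the unmixing matrix for the whitened data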
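
For clustering, a minimal k-means sketch in R on simulated two-cluster data using the built-in kmeans function; the data, the number of clusters and the number of restarts are arbitrary choices. Fitting the corresponding mixture of Gaussians with EM is covered in the lectures and is not shown here.

    # k-means on a simple simulated mixture of two Gaussian clusters.
    set.seed(3)
    X <- rbind(matrix(rnorm(200, mean = 0), ncol = 2),
               matrix(rnorm(200, mean = 3), ncol = 2))  # 100 + 100 points in 2-D

    km <- kmeans(X, centers = 2, nstart = 10)  # Lloyd-type algorithm, 10 random restarts
    km$centers                                 # estimated cluster means
    table(km$cluster)                          # cluster sizes
    plot(X, col = km$cluster, pch = 19)        # quick visual check
    points(km$centers, col = 1:2, pch = 4, cex = 2)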
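
For nonlinear dimension reduction, a short non-metric MDS sketch assuming the MASS package (shipped with standard R installations) and the eurodist road-distance data included in R; it only gives the flavour of the method, and kernel PCA, Laplacian eigenmaps, IsoMap and the SOM are treated in the lectures.

    # Non-metric multi-dimensional scaling (Kruskal) on road distances
    # between European cities (the eurodist dataset shipped with R).
    library(MASS)

    fit <- isoMDS(eurodist, k = 2)   # 2-D configuration preserving the rank order of distances
    fit$stress                       # Kruskal stress (%): lower is better

    # For comparison, classical (metric) MDS from base R:
    classical <- cmdscale(eurodist, k = 2)

    plot(fit$points, type = "n", xlab = "", ylab = "")
    text(fit$points, labels = labels(eurodist), cex = 0.7)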

Completing the course

There will be a single exam at the end of the course. Check the exact time and place on the CS department exam page.

Active participation in the exercise sessions will give you points for the exam. See here for more information.

Literature and material

Here are the complete lecture notes for this year's course. Just to keep search engines away, you need the login uml and password uml. There is no book for the course.

Here are the exercises considered in the sessions. Note that they are a subset of the ones in the lecture notes, in slightly modified form.