Advanced Course in Machine Learning

Algorithms and machine learning
Advanced studies
More detailed coverage of machine learning methods and algorithms, presented from a statistical and optimization perspective. The course goes beyond Introduction to Machine Learning in both scope and level of detail, introducing techniques such as latent variable models, nonlinear dimensionality reduction, approximate Bayesian inference, and deep learning. Prerequisites: Introduction to Machine Learning or similar knowledge. The course is also suitable for mathematics and statistics students interested in machine learning.


Exam: 11.05.2016 09.00, A111

Year  Semester  Dates          Period  Language  In charge
2016  spring    15.03.-05.05.  4-4     English   Arto Klami


Time Room Lecturer Date
Tue 10-12 D123 Arto Klami 15.03.2016-05.05.2016
Thu 12-14 D123 Arto Klami 15.03.2016-05.05.2016

Exercise groups

Group: 1
Time Room Instructor Date Observe
Wed 10-12 B221 Aditya Jitta 14.03.2016-06.05.2016

Registration for this course starts on Tuesday 16th of February at 9.00.


NOTE: The material on these web pages refers to the edition of the course given in 2016, and is still valid for separate exams. The material will be revised for the edition given in Spring 2017, but the basic contents of the course will not change.


NEWS: The project work required for completing the course with the separate exam was released May 31st, 2016.


The course has a Moodle page for discussing the exercises and other course matters.

The course is a natural continuation of the Introduction to Machine Learning course. It covers more topics and also goes deeper, discussing both the theory and practice of machine learning.

The topics include (slight changes are to be expected):

  1. Machine learning in general; what it is about, what it can achieve, and what the underlying fundamentals are
  2. Probabilistic perspective on machine learning, some Bayesian inference
  3. Optimization and regularization
  4. Unsupervised learning
    1. Clustering, mixture models
    2. Linear latent variable models (PCA, ICA, etc)
    3. Non-linear dimensionality reduction
    4. Matrix factorization, recommender engines
  5. Supervised learning
    1. Regression and classification; basic principles
    2. Kernel methods and support vector machines
    3. Decision trees, ensembles and boosting
  6. Neural networks and deep learning
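As a small taste of the unsupervised-learning topics listed above, the sketch below implements PCA via the singular value decomposition. This is a generic illustration, not course material; the course itself follows Murphy's book.

```python
import numpy as np

def pca(X, k):
    """Project data onto the top-k principal directions (minimal sketch)."""
    Xc = X - X.mean(axis=0)                 # center the data
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:k]                     # top-k principal directions (rows)
    Z = Xc @ components.T                   # low-dimensional scores
    return Z, components

# Toy data: 100 points in 5 dimensions, reduced to 2.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
Z, W = pca(X, 2)
print(Z.shape, W.shape)  # (100, 2) (2, 5)
```

The principal directions returned in `W` are orthonormal, and the scores `Z` give the coordinates of each data point in the reduced space.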

Completing the course

There are two alternative ways to complete the course:

  1. Solve a sufficient proportion of the weekly exercises and attend the course exam (or a later separate exam)
  2. Complete a small research project and attend a separate exam

The exercise session is not obligatory. Instead, it is simply a session during which the course organizers will be available to help with the exercises.

Literature and material

The course will be lectured primarily based on the book "Machine Learning: A Probabilistic Perspective" by Kevin P. Murphy, but most of the material can also be found in freely available sources. Links to alternative reading sources are provided in the Lectures tab (and sometimes in the exercise problems).