University of Helsinki Department of Computer Science

582636 Probabilistic Models (4 cr) / Todennäköisyysmallit (4 op), Spring 2010

N.B. Lectures in English this year!

Lectures

19.01.-25.02.: Tue, Thu 16-18 in B222
Exercises: Fri 15-17 in B222, starting 29.01.

Course instructor: Dr. Huizhen Janey Yu
Office hour: Mon 13-14 or by appointment

Introduction

This is a new course belonging to the new Algorithms and Machine Learning sub-programme in the department's Master's programme; together with 582637 Project in Probabilistic Models (2 cr), it forms one of the three optional courses of the sub-programme.

For students in the old Intelligent Systems specialisation area: together with 582637 Project in Probabilistic Models (2 cr), this course replaces the course Three Concepts: Probability (6 cr).

Course Description

This course provides an introduction to probabilistic modeling, with emphasis on graphical models and their applications in Artificial Intelligence, Computational Intelligence and Data Mining. The first part of the course introduces basic concepts of graphical models and Bayesian inference. Topics will include Markov models, Markov random fields, and simple Bayesian networks, with illustrative examples from machine learning or data mining. The second part of the course covers further theoretical and algorithmic topics on graphical models. The focus will be on Bayesian networks with discrete variables, and the topics will include conditional independence and Markov properties, efficient inference algorithms, and their connection with graph theory.
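As a small illustration of the central object in the second part of the course (an example added here, not taken from the course materials): a Bayesian network over discrete variables X_1, ..., X_n with a directed acyclic graph G represents the joint distribution through the factorization

    P(X_1, ..., X_n) = \prod_{i=1}^{n} P(X_i | pa(X_i)),

where pa(X_i) denotes the parents of X_i in G. The conditional independence and Markov properties treated in the lectures characterize which independence statements such a factorization implies, and the efficient inference algorithms exploit this structure when computing posterior probabilities.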

Exercises

There will be exercise groups starting from the second week of the course. Exercise times are Fri 15-17.

In the next period there will be a separate project course, 582637 Project in Probabilistic Models / Todennäköisyysmallien harjoitustyö (2 cr), with more involved hands-on empirical work on the subject.

Please write to me if you would like to have your exercises back before the exam to help you prepare for it.

Prerequisites

This is an introductory course, and only elementary knowledge of probability theory is required. Different parts of the course, however, have different requirements with respect to the mathematical machinery needed to apply the concepts in question; typically some analysis and elementary mathematical statistics are required. We assume that the participants are familiar with topics covered in the courses 582630 Design and analysis of algorithms (4 cr) and 582631 Introduction to machine learning (4 cr).

Course schedule

Material

The primary material will be the lecture slides and chapters from selected books. Lecture slides in PDF are given below, with 4-slides-per-page versions for printing given inside the parentheses. (You may also check out the materials of the previous Three Concepts: Probability course in 2009, 2008 and 2007.)

 

Additional recommended reading materials (for subjects discussed in the classes or to be discussed soon; more will be added):

  • Alan Hájek. Interpretations of Probability, from the Stanford Encyclopedia of Philosophy.
  • Frank P. Ramsey. Truth and Probability, 1926.
  • Judea Pearl. Probabilistic Reasoning in Intelligent Systems, Morgan Kaufmann, 1988. Chap. 1; Chap. 4, 5.
  • Robert G. Cowell, A. Philip Dawid, Steffen L. Lauritzen, and David J. Spiegelhalter. Probabilistic Networks and Expert Systems: Exact Computational Methods for Bayesian Networks, Springer, 2007. Chap. 2; Chap. 5.1, 5.2, 5.3; Chap. 6.
  • A. C. Davison. Statistical Models, Cambridge Univ. Press, 2003. Chap. 6.1, 6.2; Chap. 4.1, 4.7.
  • Finn V. Jensen. An Introduction to Bayesian Networks. UCL Press, 1996. Chap. 3; Chap. 4.
  • A. Philip Dawid. Conditional Independence, in Encyclopedia of Statistical Sciences, pp. 146-55, Wiley-Interscience, 1998.

 

Course exam

05.03, Fri 9-12 in Exactum Auditorium A111.


Huizhen Janey Yu