582636 Probabilistic models (ohtk 25.8.2011)

Each principal theme below lists its prerequisite knowledge, followed by the skills expected of a student who approaches, reaches, or deepens the learning objectives.

Principal theme: Role of probability theory in knowledge representation and uncertain reasoning

Prerequisite knowledge: Basics of first-order logic and probability theory

Approaches the learning objectives:
- Can explain the basic concepts of joint probability distribution, conditional distribution, Bayes' rule, and conditional independence, and can use these concepts to formulate the basic probabilistic inference problems (summarized below)
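
For reference, these concepts can be written compactly as follows (generic notation, not necessarily the course's):

\[
P(X \mid Y) = \frac{P(Y \mid X)\,P(X)}{P(Y)},
\qquad
X \perp Y \mid Z \;\iff\; P(X, Y \mid Z) = P(X \mid Z)\,P(Y \mid Z).
\]

The generic inference problem is then: given a joint distribution over query variables \(X_Q\), evidence variables \(X_E\), and the remaining hidden variables \(X_H\), compute

\[
P(X_Q \mid X_E = e) = \frac{\sum_{x_H} P(X_Q, x_H, e)}{\sum_{x_Q} \sum_{x_H} P(x_Q, x_H, e)}.
\]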

Reaches the learning objectives:
- Can explain the meaning of a Bayesian network model as a parametric model (a set of probability distributions), as a factorization of a joint probability distribution, and as an independence model (using d-separation and the local and global Markov properties)

- Can compute conditional distributions in a fixed discrete model such as a Naïve Bayes classifier, a finite mixture model, or a Hidden Markov Model

- Can implement a probabilistic inference algorithm for a fixed singly-connected graph with the parameters given (a minimal sketch follows this list)
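
To illustrate the flavor of such an exercise, here is a minimal Python sketch of exact inference by enumeration on a small chain A -> B -> C (a singly-connected network); the structure, the numerical parameters, and the function names are all hypothetical, not course material:

# Chain A -> B -> C, all variables binary (0/1); every number here is made up.
P_A = {0: 0.6, 1: 0.4}                       # P(A)
P_B_given_A = {0: {0: 0.7, 1: 0.3},          # P(B | A): P_B_given_A[a][b]
               1: {0: 0.2, 1: 0.8}}
P_C_given_B = {0: {0: 0.9, 1: 0.1},          # P(C | B): P_C_given_B[b][c]
               1: {0: 0.4, 1: 0.6}}

def joint(a, b, c):
    # Factorized joint distribution: P(a, b, c) = P(a) P(b | a) P(c | b).
    return P_A[a] * P_B_given_A[a][b] * P_C_given_B[b][c]

def posterior_A_given_C(c_obs):
    # Sum out the hidden variable B, then normalize over the query variable A.
    unnorm = {a: sum(joint(a, b, c_obs) for b in (0, 1)) for a in (0, 1)}
    z = sum(unnorm.values())
    return {a: p / z for a, p in unnorm.items()}

print(posterior_A_given_C(1))  # posterior over A after observing C = 1

A message-passing (belief propagation) implementation organizes exactly these sums along the edges of the graph instead of enumerating the whole joint.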

Deepens the learning objectives:
- Can implement a probabilistic inference algorithm for a discrete multiply connected graph

- Can justify the use of probability theory with theoretical arguments such as the Dutch book argument or Cox's theorem

Principal theme: Parameter learning and Bayesian reasoning

Prerequisite knowledge: Introduction to Machine Learning

Approaches the learning objectives:
- Can derive the maximum likelihood parameters, the maximum a posteriori parameters (with a conjugate prior), and the expected parameters for the multinomial distribution (the standard closed forms are recalled at the end of this list)

- Can explain the role of the parameter prior in parameter learning
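
For counts \(N_1, \dots, N_K\) with \(N = \sum_i N_i\) and a conjugate Dirichlet prior \(\mathrm{Dir}(\alpha_1, \dots, \alpha_K)\) with \(\alpha_0 = \sum_i \alpha_i\), the standard closed forms are

\[
\hat{\theta}_i^{\mathrm{ML}} = \frac{N_i}{N},
\qquad
\hat{\theta}_i^{\mathrm{MAP}} = \frac{N_i + \alpha_i - 1}{N + \alpha_0 - K},
\qquad
\mathbb{E}[\theta_i \mid D] = \frac{N_i + \alpha_i}{N + \alpha_0},
\]

where the MAP expression assumes the posterior mode lies in the interior of the simplex (e.g., all \(\alpha_i \ge 1\)). The pseudo-counts \(\alpha_i\) show directly how the prior regularizes the estimates.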

Reaches the learning objectives:
- Can learn a Naïve Bayes classifier from a data set and use the model for predictive inference (a minimal sketch follows this list)

- Can learn the parameters of a Bayesian network from a data set and use the model for predictive inference
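
As an illustration of the Naïve Bayes special case, here is a minimal Python sketch of parameter learning with Laplace smoothing followed by predictive inference; the function names and the smoothing constant alpha are our illustrative choices, not prescribed by the course:

from collections import defaultdict

def train_naive_bayes(X, y, alpha=1.0):
    # X: list of tuples of discrete feature values; y: list of class labels.
    # alpha is a Laplace smoothing pseudo-count (a common, not mandated, choice).
    classes = sorted(set(y))
    n_features = len(X[0])
    values = [sorted({row[j] for row in X}) for j in range(n_features)]
    prior = {c: (y.count(c) + alpha) / (len(y) + alpha * len(classes))
             for c in classes}
    cond = {}  # cond[(j, c)][v] estimates P(X_j = v | class = c)
    for j in range(n_features):
        for c in classes:
            counts, n_c = defaultdict(float), 0
            for row, label in zip(X, y):
                if label == c:
                    counts[row[j]] += 1
                    n_c += 1
            denom = n_c + alpha * len(values[j])
            cond[(j, c)] = {v: (counts[v] + alpha) / denom for v in values[j]}
    return classes, prior, cond

def predict_proba(x, classes, prior, cond):
    # Predictive inference: posterior over classes by Bayes' rule under the
    # Naive Bayes assumption that features are independent given the class.
    scores = {c: prior[c] for c in classes}
    for j, v in enumerate(x):
        for c in classes:
            scores[c] *= cond[(j, c)].get(v, 0.0)  # unseen value -> 0 here
    z = sum(scores.values()) or 1.0
    return {c: s / z for c, s in scores.items()}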

Deepens the learning objectives:
- Can learn the parameters of continuous models

- In the discrete case, can implement the EM algorithm for learning the parameters of a finite mixture model (a compact sketch is given below)
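
A compact sketch of what such an implementation can look like for a mixture of categorical distributions, using numpy; the component count K, alphabet size V, random initialization, and synthetic data are illustrative choices of ours:

import numpy as np

def em_categorical_mixture(x, K, V, n_iter=100, seed=0):
    # EM for a mixture of K categorical distributions over symbols {0,...,V-1}.
    # x: 1-D integer array of observations.
    rng = np.random.default_rng(seed)
    pi = np.full(K, 1.0 / K)                     # mixing weights P(z = k)
    theta = rng.dirichlet(np.ones(V), size=K)    # theta[k, v] = P(x = v | z = k)
    for _ in range(n_iter):
        # E-step: responsibilities r[n, k] proportional to pi_k * theta[k, x_n].
        r = pi * theta[:, x].T                   # shape (N, K)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate parameters from expected counts.
        nk = r.sum(axis=0)                       # expected size of each component
        pi = nk / len(x)
        theta = np.vstack([np.bincount(x, weights=r[:, k], minlength=V) / nk[k]
                           for k in range(K)])
    return pi, theta

# Example: fit a two-component mixture to synthetic data over 3 symbols.
data = np.array([0] * 40 + [1] * 10 + [2] * 50)
print(em_categorical_mixture(data, K=2, V=3))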

Principal theme: Parametric model structure learning

Prerequisite knowledge: Introduction to Machine Learning

Approaches the learning objectives:
- Can explain the model structure learning problem and how it differs from the parameter learning problem

- Can explain what over-fitting is

Reaches the learning objectives:
- Can explain the concept of an equivalence class and can determine whether two given networks are equivalent

- Knows how to compute the marginal likelihood for discrete Bayesian networks and can explain how to use it for model structure selection (the closed form is recalled after this list)

- Can implement an algorithm for learning a discrete Bayesian network from data
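
For reference, the closed form usually meant here is the Bayesian-Dirichlet marginal likelihood for complete discrete data with Dirichlet parameter priors:

\[
P(D \mid G) \;=\; \prod_{i=1}^{n} \prod_{j=1}^{q_i}
\frac{\Gamma(\alpha_{ij})}{\Gamma(\alpha_{ij} + N_{ij})}
\prod_{k=1}^{r_i}
\frac{\Gamma(\alpha_{ijk} + N_{ijk})}{\Gamma(\alpha_{ijk})},
\]

where \(N_{ijk}\) counts the cases in which variable \(i\) takes its \(k\)-th value while its parents are in their \(j\)-th configuration, \(N_{ij} = \sum_k N_{ijk}\), and \(\alpha_{ij} = \sum_k \alpha_{ijk}\). Structure selection then scores each candidate graph \(G\) by \(P(D \mid G)\), possibly weighted by a structure prior \(P(G)\).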

Deepens the learning objectives:
- Can derive the formula for computing the marginal likelihood

- Knows other model selection criteria in addition to the marginal likelihood

 
