Yuan Zou defends her PhD thesis on Model Selection for Bayesian Networks and Sparse Logistic Regression on March 3rd, 2017
M.Sc. Yuan Zou will defend her doctoral thesis On Model Selection for Bayesian Networks and Sparse Logistic Regression on Friday, 3 March 2017 at 2 p.m. in Auditorium B123 of the Exactum Building, University of Helsinki (Gustaf Hällströmin katu 2b). Her opponent is Professor Ioan Tabus (Tampere University of Technology, Finland), her thesis supervisor is Associate Professor Teemu Roos, and the custos is Professor Petri Myllymäki (University of Helsinki). The defence will be held in English.
On Model Selection for Bayesian Networks and Sparse Logistic Regression
Model selection is one of the fundamental tasks in scientific research. In this thesis, we address several research problems in statistical model selection, the task of choosing, among a set of candidates, the statistical model that best fits the data. We focus on model selection problems in Bayesian networks and logistic regression, from both theoretical and practical perspectives.
We first compare different model selection criteria for learning Bayesian networks, focusing on the Fisher information approximation (FIA) criterion. We describe how FIA fails when the candidate models are complex and only limited data are available. We show that although the Bayesian information criterion (BIC) is a coarser approximation than FIA, it achieves better results in most cases.
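As a minimal illustration of BIC-based selection (not the thesis's own experiments), the criterion scores each candidate by its maximized log-likelihood minus a complexity penalty of half the parameter count times the log of the sample size; the model with the highest score wins. The log-likelihoods and parameter counts below are hypothetical:

```python
import math

def bic_score(log_likelihood, num_params, n):
    # BIC in score form: log-likelihood minus (k/2) * log(n).
    # Higher score = preferred model.
    return log_likelihood - 0.5 * num_params * math.log(n)

# Hypothetical candidates: (maximized log-likelihood, parameter count)
candidates = {"simple": (-1050.0, 4), "complex": (-1040.0, 20)}
n = 200  # sample size

best = max(candidates, key=lambda m: bic_score(*candidates[m], n))
# With little data, the penalty outweighs the complex model's small
# likelihood gain, so the simpler model is selected.
```

This also shows the failure mode discussed above in miniature: the penalty term is only an asymptotic approximation, and criteria that refine it (like FIA) can behave badly when the model is complex relative to n.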
Then, we present Semstem, a method based on the structural expectation–maximization algorithm, for learning stemmatic trees, a special type of Bayesian network that models the evolutionary relationships among historical manuscripts. Semstem selects the best model by the maximum-likelihood criterion, which is equivalent to BIC in this case. When applied to two benchmark data sets, Semstem usually achieves higher accuracy and better interpretability than other popular methods.
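Semstem itself handles latent (lost) manuscripts via structural EM, which is beyond a short sketch. But the core idea of maximum-likelihood tree selection can be illustrated with the classical Chow–Liu construction over fully observed variables: the ML tree is the maximum-weight spanning tree under empirical mutual-information edge weights. The toy "manuscripts" below are hypothetical binary sequences where each copy flips 10% of its source's symbols:

```python
import numpy as np
from itertools import combinations

def mutual_information(a, b):
    """Empirical mutual information (in nats) of two binary sequences."""
    mi = 0.0
    for va in (0, 1):
        for vb in (0, 1):
            p_ab = np.mean((a == va) & (b == vb))
            p_a, p_b = np.mean(a == va), np.mean(b == vb)
            if p_ab > 0:
                mi += p_ab * np.log(p_ab / (p_a * p_b))
    return mi

def chow_liu_tree(data):
    """Maximum-likelihood tree over the columns of `data`:
    Prim's algorithm on mutual-information edge weights."""
    d = data.shape[1]
    weights = {(i, j): mutual_information(data[:, i], data[:, j])
               for i, j in combinations(range(d), 2)}
    in_tree, edges = {0}, []
    while len(in_tree) < d:
        # Pick the heaviest edge crossing the cut (exactly one endpoint inside).
        i, j = max(((i, j) for (i, j) in weights
                    if (i in in_tree) != (j in in_tree)),
                   key=lambda e: weights[e])
        edges.append((i, j))
        in_tree |= {i, j}
    return edges

# Toy copying chain: manuscript 1 copies 0 with noise, 2 copies 1 with noise
rng = np.random.default_rng(1)
m0 = rng.integers(0, 2, 300)
m1 = np.where(rng.random(300) < 0.9, m0, 1 - m0)
m2 = np.where(rng.random(300) < 0.9, m1, 1 - m1)
tree = chow_liu_tree(np.column_stack([m0, m1, m2]))
```

The recovered edges follow the copying chain (0–1, 1–2) rather than the weaker indirect dependence 0–2, mirroring how a stemmatic tree reflects the transmission history.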
Before turning to learning another type of Bayesian network, we start with a study on how to efficiently learn interactions among variables. To reduce the search space, we apply basis functions to the input variables and transform the original problem into a model selection problem in logistic regression. We can then use the Lasso to select a small set of effective predictors out of a large set of candidates. We show that the Lasso-based method is more robust than an earlier method across a range of situations.
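A minimal sketch of this pipeline, with hypothetical data: the inputs are expanded with pairwise products as basis functions, and an L1-penalized logistic regression (solved here by plain proximal gradient descent, one of several standard Lasso solvers) drives the coefficients of irrelevant predictors toward zero:

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of the L1 penalty.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_logistic(X, y, lam=0.05, eta=0.1, iters=2000):
    """L1-penalized logistic regression via proximal gradient (ISTA)."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))   # predicted probabilities
        grad = X.T @ (p - y) / n             # gradient of the logistic loss
        w = soft_threshold(w - eta * grad, eta * lam)
    return w

rng = np.random.default_rng(0)
n = 500
x1, x2, x3 = rng.normal(size=(3, n))
# Basis expansion: raw inputs plus all pairwise interaction terms
X = np.column_stack([x1, x2, x3, x1 * x2, x1 * x3, x2 * x3])
# Hypothetical ground truth: only x1 and the x1*x2 interaction matter
logits = 2.0 * x1 + 1.5 * (x1 * x2)
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logits))).astype(float)

w = lasso_logistic(X, y)
# The coefficients for x1 and x1*x2 stay large; the rest shrink to (near) zero.
```

The L1 penalty is what makes exhaustive search over interaction subsets unnecessary: all candidate basis functions enter one convex problem, and selection falls out of the solution's sparsity.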
Finally, we extend the Lasso-based method to learning Bayesian networks with local structure, i.e., regularities in the conditional probability distributions. We show that our method is more suitable than classic methods that do not account for local structure. Moreover, when the local structure is complex, our method outperforms two other methods that are also designed for learning local structure.
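To make "local structure" concrete (a generic illustration, not the thesis's own representation): a full conditional probability table over k binary parents stores one parameter per parent configuration, but if the child's distribution is the same across many configurations (context-specific independence), a tree- or default-table representation needs far fewer parameters. The probabilities below are hypothetical:

```python
# A binary child Y with binary parents A, B, C: the full CPT has 2**3 = 8 rows.
full_cpt_rows = 2 ** 3

def p_y_given_parents(a, b, c):
    """Tree-structured CPD exploiting context-specific independence."""
    if a == 1:
        return 0.9                   # in the context A=1, B and C are irrelevant
    return 0.8 if b == 1 else 0.1    # hypothetical values for the A=0 contexts

# Only 3 distinct parameters (0.9, 0.8, 0.1) instead of 8 — the kind of
# regularity a learner that assumes full CPTs cannot exploit.
```

With fewer effective parameters, less data is needed to estimate the network reliably, which is why methods that ignore local structure fall behind when such regularities are present.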
Availability of the dissertation
An electronic version of the doctoral dissertation is available on the e-thesis site of the University of Helsinki at http://urn.fi/URN:ISBN:978-951-51-2968-0.
Printed copies will be available on request from Yuan Zou: email@example.com.