3 Concepts: Information

Posters

Part of the course (30% of the grade) is to prepare a poster presentation for the joint poster session. Posters may be prepared individually or in pairs.

A poster session is an occasion where you present your topic to the public with the help of a poster on a bulletin board. Your role is to stand near your poster, explain details, and answer questions from the public. Because everyone presents at the same time, you do not need to give a lecture, and the audience can wander around the poster session room and concentrate on the posters whose topics interest them. The session is meant to be relaxed and rather informal.

Guidelines for poster presentation

Some guidelines for designing the poster can be found at the following addresses:

The poster area is 95 cm x 115 cm. Copies of the poster material should be delivered to the instructors after the poster session. The poster session will be held at the Course Seminar at the end of the course. Participation in this seminar is required in order to pass the course.

Poster topics

You may freely choose a poster topic from the following list. Topics that have already been assigned have the name(s) of the presenter(s) after them. If there is only one name, please contact Teemu, Henry, or the person in question directly to find out whether it is possible to prepare the poster in pairs.

The topics are not ordered by difficulty or any other criterion; they are listed in random order. If needed, more topics will be added after the course has started.

Note: You may also prepare a poster on a topic of your own. Ask Henry whether your topic needs refining or whether it is OK as is.

1) Kolmogorov Complexity (Borras Garcia, Rauhala)
Li M., Vitanyi P.,
An Introduction to Kolmogorov Complexity and Its Applications.
Springer-Verlag, New York, 2nd Edition, 1997, Ch. 1.

2) Algorithmic Statistics (Päiväniemi)
Gacs P., Tromp J., Vitanyi P.,
Algorithmic Statistics,
IEEE Transactions on Information Theory, 47, 2001, pp. 2443-2463.

3) Quantum Algorithmic Information Theory (Siren)
Svozil K.,
Quantum algorithmic information theory.
Journal of Universal Computer Science, 2, 1996, pp. 311-346.

4) Randomness and Mathematical Proof (Leskelä, Linnanvuo)
Chaitin G. J.,
Randomness and Mathematical Proof.
Scientific American, 232, May 1975, pp. 47-52.
Reprinted in: Chaitin G. J.,
Information, Randomness and Incompleteness:
Papers on Algorithmic Information Theory. World Scientific, Singapore, 1987, pp. 3-13.

5) Predictive Minimum Description Length Principle
Kuusisto S.,
Application of the PMDL Principle to the Induction of Classification Trees.
Tampere University of Technology, Publications 233, 1998, Ch. 3.

6) Minimum Message Length Principle
Baxter R.,
Minimum Message Length Inductive Inference: Theory and Applications,
Ph.D. Thesis, Department of Computer Science, Monash University, 1996, Ch. 1.

7) Adaptive Dictionary Encoders: Ziv-Lempel Coding (Aunimo)
Bell T., Cleary J., Witten I.,
Text Compression.
Prentice Hall, New Jersey, 1990, Ch. 8.3.

8) "Bits-back" Encoding (Haapasalo, Hyvärinen)
Hinton G. E. and Zemel R. S.
Autoencoders, Minimum Description Length, and Helmholtz Free Energy.
Advances in Neural Information Processing Systems 6.
Cowan J. D., Tesauro G. and Alspector J. (Eds.),
Morgan Kaufmann, San Mateo, CA, 1994.

Zemel R. S. and Hinton G. E.,
Learning Population Codes by Minimizing Description Length.
Neural Computation, 7, 1995, pp. 549-564.

Hinton G. E., Dayan P., Frey B. J. and Neal R.,
The wake-sleep algorithm for unsupervised neural networks.
Science, 268, 1995, pp. 1158-1161.

Dayan P., Hinton G. E., Neal R., and Zemel R. S.,
The Helmholtz Machine.
Neural Computation, 7, 1995, pp. 1022-1037.

9) Information Bottleneck (Tuulos)

10) Golomb and Rice Codes and Their Applications to Coding Extremely Biased Coin Flips (Heino, Kohonen)

11) MML Based Learning of Finite Mixture Models (Ojanpää)

12) Communication Complexity (Mielikäinen)

13) Constrained Maximum Entropy Tomography (Mäkelä)
Öktem O.,
A short overview of the COMET (COnstrained Maximum Entropy Tomography) methodology for solving inverse problems.

Skoglund U., Öfverstedt L.-G., Burnett R. M., Bricogne G.,
Maximum-Entropy Three-Dimensional Reconstruction with Deconvolution of the Contrast Transfer Function: A Test Application with Adenovirus.
Journal of Structural Biology, 117(3), 1996.

14) Randomness Purification (Kääriäinen)

15) An Application of Kolmogorov Complexity to Cognitive Science (Miettinen)

16) Linear Prediction Coding (Tarvainen)

17) Gene Expression Data Classification via MDL (Pitkänen)
Jörnsten R., Yu B.,
Simultaneous Gene Clustering and Subset Selection for Classification via MDL,
submitted, available at http://www.stat.rutgers.edu/~rebecka/

 
