3 Concepts: Information
Posters
A part of the course (30% of the grade) is to prepare a poster
presentation for the joint poster session. The posters will be
prepared individually or in pairs.
A poster session is an occasion where you present your topic to the
audience with the help of a poster on a bulletin board. Your role is to
stand near your poster, explain details, and answer questions from the
audience. Because everyone presents at the same time, you do not
need to give a lecture, and the audience can wander around the poster
session room and concentrate on the posters whose topics interest
them. The session is meant to be relaxed and rather informal.
Guidelines for poster presentation
Some guidelines for designing the poster can be found at the
following addresses:
» Henry's slides on posters
(PostScript)
» Poster preparation guidelines in Chemistry
» SIAM guidelines for preparing posters
» ... and many more via different search engines.
The poster area is 95 cm × 115 cm. Copies of the poster material
should be delivered to the instructors after the poster session.
The poster session will be held at the Course Seminar at the
end of the course. Participation in this seminar is required in
order to pass the course.
Poster topics
You may freely choose a poster topic for yourself
from the following list. Topics that have already been assigned
have the name(s) of the presenter(s) after them. If there
is only one name, please contact Tomi, Petri, or the
person in question directly to find out whether it is possible
to prepare the poster in pairs.
The topics are not ordered by difficulty, for
example - they appear in random order.
If needed, more topics will be added after the course has started.
Note: You may also prepare a poster on a topic of your own.
Ask Petri whether your topic needs improving or whether it is OK.
- Kolmogorov Complexity
- Presenters: Merivuori & Saaristo
- Source material:
Li M., Vitanyi P.,
An Introduction to Kolmogorov Complexity and its Applications.
Springer-Verlag, New York, 2nd Edition, 1997, Ch. 2.
- Algorithmic Statistics
- Presenters:
- Source material: Gacs P., Tromp J., Vitanyi P.,
Algorithmic Statistics,
IEEE Transactions on Information Theory, 47, 2001, pp. 2443-2463.
- Quantum Algorithmic Information Theory
- Randomness and Mathematical Proof
- Presenters:
- Source material: Chaitin G. J.,
Randomness and Mathematical Proof.
Scientific American, 232, May 1975, pp. 47-52.
In Chaitin G. J.,
Information, Randomness and Incompleteness:
Papers on Algorithmic Information Theory. World Scientific Publishing
Co. Pte. Ltd., Singapore, 1987, pp. 3-13.
- Predictive Minimum Description Length Principle
- Presenters:
- Source material: Kuusisto S.,
Application of the PMDL Principle to the Induction of Classification
Trees.
Tampere University of Technology, Publications 233, 1998, Ch. 3.
- Minimum Message Length Principle
- Presenters: Heilimo & Nieminen
- Source material:
Baxter R.,
Minimum Message Length Inductive Inference: Theory and
Applications,
Ph.D. Thesis, Department of Computer Science, Monash University, 1996,
Ch. 1.
- Adaptive Dictionary Encoders: Ziv-Lempel Coding
- Presenters: Halmetoja & Junttila
- Source material: Bell T., Cleary J., Witten I.,
Text Compression.
Prentice Hall, New Jersey, 1990, Ch. 8.3.
- Dynamic Huffman Codes
- Presenters:
- Source material: Vitter J. S.,
Dynamic Huffman coding,
ACM Trans. Mathematical Software, and Collected Algorithms of ACM.
- "Bits-back" Encoding
- Presenters: Konttori & Nyberg
- Source material:
- Hinton G. E. and Zemel R. S.
Autoencoders, Minimum Description Length, and Helmholtz Free Energy.
Advances in Neural Information Processing Systems 6.
Cowan J. D., Tesauro G. and Alspector J. (Eds.),
Morgan Kaufmann, San Mateo, CA, 1994.
- Zemel R. S. and Hinton G. E.,
Learning Population Codes by Minimizing Description Length.
Neural Computation, 7, 1995, pp. 549-564.
- Hinton G. E., Dayan P., Frey B. J. and Neal R.,
The wake-sleep algorithm for unsupervised neural networks.
Science, 268, 1995, pp. 1158-1161.
- Dayan P., Hinton G. E., Neal R., and Zemel R. S.,
The Helmholtz Machine.
Neural Computation, 7, 1995, pp. 1022-1037.
- Constrained Maximum Entropy Tomography
- Presenters:
- Source material:
- Gene Expression Data Classification via MDL
- Presenters:
- Source material: Rebecka Jörnsten and Bin Yu,
Simultaneous Gene Clustering and Subset Selection for
Classification via MDL,
submitted, available at
http://www.stat.rutgers.edu/~rebecka/
- Information Bottleneck
- Turbo Codes
- Sequential Decision Theory and Algorithmic Information Theory
- Presenter: Etelävuori
- Source material:
- Predictive methods for textual input
- Presenters: Ahosola & Kataja
- Source material:
- Shannon, C. E. (1951). Prediction and entropy of printed English. Bell System Technical Journal, 30, 51-64.
- Moffat A., Implementing the PPM data compression scheme.
IEEE Transactions on Communications, 38(11), Nov. 1990,
pp. 1917-1921.
- Ward D. J., Blackwell A. F., MacKay D. J. C.,
Dasher - a Data Entry Interface Using Continuous Gestures and Language Models.
In Proceedings of UIST 2000.
- MacKenzie, I. S., & Soukoreff, R. W. (2002).
Text entry for mobile computing: Models and methods, theory and practice.
Human-Computer Interaction, 17, 147-198.
- Normalized Compression Distance / Normalized Google Distance
- Presenters:
- Source material:
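As a small taste of the last topic above: the normalized compression
distance approximates the (uncomputable) Kolmogorov complexity with the
output length of a real compressor. A minimal sketch in Python, using
zlib as the stand-in compressor (the example strings are invented for
illustration and are not part of the course material):

```python
import zlib

def comp_len(data: bytes) -> int:
    """Approximate Kolmogorov complexity C(x) by compressed length."""
    return len(zlib.compress(data, 9))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance:
    NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y))."""
    cx, cy, cxy = comp_len(x), comp_len(y), comp_len(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

# Similar strings compress well together, so their NCD is small.
a = b"the quick brown fox jumps over the lazy dog " * 10
b = b"the quick brown fox jumps over the lazy cat " * 10
c = b"Kolmogorov complexity is not computable, alas " * 10
print(ncd(a, b) < ncd(a, c))
```

Since a real compressor only gives an upper bound on Kolmogorov
complexity, NCD values computed this way can slightly exceed 1.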