Information theory
Prerequisites
An elementary knowledge of probability theory is required. The course is aimed at third-year mathematics and physics students and is to a large extent accessible to interested second-year students; its content is also suitable for interested master's students and postgraduates.
Program
- Shannon entropy, relative entropy, mutual information.
- Asymptotic equipartition property. Data compression. The Lempel-Ziv algorithm.
- Entropy rates of a stochastic process. Markov chains, random walk on a graph.
- Data compression: Kraft inequality, Huffman codes, Shannon coding (illustrated in the sketch after this list).
- Prediction, entropy and gambling: Kelly's criterion, horse races. Entropy of languages. A gambling estimate of the entropy of English.
- Information theory, coding, data compression and prediction.
- Channel capacity.
- Kolmogorov complexity and entropy. Randomness and pseudorandomness.
- Entropy and dynamical systems: topological entropy, Kolmogorov-Sinai entropy, Bernoulli schemes, topological and measurable Markov chains, the Perron-Frobenius theorem, the Google PageRank algorithm.
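As a quick illustration of how the compression topics above fit together, here is a minimal, self-contained Python sketch (a toy example, not course material): it computes the Shannon entropy of a finite distribution, derives Huffman codeword lengths, and checks both the Kraft inequality and Shannon's source coding bound H(X) <= L < H(X) + 1.

```python
import heapq
import math

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum p log2 p, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def huffman_code_lengths(probs):
    """Return Huffman codeword lengths for a probability distribution.

    Builds the Huffman tree with a min-heap; each merge adds one bit
    to the depth of every symbol in the two merged subtrees.
    """
    # Heap entries: (probability, tie-breaker, list of symbol indices).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)  # unique tie-breaker so lists are never compared
    while len(heap) > 1:
        p1, _, syms1 = heapq.heappop(heap)
        p2, _, syms2 = heapq.heappop(heap)
        for s in syms1 + syms2:
            lengths[s] += 1  # merged symbols move one level deeper
        heapq.heappush(heap, (p1 + p2, counter, syms1 + syms2))
        counter += 1
    return lengths

probs = [0.5, 0.25, 0.125, 0.125]
H = shannon_entropy(probs)
lengths = huffman_code_lengths(probs)
avg_len = sum(p * l for p, l in zip(probs, lengths))
kraft = sum(2.0 ** -l for l in lengths)

print(f"H(X) = {H:.3f} bits, average Huffman length = {avg_len:.3f}")
print(f"Kraft sum = {kraft:.3f} (<= 1, as the Kraft inequality requires)")
assert H <= avg_len < H + 1  # Shannon's source coding bound
```

For the dyadic distribution used here the Huffman code meets the entropy exactly (average length 1.75 bits), the extreme case of the bound; for general distributions the average length lies strictly between H(X) and H(X) + 1.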
Learning objectives
The aim of the course is to introduce the fundamental notions of information theory and their interaction with related disciplines, in particular with ergodic theory.
Bibliographic references
- T. M. Cover and J. A. Thomas, *Elements of Information Theory*, 2nd edition, Wiley, 2006.
- A. C. G. Mennucci and S. K. Mitter, *Probabilità e informazione*, 2nd edition, Edizioni della Normale, 2008.
- D. J. C. MacKay, *Information Theory, Inference, and Learning Algorithms*, Cambridge University Press, 2003.

Further bibliographic references will be given during the lessons.
Modules
| Module | Hours | Credits (CFU) | Lecturers |
|---|---|---|---|
| Information theory | 40 | 6 | Stefano Marmi |
| Supplementary teaching activity | 10 | 0 | Stefano Marmi |