Integrative teaching
Stefano Marmi
Examination procedure
Oral exam
Prerequisites
An elementary knowledge of probability theory is required. The course is suitable for third-year mathematics and physics students and is to a large extent accessible to interested second-year students. Its content is also suitable for interested master's degree students and postgraduates.
Syllabus
Shannon entropy. Relative entropy. Mutual information. Asymptotic equipartition property. Data compression. Lempel-Ziv algorithm. Entropy rates of a stochastic process. Markov chains, random walk on a graph. Data compression, Kraft inequality, Huffman codes, Shannon coding. Prediction, entropy and gambling: Kelly's criterion, horse races. Entropy of languages. Gambling estimate of the entropy of English. Information theory, coding, data compression and prediction. Channel capacity. Kolmogorov complexity and entropy. Randomness and pseudorandomness. Entropy and dynamical systems: topological entropy and Kolmogorov-Sinai entropy. Bernoulli schemes. Topological and measurable Markov chains. Perron-Frobenius theorem. Google PageRank algorithm.
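As a small informal illustration of two of the topics above (not part of the official course material), the Python sketch below computes the Shannon entropy of a finite probability vector and the PageRank vector of a toy directed graph by power iteration, i.e. the stationary distribution whose existence and uniqueness follow from the Perron-Frobenius theorem. The graph, the damping factor 0.85 and the function names are illustrative assumptions.

```python
# Illustrative sketch: Shannon entropy and PageRank by power iteration.
# The example graph and parameter values are assumptions, not course data.

import math


def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log2 p_i of a probability vector."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)


def pagerank(links, damping=0.85, iters=100):
    """Power iteration on the damped random walk over a directed graph.

    `links` maps each node to the list of nodes it points to. The damped
    chain is irreducible and aperiodic, so by the Perron-Frobenius theorem
    it has a unique stationary distribution, which power iteration finds.
    """
    nodes = list(links)
    n = len(nodes)
    rank = {v: 1.0 / n for v in nodes}
    for _ in range(iters):
        new = {v: (1.0 - damping) / n for v in nodes}
        for v, outs in links.items():
            if outs:
                # Distribute this node's rank evenly along its out-links.
                share = damping * rank[v] / len(outs)
                for w in outs:
                    new[w] += share
            else:
                # Dangling node: spread its rank uniformly over all nodes.
                for w in nodes:
                    new[w] += damping * rank[v] / n
        rank = new
    return rank


if __name__ == "__main__":
    print(shannon_entropy([0.5, 0.25, 0.25]))   # 1.5 bits
    web = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
    print(pagerank(web))
```

Running the script prints 1.5 bits for the distribution (1/2, 1/4, 1/4) and a rank vector whose entries sum to 1.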
Bibliographical references
Cover and Thomas: Elements of Information Theory, 2nd edition, Wiley (2006).
Mennucci and Mitter: Probabilità e informazione, 2nd edition, Edizioni della Normale (2008).
MacKay: Information Theory, Inference, and Learning Algorithms, Cambridge University Press (2003).
Further bibliographic references will be given during the lectures.