Information Theory
Prerequisites
An elementary knowledge of probability theory is required.
The course is aimed at third-year mathematics and physics students, but it is largely accessible to interested second-year students, and should also be useful to more advanced students and interested PhD students.
Programme
Shannon entropy. Relative entropy. Mutual information.
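As a quick illustration of these first quantities, here is a minimal Python sketch (not part of the course material; the function names and example distributions are illustrative) computing the entropy of a coin flip and a relative entropy between two distributions:

```python
import math

def entropy(p):
    """Shannon entropy H(p), in bits, of a discrete distribution p."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def relative_entropy(p, q):
    """Kullback-Leibler divergence D(p || q), in bits."""
    return sum(x * math.log2(x / y) for x, y in zip(p, q) if x > 0)

print(entropy([0.5, 0.5]))                       # 1.0: a fair coin carries one bit
print(entropy([0.9, 0.1]))                       # ~0.469: a biased coin carries less
print(relative_entropy([0.5, 0.5], [0.9, 0.1]))  # ~0.737; D(p || q) >= 0 always
# Mutual information I(X; Y) is D(p(x, y) || p(x) p(y)) applied to the joint law.
```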
Asymptotic equipartition property. Data compression. Lempel-Ziv algorithm. Entropy rates of a stochastic process.
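For a concrete feel of the Lempel-Ziv idea, the following sketch implements an LZ78-style parsing; the course may treat a different variant, so this is purely illustrative:

```python
def lz78_parse(s):
    """Parse s into (dictionary index, next character) pairs, LZ78-style."""
    dictionary = {"": 0}
    phrases, current = [], ""
    for ch in s:
        if current + ch in dictionary:
            current += ch          # extend the current phrase
        else:
            phrases.append((dictionary[current], ch))
            dictionary[current + ch] = len(dictionary)
            current = ""
    if current:  # leftover incomplete phrase at the end of the input
        phrases.append((dictionary[current[:-1]], current[-1]))
    return phrases

print(lz78_parse("abababab"))
# [(0, 'a'), (0, 'b'), (1, 'b'), (3, 'a'), (0, 'b')]
```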
Markov chains, random walk on a graph.
Data compression, Kraft inequality, Huffman codes, Shannon coding.
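The following minimal sketch builds a Huffman code for a toy source and checks the Kraft inequality on the resulting codeword lengths; the example distribution and all names are illustrative:

```python
import heapq

def huffman_code(freqs):
    """Build a prefix code for {symbol: probability} via Huffman's algorithm."""
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(freqs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        # Merge the two least probable subtrees, prepending one bit to each side.
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return heap[0][2]

code = huffman_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125})
print(code)  # {'a': '0', 'b': '10', 'c': '110', 'd': '111'}
# Kraft inequality: sum of 2^(-length) over codewords is <= 1 for any prefix code.
print(sum(2 ** -len(w) for w in code.values()))  # 1.0
```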
Prediction, entropy and gambling: Kelly's criterion, horse races. Entropy of languages. Gambling estimate of the entropy of English.
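As an illustration of Kelly's criterion, the sketch below computes the optimal betting fraction and the resulting growth rate of log-wealth; the win probability and odds are made-up example values:

```python
import math

def kelly_fraction(p, b):
    """Optimal fraction of wealth to bet when the win probability is p
    and a winning bet pays b-to-1."""
    return (p * (b + 1) - 1) / b

p, b = 0.6, 1.0  # 60% win probability at even odds (illustrative numbers)
f = kelly_fraction(p, b)
# Expected growth rate of log-wealth when betting the Kelly fraction.
growth = p * math.log2(1 + f * b) + (1 - p) * math.log2(1 - f)
print(f, growth)  # 0.2 and ~0.029 bits per bet; at even odds this equals 1 - H(p)
```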
Channel capacity.
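For instance, the capacity of a binary symmetric channel with crossover probability p is C = 1 - H(p), which the following snippet evaluates for an illustrative value of p:

```python
import math

def binary_entropy(p):
    """Entropy, in bits, of a Bernoulli(p) variable."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p) if 0 < p < 1 else 0.0

# Capacity of a binary symmetric channel with crossover probability 0.11.
print(1 - binary_entropy(0.11))  # ~0.5 bits per channel use
```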
Kolmogorov complexity and entropy. Randomness and pseudorandomness.
Entropy and dynamical systems:
Topological entropy. Kolmogorov-Sinai entropy. Bernoulli schemes. Topological and measurable Markov chains. Perron-Frobenius theorem. Google PageRank algorithm.
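The PageRank vector is the Perron eigenvector of the damped link matrix, which the Perron-Frobenius theorem guarantees to be unique. The toy power-iteration sketch below illustrates this; the graph, damping factor, and iteration count are made-up example choices:

```python
def pagerank(links, d=0.85, iters=100):
    """links: {node: [outgoing neighbours]}. Returns the stationary rank vector."""
    nodes = list(links)
    n = len(nodes)
    rank = {v: 1 / n for v in nodes}
    for _ in range(iters):
        new = {v: (1 - d) / n for v in nodes}  # teleportation term
        for v, outs in links.items():
            if outs:
                share = rank[v] / len(outs)
                for w in outs:
                    new[w] += d * share
            else:  # dangling node: spread its mass uniformly
                for w in nodes:
                    new[w] += d * rank[v] / n
        rank = new
    return rank

print(pagerank({"A": ["B", "C"], "B": ["C"], "C": ["A"]}))
# C collects the largest rank; the ranks sum to 1 by construction.
```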
Educational aims
The objective of the course is to introduce the fundamental concepts of information theory.
Bibliographical references
Cover and Thomas, Elements of Information Theory, 2nd edition, Wiley, 2006.
Mennucci and Mitter, Probabilità e informazione, 2nd edition, Edizioni della Normale, 2008.
MacKay, Information Theory, Inference, and Learning Algorithms, Cambridge University Press, 2003.
Further bibliographical references will be given during the lectures.