Materials for the course on Information Theory, University of Montpellier, autumn 2019.

Program of parts 1 and 3 of the course and the reading list (see also a version of the program annotated with references).

Notes on lectures 1-5.

Notes on lectures 11-13.

Homework:

  1. homework for the lecture of Sep 10
  2. homework for the lecture of Sep 17
  3. homework for the lecture of Sep 24
  4. homework for the lecture of Oct 1
  5. homework for the lecture of Oct 8 (see a solution to exercise 1)
  6. homework for the lecture of Dec 3
  7. homework for the lecture of Dec 10
  8. more exercises for the last 3 weeks of the course
For more exercises, see Chapters 1, 2, and 7 of the Shen-Vereshchagin-Uspensky book.

Bibliography for the first 5 weeks of the course — classical information theory:

In English:

  1. T.M. Cover and J.A. Thomas. Elements of Information Theory. John Wiley & Sons, 2012.
  2. D. MacKay. Information Theory, Inference, and Learning Algorithms. Cambridge University Press, 2003.

In French:

  1. Y. Ollivier. Aspects de l'entropie en mathématiques. 2002.
  2. N. Sendrier. Introduction à la théorie de l'information. 2007.

In Russian:

  1. Н.К. Верещагин, Е.В. Щепин. Информация, кодирование и предсказание. МЦНМО, 2012.
  2. А. Яглом, И. Яглом. Вероятность и информация. Наука, 1973.

Bibliography for the last 3 weeks of the course — Kolmogorov complexity and communication complexity:

In English:

  1. T.M. Cover and J.A. Thomas. Elements of Information Theory. John Wiley & Sons, 2012.
  2. A. Shen, V. Uspensky, and N. Vereshchagin. Kolmogorov Complexity and Algorithmic Randomness. Vol. 220. American Mathematical Society, 2017.
  3. E. Kushilevitz and N. Nisan. Communication Complexity. Cambridge University Press, New York, 1996.

In French:

  1. B. Durand and A. Zvonkin. Complexité de Kolmogorov.

In Russian:

  1. Н.К. Верещагин, Е.В. Щепин. Информация, кодирование и предсказание. МЦНМО, 2012.
  2. В. Успенский, А. Шень, Н. Верещагин. Колмогоровская сложность и алгоритмическая случайность. 2017.