This is a page of the online seminar on algorithmic aspects of information theory,
a follow-up to Dagstuhl Seminar 22301.
If you want to add your address to the mailing list, or if you want to suggest a talk, please contact
us.

12 October 2022
Laszlo Csirmaz (Central European University, Budapest).
Around entropy inequalities.
Abstract:
This will be an introductory talk on linear inequalities for Shannon's entropy:
what the problem is, what the methods are, what is known and what is not known.

26 October 2022
Andrei Romashchenko (CNRS LIRMM, Montpellier).
Basic proof techniques for entropic inequalities.
Abstract:
We will consider again several classical theorems mentioned in the survey by
Laszlo Csirmaz
(the first lecture) and discuss the proof techniques behind these results.

9 November 2022
Andrei Romashchenko (CNRS LIRMM, Montpellier).
The Zhang-Yeung inequality: an attempt to explain.
Abstract:
We will discuss a magical proof of the first example of a non-Shannon-type inequality
(Zhang-Yeung, 1998) and try to explain how this and similar arguments could be invented without magic.

23 November 2022
Laszlo Csirmaz (Central European University, Budapest). The entropy region is not polyhedral.
Abstract:
We present the two ingredients of the proof: a set of
entropy inequalities and a collection of sample distributions,
and how these yield the famous result of Fero Matúš.
If time permits, we sketch the proof of the inequalities used, and
indicate the shortcomings of other applications of the same idea.

7 December 2022
Alexander Shen (CNRS LIRMM, Montpellier). Information inequalities: combinatorial interpretation.
Abstract:
Information inequalities like H(A,B) ≤ H(A)+H(B) or H(A)+H(A,B,C) ≤ H(A,B)+H(A,C)
have a natural interpretation. It is enough to check them for "uniform"
random variables that can be constructed starting from a group and its
subgroups (Chan, Yeung). On the other hand, they can be translated into the
language of Kolmogorov complexity. Combining these tools, one can get a
combinatorial characterization of the class of information inequalities.
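Both example inequalities are Shannon-type and hold for every joint distribution; a minimal Python sketch verifies them numerically on a random distribution of three binary variables:

```python
import itertools, math, random

def H(joint, axes):
    """Shannon entropy (in bits) of the marginal of `joint` on `axes`;
    `joint` maps outcome tuples to probabilities."""
    marg = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in axes)
        marg[key] = marg.get(key, 0.0) + p
    return -sum(p * math.log2(p) for p in marg.values() if p > 0)

# A random joint distribution of three binary variables A, B, C.
random.seed(0)
weights = [random.random() for _ in range(8)]
joint = {abc: w / sum(weights)
         for abc, w in zip(itertools.product((0, 1), repeat=3), weights)}

A, B, C = 0, 1, 2
# Subadditivity: H(A,B) <= H(A) + H(B).
assert H(joint, (A, B)) <= H(joint, (A,)) + H(joint, (B,)) + 1e-9
# Submodularity: H(A) + H(A,B,C) <= H(A,B) + H(A,C).
assert H(joint, (A,)) + H(joint, (A, B, C)) <= \
       H(joint, (A, B)) + H(joint, (A, C)) + 1e-9
```

Checking on random distributions can only falsify an inequality, of course; proving it for all distributions is exactly the problem discussed in the seminar.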

04 January 2023 Frederique Elise Oggier (Nanyang Technological University, Singapore).
An overview of Ingleton's inequality from a group theory point of view.
Abstract:
We will review several works that use group theory to approach Ingleton's inequality and entropic vectors.

18 January 2023 LI Cheuk Ting (Chinese University of Hong Kong).
Existential Information Inequalities, Non-Shannon Inequalities, and Automated Theorem Proving.
Abstract:
Existential information inequality is a generalization of linear
information inequalities, where random variables can not only be
universally quantified, but also existentially quantified. We study the
structure of existential information inequalities, and describe
algorithms for automated verification of existential information
inequalities, which are also useful for proving (non-existential)
non-Shannon inequalities. We also describe how a wide range of results
in network information theory (e.g., 32 out of 56 theorems in Chapters
1-14 of Network Information Theory by El Gamal and Kim) can be proved
automatically using the proposed algorithms. The algorithms are
implemented in the PSITIP framework (github.com/cheuktingli/psitip).

01 February 2023 Alexander Kozachinskiy (Catholic University of Chile).
On the recent progress on Frankl's conjecture.
Abstract:
Frankl's conjecture states that for any nonempty family of nonempty
finite sets which is closed under finite union there exists an element
belonging to at least 1/2 of the sets from the family. I will present a
recent result of Gilmer showing that this conjecture holds with 1/2 replaced by some positive constant.

15 February 2023 Andrei Romashchenko.
Quasi-uniform distributions and the method of random covers.
Abstract:
We will talk about variations of the classical method of typical
sequences and its applications useful in studying information
inequalities. In particular, we will discuss the Ahlswede-Körner lemma
in the form that can be used to prove non-Shannon-type inequalities for
entropy.

1 March 2023
Milan Studený (Institute of Information Theory and Automation, Prague).
On conditional Ingleton inequalities.
Abstract:
Linear information inequalities valid for entropy functions induced by
discrete random variables play an important role in the task of
characterizing discrete conditional independence structures.
Specifically, the so-called conditional Ingleton inequalities in the
case of 4 random variables are in the center of interest: these are
valid under conditional independence assumptions on the inducing random
variables. Four inequalities of this form were revealed earlier: by
Yeung and Zhang (1997), by Matúš (1999), and by Kaced and Romashchenko
(2013). In our recent paper (2021), the fifth inequality of this type
was found. These five information inequalities can be used to
characterize all conditional independence structures induced by four
discrete random variables. One of the open problems in that 2021 paper
was whether the list of conditional Ingleton inequalities over 4 random
variables is complete; the analysis can be completed by a recent finding
of Boege (2022) answering that question.

15 March 2023 Vlad Demartsev (Max Planck Institute of Animal Behavior).
Economy of vocal repertoires: Zipf's laws in animal communication.
Abstract:
Zipf's Law of Brevity (a negative correlation of word length
with frequency of use) has been found across multiple lexicons and
text corpora and is often claimed to be one of the unifying features of human
language. This intriguing linguistic regularity could be explained by
the Principle of Least Effort — a compromise between the need for
transferring information in a detailed and comprehensive manner and the
pressure for minimising the effort (cost) associated with producing
signals.
If Zipf's principles are indeed fundamental to communication, we should
be able to find them in animal systems. Animal communication systems are
likely to be constrained by the same fundamental tradeoff between
minimizing signalling costs and maintaining informational integrity.
However, the pressure towards optimization of signalling is not the only
force shaping animal vocal repertoires. Factors like sexual selection,
social organization and habitat features can generate evolutionary
forces which might push vocal repertoires further away from Zipfian
optimization.
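The brevity effect is easy to probe numerically: on any word-frequency table, compute the correlation between word length and (log) frequency and check its sign. A small Python sketch on a made-up toy corpus (real studies use large corpora):

```python
import math
from collections import Counter

# Toy corpus, invented for illustration only.
text = ("the cat sat on the mat the dog saw the cat and the dog ran "
        "a small cat and a small dog met near the extraordinarily long fence")
freq = Counter(text.split())

lengths = [len(w) for w in freq]
log_freqs = [math.log(c) for c in freq.values()]

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / math.sqrt(vx * vy)

r = pearson(lengths, log_freqs)
print(f"length vs. log-frequency correlation: r = {r:.2f}")
```

On this toy corpus the frequent words are short and the long words are rare, so the correlation comes out negative, as the Law of Brevity predicts.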

22 March 2023
Oriol Farràs (Universitat Rovira i Virgili).
Introduction to Secret Sharing Schemes: Constructions.
Abstract:
A secret sharing scheme is a method by which a dealer distributes shares
to parties such that only authorized subsets of parties can reconstruct
the secret. The family of these authorized subsets is called the access
structure of the scheme. This introductory talk is focused on the
problem of finding efficient secret sharing schemes for general access
structures. We will see some general constructions as well as their
limitations. We will review connections between the problem of finding
efficient schemes and other problems, such as finding small monotone
formulas and monotone span programs for monotone Boolean functions, and
problems of matroid theory and entropy optimization.
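A classic instance of the general constructions mentioned above is Shamir's threshold scheme, where the access structure consists of all sets of at least k parties; a compact Python sketch over a prime field:

```python
import random

P = 2**127 - 1  # a Mersenne prime; all arithmetic is over GF(P)

def share(secret, k, n):
    """Split `secret` into n shares so that any k of them reconstruct it:
    evaluate a random degree-(k-1) polynomial with constant term `secret`."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over GF(P)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = share(secret=42, k=3, n=5)
assert reconstruct(shares[:3]) == 42   # any 3 shares suffice
assert reconstruct(shares[2:]) == 42
```

Here the scheme is ideal (each share is as long as the secret); the hard question raised in the abstract is what happens for general, non-threshold access structures.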

29 March 2023 Carles Padro (Universitat Politècnica de Catalunya).
Lower Bounds in Secret Sharing from Information Theory.
Abstract:
Information theory has been used for decades in the search for lower
bounds in secret sharing. The main achievements and limitations of that
technique are discussed in this survey talk. Among the former are Csirmaz's
general lower bound, which is still the best one known, and the
determination of exact optimal information ratios for several families of
access structures. Among the latter, we give some intuition on the limits
of the method and a comparison with the existing lower bounds for linear
secret sharing schemes.

12 April 2023
Tianren Liu (Peking University).
From Private Information Retrieval to Secret Sharing.
Abstract:
We will revisit the first general secret sharing scheme with 2^cn
complexity, where c is strictly smaller than 1. Until quite recently,
the best known complexity was about 2^n. Our work established a
connection between secret sharing and private information retrieval.
The talk is based on joint works with Vinod Vaikuntanathan and Hoeteck
Wee.

26 April 2023
Hung Q. Ngo (relationalAI).
An Information Theoretic Approach to Estimating Query Size Bounds.
Abstract:
Cardinality estimation is one of the most important problems in database management. One
aspect of cardinality estimation is to derive a good upper bound on the output size of a query,
given a statistical profile of the inputs. In recent years, a promising information-theoretic
approach was devised to address this problem, leading to robust cardinality estimators which
are used in practice.
The information-theoretic approach led to many interesting open questions surrounding
optimizing a linear function over the almost-entropic or polymatroidal cones. This talk
introduces the problem, the approach, summarizes some known results, and lists open
questions.
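A concrete instance of such an output-size bound is the AGM bound for the triangle query, which follows from submodularity of entropy: the output of R(A,B) ⋈ S(B,C) ⋈ T(A,C) has at most sqrt(|R|·|S|·|T|) tuples. A small numerical check on random relations (an illustration, not from the talk):

```python
import itertools, random

random.seed(1)
N, dom = 30, range(12)

def rand_rel(n):
    """A random binary relation with n distinct tuples over dom x dom."""
    return set(random.sample(list(itertools.product(dom, dom)), n))

R, S, T = rand_rel(N), rand_rel(N), rand_rel(N)  # R(A,B), S(B,C), T(A,C)

# Output of the triangle query Q(A,B,C) = R(A,B) AND S(B,C) AND T(A,C).
out = [(a, b, c) for (a, b) in R for (b2, c) in S
       if b == b2 and (a, c) in T]

# AGM bound: |Q| <= sqrt(|R| * |S| * |T|) = N^(3/2).
assert len(out) <= (len(R) * len(S) * len(T)) ** 0.5
```

The bound is tight in the worst case, and the LP that produces it is exactly an optimization over (poly)matroidal entropy constraints.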

10 May 2023
Raymond W. Yeung (The Chinese University of Hong Kong).
Entropy Inequalities: My Personal Journey.
Abstract:
Shannon's information measures, namely entropy, mutual information, and
their conditional versions, can each be expressed as a
linear combination of unconditional joint entropies. These measures are
central in information theory, and constraints on these measures, in
particular in the form of inequalities, are of fundamental interest. It
is well known that all Shannon's information measures are nonnegative,
forming a set of inequalities on entropy. However, whether these are all
the constraints on entropy was unknown, and the problem had been rather
evasive until the introduction of the entropy function region in the
late 1990s.
This talk consists of two parts. The first part is about how I became
interested in this subject in the late 1980s. It began with my PhD
thesis, in which I needed to use inequalities on Shannon's information
measures (viz. information inequalities, or equivalently entropy
inequalities) to prove converse coding theorems. During that time I was
also intrigued by the underlying set-theoretic structure of Shannon's
information measures, namely their representation by diagrams that
resemble the Venn diagram. In 1991 I introduced the I-Measure to
formally establish the one-to-one correspondence between set theory and
Shannon's information measures. In the mid-1990s, I introduced the
entropy function region Γ* that provides a geometrical interpretation of
entropy inequalities. Shortly after, Zhen Zhang and I discovered
so-called non-Shannon-type inequalities which are beyond the
nonnegativity of Shannon's information measures. Since then this subject
has been under active research.
A byproduct of the entropy function region and the associated
geometrical interpretation is the machine-proving of entropy
inequalities. In the second part of this talk, I will discuss the
research along this line, including some recent results.

24 May 2023
Mahmoud Abo Khamis (relationalAI).
The Polymatroid Bound: Extensions and Applications in Database Query Evaluation (part 1).
Abstract:
Information theory has been used to compute upper bounds on the output
sizes of database queries as well as to devise matching algorithms that
can answer these queries within these bounds. The polymatroid bound
generalizes many of these bounds and has a matching query evaluation
algorithm, called PANDA [Abo Khamis et al, PODS’17].
In this talk, we present extensions of the polymatroid bound that
utilize the query structure to derive new constraints on entropies. We
state open problems regarding the relationship between these extensions
and the original bound. On the algorithmic side, we present the PANDA
algorithm in detail and discuss its shortcomings and related open
problems.

7 June 2023
Mahmoud Abo Khamis (relationalAI).
The Polymatroid Bound: Extensions and Applications in Database Query Evaluation (part 2).
Abstract:
This is the second part of the talk, the first part of which took place
on May 24.
I continue our discussion of informationtheoretic upper bounds on the
output sizes of database queries. I will recap the polymatroid bound and
present a corresponding query evaluation algorithm, called PANDA, whose
runtime matches the polymatroid bound
[Abo Khamis et al, PODS’17].
I will also discuss this algorithm's shortcomings and related open problems.
Acquaintance with the first part of the talk is very desirable, although formally it is not required.
Bibliography:
Mahmoud Abo Khamis, Hung Q. Ngo, Dan Suciu.
What do Shannon-type Inequalities, Submodular Width, and Disjunctive Datalog have to do with one another?
arXiv:1612.02503.
(Also in Proceedings of the 36th ACM SIGMOD-SIGACT-SIGAI Symposium on Principles of Database Systems, 2017.)

21 June 2023
Alexander V. Smal
(Technion, Israel; PDMI, Russia).
Information theoretic approach to Boolean formula complexity.
Abstract:
We will discuss the informationtheoretic approach to proving lower
bounds on the formula complexity of Boolean functions. In the late
80s, Karchmer and Wigderson discovered that Boolean formulas could be
characterized using the language of communication complexity. This
observation allowed us to apply communication complexity methods to
prove lower and upper bounds on the Boolean formula complexity.
In particular, we can use informationtheoretic methods from
communication complexity to prove formula complexity lower bounds.
This talk is a brief introduction to these techniques, as well as to some
applications.

September 13, 2023
Laszlo Csirmaz (Alfréd Rényi Institute of Mathematics, Budapest and UTIA, Prague).
The power of non-Shannon inequalities: a short proof of the Gács-Körner theorem.
Abstract:
We present a short proof of a celebrated result of Gács and Körner
giving a necessary and sufficient condition on the joint distribution of
two discrete random variables X and Y for the case when their
mutual information matches the extractable (in the limit) common
information. Our proof is based on the observation that the mere
existence of certain random variables jointly distributed with
X and Y can impose restrictions on all random variables jointly
distributed with X and Y.

September 27, 2023 Andrei Romashchenko (CNRS LIRMM, Montpellier). On the common information and the Ahlswede-Körner lemma in one-shot settings.
Abstract:
The Gács-Körner common information, Wyner's common information, and
other related information quantities apply by design to long series of
independent identically distributed random variables. We will talk about
a possible adaptation of these quantities to the one-shot setting, when
the random objects cannot be represented as series of i.i.d. variables,
and the usual techniques of typical sequences do not apply. We will
discuss the connection of information-theoretic profiles with
combinatorial and spectral properties of graphs.

October 11, 2023 LI Cheuk Ting (Chinese University of Hong Kong). Functional Representation Lemma and Minimum Entropy Coupling.
Abstract:
The functional representation lemma states that for two
jointly distributed random variables X, Y, it is possible to express Y
as a function of X and another random variable Z that is independent of
X. There are two interesting extreme points. If we minimize H(Y|Z), the
minimum can be bounded by the "strong functional representation lemma",
and has implications in lossy compression and channel simulation. If we
instead minimize H(Z), this is equivalent to the minimum entropy
coupling problem, which concerns finding the coupling of several given
distributions that gives the smallest joint entropy. In this talk, we
will discuss several recent developments regarding these problems.
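The minimum entropy coupling problem is NP-hard in general, but a simple greedy heuristic (repeatedly matching the largest remaining probability masses of the two marginals) performs well in practice and has known additive approximation guarantees. A Python sketch of this heuristic (an illustration, not taken from the talk):

```python
import heapq, math

def greedy_coupling(p, q):
    """Couple distributions p and q by repeatedly pairing the largest
    remaining masses; returns a joint distribution with marginals p, q."""
    hp = [(-x, i) for i, x in enumerate(p)]   # max-heaps via negation
    hq = [(-x, j) for j, x in enumerate(q)]
    heapq.heapify(hp)
    heapq.heapify(hq)
    joint = {}
    while hp and hq:
        a, i = heapq.heappop(hp)
        b, j = heapq.heappop(hq)
        m = min(-a, -b)
        joint[(i, j)] = joint.get((i, j), 0.0) + m
        if -a - m > 1e-12:
            heapq.heappush(hp, (a + m, i))
        if -b - m > 1e-12:
            heapq.heappush(hq, (b + m, j))
    return joint

p, q = [0.5, 0.25, 0.25], [0.6, 0.4]
joint = greedy_coupling(p, q)
H = -sum(v * math.log2(v) for v in joint.values() if v > 0)
# Marginals are preserved, e.g. the first row sums to p[0] = 0.5.
assert abs(sum(v for (i, _), v in joint.items() if i == 0) - 0.5) < 1e-9
```

Any coupling has joint entropy at least max(H(p), H(q)); the greedy joint entropy typically lands close to that lower bound.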

October 25, 2023
Alexander Shen (CNRS LIRMM, Montpellier).
Information theoretic proofs (a survey).
Abstract:
Some theorems that have nothing to do with information theory can
be proven using an informationtheoretic argument
(entropy, or Kolmogorov complexity, or counting via compression).
For example, why are there infinitely many prime numbers?
If there were only M prime numbers, then
each integer could be encoded by the list of M exponents in its prime decomposition,
so we could specify each n-bit number by only M×O(log n) bits,
a contradiction.
We will discuss several examples of proofs of that type,
including other bounds for the distribution of prime numbers, but not only them.
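The counting behind this argument can also be seen directly: if, say, 2, 3 and 5 were the only primes, every integer would be of the form 2^a·3^b·5^c, and there are far too few such numbers. A quick Python check:

```python
# If 2, 3, 5 were the only primes, every positive integer would have the
# form 2**a * 3**b * 5**c.  Count how many integers below N actually do:
N = 10**6
smooth = {1}
for p in (2, 3, 5):
    smooth = {s * p**e for s in smooth for e in range(20) if s * p**e < N}
count = len(smooth)
print(count, "of", N, "integers are products of 2, 3 and 5 only")
```

Only a few hundred of the million integers survive, so no fixed finite list of primes can account for them all.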

November 8, 2023 Batya Kenig (Technion). Approximate Implication for Probabilistic Graphical Models.
Abstract:
The graphical structure of Probabilistic Graphical Models (PGMs)
represents the conditional independence (CI) relations that hold in the
modeled distribution. The premise of all current systems of inference
for deriving conditional independence relations in PGMs is that the set
of CIs used for the construction of the PGM holds exactly.
In practice, algorithms for extracting the structure of PGMs from data
discover approximate CIs that do not hold exactly in the distribution.
In this work, we ask how the error in this set propagates to the
inferred CIs read off the graphical structure. More precisely, what
guarantee can we provide on an inferred CI when the set of CIs that
entailed it holds only approximately?
In this talk, I will describe new positive and negative results
concerning this problem.

November 22, 2023 Mokshay Madiman (University of Delaware). Analogies between entropy, volume, and cardinality inequalities for projections.
Abstract:
It is well known that entropy inequalities are a quick way of obtaining
volume inequalities for projections of sets in a Euclidean space (or
cardinality inequalities for projections of subsets of the integer
lattice): for example, the Loomis-Whitney inequality follows easily from
the classical Han's inequality. There is also a well-known connection
with integral inequalities. We will review more such analogies, as well
as their limitations. For example, we will observe that the volume of
projections to coordinate subspaces is not submodular (though entropy
is), and discuss general dualities between entropy and integral
inequalities. These will give at least one set of reasonable answers to
questions raised by Alexander Shen in the AAIT Seminar on 7 December
2022 (video).
We will also discuss some particularly useful classes of Shannon-type
inequalities that may be new to the AAIT audience: these also have
applications to volume or cardinality. Most of this talk will be
tutorial: it will be based on the work of many people in the geometry,
combinatorics, probability, and information theory communities, rather
than just the work of the speaker.
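The discrete form of the Loomis-Whitney inequality mentioned above, |S|² ≤ |S_xy|·|S_yz|·|S_xz|, follows from Han's inequality applied to a uniformly random point of S, and is easy to verify numerically:

```python
import random

random.seed(2)
# A random finite set of points in Z^3.
S = {(random.randrange(10), random.randrange(10), random.randrange(10))
     for _ in range(200)}

# Projections onto the three coordinate planes.
Pxy = {(x, y) for (x, y, z) in S}
Pyz = {(y, z) for (x, y, z) in S}
Pxz = {(x, z) for (x, y, z) in S}

# Discrete Loomis-Whitney: |S|^2 <= |Pxy| * |Pyz| * |Pxz|.
# With (X,Y,Z) uniform on S, Han's inequality gives
# H(X,Y,Z) <= (H(X,Y) + H(Y,Z) + H(X,Z)) / 2, and each pairwise entropy
# is at most the log-cardinality of the corresponding projection.
assert len(S) ** 2 <= len(Pxy) * len(Pyz) * len(Pxz)
```

The same two lines of entropy reasoning give the cardinality analogue in every dimension, which is what makes the entropy route so quick.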

December 6, 2023 Marius Zimand (Towson University).
Lossless expanders and Information Reconciliation with no shared randomness.
Abstract:
Lossless expanders are bipartite graphs in which for every sufficiently
small set on the left side of the bipartition, most of the outgoing
edges lead to distinct vertices. I will present an application of such
graphs in Information Reconciliation. This is a protocol with two parties:
Sender and Receiver. Sender has a string x and Receiver has a string y; for instance, think of x as an updated version of y. Sender sends a message m which allows Receiver to construct x. The goal is to minimize the length of m, taking into account the correlation of x and y.
This correlation can be formalized in different ways and is only known
by Receiver. Standard solutions require Sender and Receiver to share
randomness. In my talk, I will explain how lossless expanders are used
to obtain Information Reconciliation with no shared randomness.
The main idea is from the paper B. Bauwens and M. Zimand,
Universal almost optimal compression and Slepian-Wolf coding in probabilistic polynomial time, Journal of the ACM, April 2023
(also available at
arXiv:1911.04268), but I'll focus on just one particular aspect.

Jan 24 and 31, 2024
Amin Gohari (the Chinese University of Hong Kong).
Secret Key Generation from Dependent Sources.
Abstract:
We consider the problem of extracting a secret key from dependent
sources. In this problem, the legitimate terminals observe i.i.d.
repetitions of correlated random variables and wish to create a shared
secret key that is secure from a passive eavesdropper using a noiseless
public communication channel. Finding the maximum achievable key rate is
a fundamental open problem in information-theoretic security. We review
the history and the state-of-the-art results for this problem, with an
emphasis on the speaker's prior work. Certain extensions
of the problem will also be discussed.

Feb 28 and Mar 13, 2024 Chao Tian (Texas A&M University).
ComputerAided Investigation of InformationTheoretic Limits: An Overview.
Abstract:
The linear programming (LP) formulation of information measures provides
a solid mathematical framework to identify the fundamental limits of
information systems computationally. A critical issue of this approach,
however, is its high computational complexity. To reduce the computational
burden of this approach, we can utilize the symmetry structure in such
systems. The strength of the symmetry-reduced approach is illustrated on
several well-known difficult problems, such as regenerating codes,
coded caching, and private information retrieval, where it provides new and
nontrivial outer bounds. In addition to rate bounds, more in-depth
studies can be conducted on the joint entropy structure of these
computed bounds, which often lead to reverse-engineered novel code
constructions and further allow disproving linear code achievability.
Finally, we discuss two new directions: the first is to allow the
utilization of non-Shannon-type inequalities in the computational
approach, and the second is to convert the original LP into a sequence
of smaller LPs, both of which appear to be awaiting certain suitable
machine-learning techniques.

April 10, 2024:
Alexander Kozachinskiy (Catholic University of Chile).
Information-theoretic methods in communication complexity.
Abstract:
Imagine that Alice and Bob each hold a binary string of length n, and
they want to know if there is a position where both their strings have a
1. This problem is called Disjointness. A fundamental result in
communication complexity states that even if Alice and Bob have access
to randomness and are allowed to err with small probability, they need
to send each other Omega(n) bits to solve Disjointness.
In the talk, I will present an information complexity proof technique
that was developed by different authors in the 2000s and by now has
become classical.
This method, in a nutshell, uses the chain rule for mutual information
to formalize the intuition that we cannot do better in solving
Disjointness than checking all n positions individually.

April 24, 2024, at 16:00 Central European Summer Time:
Esther Galbrun (University of Eastern Finland, Kuopio).
The minimum description length principle for pattern mining (MDL4PM).
Abstract:
Considering that patterns express the repeated presence in the data of
particular items, attribute values or other discrete properties, mining
patterns is a core task in data analysis.
Beyond issues of efficient enumeration, the selection of patterns
constitutes a major challenge.
The MDL principle, a model selection method grounded in information
theory, has been applied to pattern mining with the aim of obtaining
compact high-quality sets of patterns.
After introducing some necessary concepts to formalise the pattern
mining task, we will review
MDL-based methods for mining various types of data and patterns and try
to highlight their common characteristics and differences.
Finally, we will discuss some of the issues regarding these methods, and
highlight currently active related data analysis problems.

May 15, 2024:
Lasse Harboe Wolff (University of Copenhagen).
Entropy inequalities in Quantum Information Theory.
Abstract:
Quantum information theory allows a wealth of new information processing possibilities, with the von Neumann (quantum) entropy playing a role analogous to that of the Shannon (classical) entropy in classical information theory. But although the von Neumann entropy has been the subject of intense study, there is essentially only one known (unconstrained) inequality governing the entropies of multiparty systems: strong subadditivity. Proved by Lieb and Ruskai [1] in 1973, strong subadditivity implies all other known (unconstrained) von Neumann entropy inequalities, and it is also used to derive virtually every nontrivial quantum coding theorem, along with interesting consequences for black holes, spin systems and more.
In this talk I will introduce the field of study concerned with von Neumann entropy inequalities and the search for inequalities beyond strong subadditivity, along with some (more or less) recent developments in the field [2-4]. The talk will be aimed at people with limited to no experience in quantum information theory, so emphasis will be put on introducing and motivating the most relevant concepts. Whenever possible, I will also aim to highlight the similarities and differences between the quantum and the classical case.
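As a numerical illustration, strong subadditivity S(AB) + S(BC) ≥ S(ABC) + S(B) can be checked on a randomly generated three-qubit mixed state; a small NumPy sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_density_matrix(d):
    """A random d x d density matrix (normalized Wishart construction)."""
    G = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    rho = G @ G.conj().T
    return rho / np.trace(rho)

def entropy(rho):
    """Von Neumann entropy in bits: -Tr(rho log2 rho)."""
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-np.sum(ev * np.log2(ev)))

def partial_trace(rho, keep):
    """Trace out the qubits of a 3-qubit state that are not in `keep`."""
    t = rho.reshape((2,) * 6)  # axes: (ket A, B, C, bra A, B, C)
    for q in sorted(set(range(3)) - set(keep), reverse=True):
        t = np.trace(t, axis1=q, axis2=q + t.ndim // 2)
    d = 2 ** len(keep)
    return t.reshape(d, d)

rho_ABC = random_density_matrix(8)
S_AB = entropy(partial_trace(rho_ABC, keep=[0, 1]))
S_BC = entropy(partial_trace(rho_ABC, keep=[1, 2]))
S_B = entropy(partial_trace(rho_ABC, keep=[1]))
S_ABC = entropy(rho_ABC)

# Strong subadditivity: S(AB) + S(BC) >= S(ABC) + S(B).
assert S_AB + S_BC >= S_ABC + S_B - 1e-9
```

As with the classical case, random testing can only falsify a candidate inequality; the hard part, and the subject of the talk, is proving or refuting candidates for all states.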
Bibliography:
[1] Elliott H. Lieb, Mary Beth Ruskai, "Proof of the strong subadditivity of quantum-mechanical entropy," J. Math. Phys. 14 (12): 1938-1941, December 1973.
[2] N. Linden, A. Winter, "A New Inequality for the von Neumann Entropy," Commun. Math. Phys. 259, 129-138 (2005).
[3] M. Christandl, B. Durhuus, L. H. Wolff, "Tip of the Quantum Entropy Cone," Phys. Rev. Lett. 131, 240201, December 2023.
[4] N. Pippenger, "The inequalities of quantum information theory," IEEE Transactions on Information Theory, vol. 49, no. 4, pp. 773-789, April 2003.